Sep 4 17:49:56.814729 kernel: Linux version 6.12.44-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.0 p8) 14.3.0, GNU ld (Gentoo 2.44 p4) 2.44.0) #1 SMP PREEMPT_DYNAMIC Thu Sep 4 13:44:59 -00 2025 Sep 4 17:49:56.814760 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected verity.usrhash=36c924095f449c8931a6685ec70d72df97f8ad57d1c78208ae0ead8cae8f5127 Sep 4 17:49:56.814772 kernel: BIOS-provided physical RAM map: Sep 4 17:49:56.814779 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009ffff] usable Sep 4 17:49:56.814785 kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000007fffff] usable Sep 4 17:49:56.814792 kernel: BIOS-e820: [mem 0x0000000000800000-0x0000000000807fff] ACPI NVS Sep 4 17:49:56.814799 kernel: BIOS-e820: [mem 0x0000000000808000-0x000000000080afff] usable Sep 4 17:49:56.814812 kernel: BIOS-e820: [mem 0x000000000080b000-0x000000000080bfff] ACPI NVS Sep 4 17:49:56.814822 kernel: BIOS-e820: [mem 0x000000000080c000-0x0000000000810fff] usable Sep 4 17:49:56.814832 kernel: BIOS-e820: [mem 0x0000000000811000-0x00000000008fffff] ACPI NVS Sep 4 17:49:56.814838 kernel: BIOS-e820: [mem 0x0000000000900000-0x000000009bd3efff] usable Sep 4 17:49:56.814845 kernel: BIOS-e820: [mem 0x000000009bd3f000-0x000000009bdfffff] reserved Sep 4 17:49:56.814851 kernel: BIOS-e820: [mem 0x000000009be00000-0x000000009c8ecfff] usable Sep 4 17:49:56.814858 kernel: BIOS-e820: [mem 0x000000009c8ed000-0x000000009cb6cfff] reserved Sep 4 17:49:56.814866 kernel: BIOS-e820: [mem 0x000000009cb6d000-0x000000009cb7efff] ACPI data Sep 4 17:49:56.814876 kernel: BIOS-e820: [mem 0x000000009cb7f000-0x000000009cbfefff] ACPI NVS Sep 4 17:49:56.814885 kernel: BIOS-e820: [mem 0x000000009cbff000-0x000000009ce90fff] usable Sep 4 17:49:56.814893 kernel: BIOS-e820: [mem 0x000000009ce91000-0x000000009ce94fff] reserved Sep 4 17:49:56.814900 kernel: BIOS-e820: [mem 0x000000009ce95000-0x000000009ce96fff] ACPI NVS Sep 4 17:49:56.814907 kernel: BIOS-e820: [mem 0x000000009ce97000-0x000000009cedbfff] usable Sep 4 17:49:56.814914 kernel: BIOS-e820: [mem 0x000000009cedc000-0x000000009cf5ffff] reserved Sep 4 17:49:56.814921 kernel: BIOS-e820: [mem 0x000000009cf60000-0x000000009cffffff] ACPI NVS Sep 4 17:49:56.814928 kernel: BIOS-e820: [mem 0x00000000e0000000-0x00000000efffffff] reserved Sep 4 17:49:56.814935 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved Sep 4 17:49:56.814942 kernel: BIOS-e820: [mem 0x00000000ffc00000-0x00000000ffffffff] reserved Sep 4 17:49:56.814951 kernel: BIOS-e820: [mem 0x000000fd00000000-0x000000ffffffffff] reserved Sep 4 17:49:56.814958 kernel: NX (Execute Disable) protection: active Sep 4 17:49:56.814965 kernel: APIC: Static calls initialized Sep 4 17:49:56.814973 kernel: e820: update [mem 0x9b320018-0x9b329c57] usable ==> usable Sep 4 17:49:56.814980 kernel: e820: update [mem 0x9b2e3018-0x9b31fe57] usable ==> usable Sep 4 17:49:56.814987 kernel: extended physical RAM map: Sep 4 17:49:56.814994 kernel: reserve setup_data: [mem 0x0000000000000000-0x000000000009ffff] usable Sep 4 17:49:56.815001 kernel: reserve setup_data: [mem 0x0000000000100000-0x00000000007fffff] usable Sep 4 17:49:56.815008 kernel: reserve setup_data: [mem 0x0000000000800000-0x0000000000807fff] ACPI NVS Sep 4 17:49:56.815015 kernel: reserve setup_data: [mem 
0x0000000000808000-0x000000000080afff] usable Sep 4 17:49:56.815023 kernel: reserve setup_data: [mem 0x000000000080b000-0x000000000080bfff] ACPI NVS Sep 4 17:49:56.815032 kernel: reserve setup_data: [mem 0x000000000080c000-0x0000000000810fff] usable Sep 4 17:49:56.815039 kernel: reserve setup_data: [mem 0x0000000000811000-0x00000000008fffff] ACPI NVS Sep 4 17:49:56.815059 kernel: reserve setup_data: [mem 0x0000000000900000-0x000000009b2e3017] usable Sep 4 17:49:56.815067 kernel: reserve setup_data: [mem 0x000000009b2e3018-0x000000009b31fe57] usable Sep 4 17:49:56.815078 kernel: reserve setup_data: [mem 0x000000009b31fe58-0x000000009b320017] usable Sep 4 17:49:56.815086 kernel: reserve setup_data: [mem 0x000000009b320018-0x000000009b329c57] usable Sep 4 17:49:56.815095 kernel: reserve setup_data: [mem 0x000000009b329c58-0x000000009bd3efff] usable Sep 4 17:49:56.815102 kernel: reserve setup_data: [mem 0x000000009bd3f000-0x000000009bdfffff] reserved Sep 4 17:49:56.815110 kernel: reserve setup_data: [mem 0x000000009be00000-0x000000009c8ecfff] usable Sep 4 17:49:56.815117 kernel: reserve setup_data: [mem 0x000000009c8ed000-0x000000009cb6cfff] reserved Sep 4 17:49:56.815125 kernel: reserve setup_data: [mem 0x000000009cb6d000-0x000000009cb7efff] ACPI data Sep 4 17:49:56.815132 kernel: reserve setup_data: [mem 0x000000009cb7f000-0x000000009cbfefff] ACPI NVS Sep 4 17:49:56.815139 kernel: reserve setup_data: [mem 0x000000009cbff000-0x000000009ce90fff] usable Sep 4 17:49:56.815147 kernel: reserve setup_data: [mem 0x000000009ce91000-0x000000009ce94fff] reserved Sep 4 17:49:56.815154 kernel: reserve setup_data: [mem 0x000000009ce95000-0x000000009ce96fff] ACPI NVS Sep 4 17:49:56.815161 kernel: reserve setup_data: [mem 0x000000009ce97000-0x000000009cedbfff] usable Sep 4 17:49:56.815171 kernel: reserve setup_data: [mem 0x000000009cedc000-0x000000009cf5ffff] reserved Sep 4 17:49:56.815179 kernel: reserve setup_data: [mem 0x000000009cf60000-0x000000009cffffff] ACPI NVS Sep 4 17:49:56.815186 kernel: reserve setup_data: [mem 0x00000000e0000000-0x00000000efffffff] reserved Sep 4 17:49:56.815193 kernel: reserve setup_data: [mem 0x00000000feffc000-0x00000000feffffff] reserved Sep 4 17:49:56.815201 kernel: reserve setup_data: [mem 0x00000000ffc00000-0x00000000ffffffff] reserved Sep 4 17:49:56.815208 kernel: reserve setup_data: [mem 0x000000fd00000000-0x000000ffffffffff] reserved Sep 4 17:49:56.815218 kernel: efi: EFI v2.7 by EDK II Sep 4 17:49:56.815226 kernel: efi: SMBIOS=0x9c988000 ACPI=0x9cb7e000 ACPI 2.0=0x9cb7e014 MEMATTR=0x9b9e4198 RNG=0x9cb73018 Sep 4 17:49:56.815233 kernel: random: crng init done Sep 4 17:49:56.815243 kernel: efi: Remove mem151: MMIO range=[0xffc00000-0xffffffff] (4MB) from e820 map Sep 4 17:49:56.815250 kernel: e820: remove [mem 0xffc00000-0xffffffff] reserved Sep 4 17:49:56.815262 kernel: secureboot: Secure boot disabled Sep 4 17:49:56.815270 kernel: SMBIOS 2.8 present. 
Sep 4 17:49:56.815277 kernel: DMI: QEMU Standard PC (Q35 + ICH9, 2009), BIOS unknown 02/02/2022 Sep 4 17:49:56.815284 kernel: DMI: Memory slots populated: 1/1 Sep 4 17:49:56.815291 kernel: Hypervisor detected: KVM Sep 4 17:49:56.815299 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00 Sep 4 17:49:56.815306 kernel: kvm-clock: using sched offset of 4750433005 cycles Sep 4 17:49:56.815314 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns Sep 4 17:49:56.815322 kernel: tsc: Detected 2794.748 MHz processor Sep 4 17:49:56.815329 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved Sep 4 17:49:56.815337 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable Sep 4 17:49:56.815347 kernel: last_pfn = 0x9cedc max_arch_pfn = 0x400000000 Sep 4 17:49:56.815355 kernel: MTRR map: 4 entries (2 fixed + 2 variable; max 18), built from 8 variable MTRRs Sep 4 17:49:56.815363 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT Sep 4 17:49:56.815370 kernel: Using GB pages for direct mapping Sep 4 17:49:56.815378 kernel: ACPI: Early table checksum verification disabled Sep 4 17:49:56.815386 kernel: ACPI: RSDP 0x000000009CB7E014 000024 (v02 BOCHS ) Sep 4 17:49:56.815393 kernel: ACPI: XSDT 0x000000009CB7D0E8 000054 (v01 BOCHS BXPC 00000001 01000013) Sep 4 17:49:56.815401 kernel: ACPI: FACP 0x000000009CB79000 0000F4 (v03 BOCHS BXPC 00000001 BXPC 00000001) Sep 4 17:49:56.815409 kernel: ACPI: DSDT 0x000000009CB7A000 0021BA (v01 BOCHS BXPC 00000001 BXPC 00000001) Sep 4 17:49:56.815419 kernel: ACPI: FACS 0x000000009CBDD000 000040 Sep 4 17:49:56.815426 kernel: ACPI: APIC 0x000000009CB78000 000090 (v01 BOCHS BXPC 00000001 BXPC 00000001) Sep 4 17:49:56.815434 kernel: ACPI: HPET 0x000000009CB77000 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001) Sep 4 17:49:56.815441 kernel: ACPI: MCFG 0x000000009CB76000 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001) Sep 4 17:49:56.815449 kernel: ACPI: WAET 0x000000009CB75000 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001) Sep 4 17:49:56.815457 kernel: ACPI: BGRT 0x000000009CB74000 000038 (v01 INTEL EDK2 00000002 01000013) Sep 4 17:49:56.815464 kernel: ACPI: Reserving FACP table memory at [mem 0x9cb79000-0x9cb790f3] Sep 4 17:49:56.815472 kernel: ACPI: Reserving DSDT table memory at [mem 0x9cb7a000-0x9cb7c1b9] Sep 4 17:49:56.815479 kernel: ACPI: Reserving FACS table memory at [mem 0x9cbdd000-0x9cbdd03f] Sep 4 17:49:56.815489 kernel: ACPI: Reserving APIC table memory at [mem 0x9cb78000-0x9cb7808f] Sep 4 17:49:56.815496 kernel: ACPI: Reserving HPET table memory at [mem 0x9cb77000-0x9cb77037] Sep 4 17:49:56.815504 kernel: ACPI: Reserving MCFG table memory at [mem 0x9cb76000-0x9cb7603b] Sep 4 17:49:56.815511 kernel: ACPI: Reserving WAET table memory at [mem 0x9cb75000-0x9cb75027] Sep 4 17:49:56.815519 kernel: ACPI: Reserving BGRT table memory at [mem 0x9cb74000-0x9cb74037] Sep 4 17:49:56.815526 kernel: No NUMA configuration found Sep 4 17:49:56.815534 kernel: Faking a node at [mem 0x0000000000000000-0x000000009cedbfff] Sep 4 17:49:56.815541 kernel: NODE_DATA(0) allocated [mem 0x9ce36dc0-0x9ce3dfff] Sep 4 17:49:56.815549 kernel: Zone ranges: Sep 4 17:49:56.815559 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff] Sep 4 17:49:56.815573 kernel: DMA32 [mem 0x0000000001000000-0x000000009cedbfff] Sep 4 17:49:56.815580 kernel: Normal empty Sep 4 17:49:56.815588 kernel: Device empty Sep 4 17:49:56.815595 kernel: Movable zone start for each node Sep 4 17:49:56.815603 kernel: Early memory node ranges Sep 4 
17:49:56.815611 kernel: node 0: [mem 0x0000000000001000-0x000000000009ffff] Sep 4 17:49:56.815618 kernel: node 0: [mem 0x0000000000100000-0x00000000007fffff] Sep 4 17:49:56.815628 kernel: node 0: [mem 0x0000000000808000-0x000000000080afff] Sep 4 17:49:56.815636 kernel: node 0: [mem 0x000000000080c000-0x0000000000810fff] Sep 4 17:49:56.815646 kernel: node 0: [mem 0x0000000000900000-0x000000009bd3efff] Sep 4 17:49:56.815654 kernel: node 0: [mem 0x000000009be00000-0x000000009c8ecfff] Sep 4 17:49:56.815661 kernel: node 0: [mem 0x000000009cbff000-0x000000009ce90fff] Sep 4 17:49:56.815669 kernel: node 0: [mem 0x000000009ce97000-0x000000009cedbfff] Sep 4 17:49:56.815676 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000009cedbfff] Sep 4 17:49:56.815684 kernel: On node 0, zone DMA: 1 pages in unavailable ranges Sep 4 17:49:56.815693 kernel: On node 0, zone DMA: 96 pages in unavailable ranges Sep 4 17:49:56.815711 kernel: On node 0, zone DMA: 8 pages in unavailable ranges Sep 4 17:49:56.815720 kernel: On node 0, zone DMA: 1 pages in unavailable ranges Sep 4 17:49:56.815727 kernel: On node 0, zone DMA: 239 pages in unavailable ranges Sep 4 17:49:56.815735 kernel: On node 0, zone DMA32: 193 pages in unavailable ranges Sep 4 17:49:56.815743 kernel: On node 0, zone DMA32: 786 pages in unavailable ranges Sep 4 17:49:56.815754 kernel: On node 0, zone DMA32: 6 pages in unavailable ranges Sep 4 17:49:56.815762 kernel: On node 0, zone DMA32: 12580 pages in unavailable ranges Sep 4 17:49:56.815770 kernel: ACPI: PM-Timer IO Port: 0x608 Sep 4 17:49:56.815778 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1]) Sep 4 17:49:56.815786 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23 Sep 4 17:49:56.815796 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl) Sep 4 17:49:56.815804 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level) Sep 4 17:49:56.815812 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level) Sep 4 17:49:56.815820 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level) Sep 4 17:49:56.815827 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level) Sep 4 17:49:56.815835 kernel: ACPI: Using ACPI (MADT) for SMP configuration information Sep 4 17:49:56.815843 kernel: ACPI: HPET id: 0x8086a201 base: 0xfed00000 Sep 4 17:49:56.815851 kernel: TSC deadline timer available Sep 4 17:49:56.815859 kernel: CPU topo: Max. logical packages: 1 Sep 4 17:49:56.815869 kernel: CPU topo: Max. logical dies: 1 Sep 4 17:49:56.815877 kernel: CPU topo: Max. dies per package: 1 Sep 4 17:49:56.815885 kernel: CPU topo: Max. threads per core: 1 Sep 4 17:49:56.815893 kernel: CPU topo: Num. cores per package: 4 Sep 4 17:49:56.815901 kernel: CPU topo: Num. 
threads per package: 4 Sep 4 17:49:56.815909 kernel: CPU topo: Allowing 4 present CPUs plus 0 hotplug CPUs Sep 4 17:49:56.815916 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write() Sep 4 17:49:56.815924 kernel: kvm-guest: KVM setup pv remote TLB flush Sep 4 17:49:56.815932 kernel: kvm-guest: setup PV sched yield Sep 4 17:49:56.815943 kernel: [mem 0x9d000000-0xdfffffff] available for PCI devices Sep 4 17:49:56.815950 kernel: Booting paravirtualized kernel on KVM Sep 4 17:49:56.815959 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns Sep 4 17:49:56.815967 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:4 nr_cpu_ids:4 nr_node_ids:1 Sep 4 17:49:56.815975 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u524288 Sep 4 17:49:56.815982 kernel: pcpu-alloc: s207832 r8192 d29736 u524288 alloc=1*2097152 Sep 4 17:49:56.815990 kernel: pcpu-alloc: [0] 0 1 2 3 Sep 4 17:49:56.815998 kernel: kvm-guest: PV spinlocks enabled Sep 4 17:49:56.816006 kernel: PV qspinlock hash table entries: 256 (order: 0, 4096 bytes, linear) Sep 4 17:49:56.816017 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected verity.usrhash=36c924095f449c8931a6685ec70d72df97f8ad57d1c78208ae0ead8cae8f5127 Sep 4 17:49:56.816029 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space. Sep 4 17:49:56.816037 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear) Sep 4 17:49:56.816059 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear) Sep 4 17:49:56.816068 kernel: Fallback order for Node 0: 0 Sep 4 17:49:56.816086 kernel: Built 1 zonelists, mobility grouping on. Total pages: 641450 Sep 4 17:49:56.816104 kernel: Policy zone: DMA32 Sep 4 17:49:56.816112 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off Sep 4 17:49:56.816124 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=4, Nodes=1 Sep 4 17:49:56.816132 kernel: ftrace: allocating 40102 entries in 157 pages Sep 4 17:49:56.816140 kernel: ftrace: allocated 157 pages with 5 groups Sep 4 17:49:56.816148 kernel: Dynamic Preempt: voluntary Sep 4 17:49:56.816156 kernel: rcu: Preemptible hierarchical RCU implementation. Sep 4 17:49:56.816170 kernel: rcu: RCU event tracing is enabled. Sep 4 17:49:56.816178 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=4. Sep 4 17:49:56.816186 kernel: Trampoline variant of Tasks RCU enabled. Sep 4 17:49:56.816194 kernel: Rude variant of Tasks RCU enabled. Sep 4 17:49:56.816205 kernel: Tracing variant of Tasks RCU enabled. Sep 4 17:49:56.816213 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies. Sep 4 17:49:56.816224 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=4 Sep 4 17:49:56.816232 kernel: RCU Tasks: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4. Sep 4 17:49:56.816240 kernel: RCU Tasks Rude: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4. Sep 4 17:49:56.816248 kernel: RCU Tasks Trace: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4. 
Sep 4 17:49:56.816256 kernel: NR_IRQS: 33024, nr_irqs: 456, preallocated irqs: 16 Sep 4 17:49:56.816264 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention. Sep 4 17:49:56.816271 kernel: Console: colour dummy device 80x25 Sep 4 17:49:56.816282 kernel: printk: legacy console [ttyS0] enabled Sep 4 17:49:56.816290 kernel: ACPI: Core revision 20240827 Sep 4 17:49:56.816298 kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 19112604467 ns Sep 4 17:49:56.816305 kernel: APIC: Switch to symmetric I/O mode setup Sep 4 17:49:56.816313 kernel: x2apic enabled Sep 4 17:49:56.816321 kernel: APIC: Switched APIC routing to: physical x2apic Sep 4 17:49:56.816329 kernel: kvm-guest: APIC: send_IPI_mask() replaced with kvm_send_ipi_mask() Sep 4 17:49:56.816337 kernel: kvm-guest: APIC: send_IPI_mask_allbutself() replaced with kvm_send_ipi_mask_allbutself() Sep 4 17:49:56.816345 kernel: kvm-guest: setup PV IPIs Sep 4 17:49:56.816356 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1 Sep 4 17:49:56.816364 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x2848df6a9de, max_idle_ns: 440795280912 ns Sep 4 17:49:56.816372 kernel: Calibrating delay loop (skipped) preset value.. 5589.49 BogoMIPS (lpj=2794748) Sep 4 17:49:56.816380 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated Sep 4 17:49:56.816388 kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127 Sep 4 17:49:56.816395 kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0 Sep 4 17:49:56.816404 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization Sep 4 17:49:56.816411 kernel: Spectre V2 : Mitigation: Retpolines Sep 4 17:49:56.816419 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT Sep 4 17:49:56.816430 kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls Sep 4 17:49:56.816438 kernel: active return thunk: retbleed_return_thunk Sep 4 17:49:56.816446 kernel: RETBleed: Mitigation: untrained return thunk Sep 4 17:49:56.816456 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier Sep 4 17:49:56.816464 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl Sep 4 17:49:56.816472 kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied! Sep 4 17:49:56.816481 kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options. Sep 4 17:49:56.816489 kernel: active return thunk: srso_return_thunk Sep 4 17:49:56.816499 kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode Sep 4 17:49:56.816507 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers' Sep 4 17:49:56.816515 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers' Sep 4 17:49:56.816523 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers' Sep 4 17:49:56.816531 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256 Sep 4 17:49:56.816539 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format. Sep 4 17:49:56.816547 kernel: Freeing SMP alternatives memory: 32K Sep 4 17:49:56.816555 kernel: pid_max: default: 32768 minimum: 301 Sep 4 17:49:56.816570 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima Sep 4 17:49:56.816580 kernel: landlock: Up and running. Sep 4 17:49:56.816588 kernel: SELinux: Initializing. 
Sep 4 17:49:56.816596 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) Sep 4 17:49:56.816604 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) Sep 4 17:49:56.816613 kernel: smpboot: CPU0: AMD EPYC 7402P 24-Core Processor (family: 0x17, model: 0x31, stepping: 0x0) Sep 4 17:49:56.816621 kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver. Sep 4 17:49:56.816629 kernel: ... version: 0 Sep 4 17:49:56.816636 kernel: ... bit width: 48 Sep 4 17:49:56.816644 kernel: ... generic registers: 6 Sep 4 17:49:56.816655 kernel: ... value mask: 0000ffffffffffff Sep 4 17:49:56.816662 kernel: ... max period: 00007fffffffffff Sep 4 17:49:56.816670 kernel: ... fixed-purpose events: 0 Sep 4 17:49:56.816678 kernel: ... event mask: 000000000000003f Sep 4 17:49:56.816686 kernel: signal: max sigframe size: 1776 Sep 4 17:49:56.816694 kernel: rcu: Hierarchical SRCU implementation. Sep 4 17:49:56.816702 kernel: rcu: Max phase no-delay instances is 400. Sep 4 17:49:56.816712 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level Sep 4 17:49:56.816720 kernel: smp: Bringing up secondary CPUs ... Sep 4 17:49:56.816730 kernel: smpboot: x86: Booting SMP configuration: Sep 4 17:49:56.816738 kernel: .... node #0, CPUs: #1 #2 #3 Sep 4 17:49:56.816746 kernel: smp: Brought up 1 node, 4 CPUs Sep 4 17:49:56.816754 kernel: smpboot: Total of 4 processors activated (22357.98 BogoMIPS) Sep 4 17:49:56.816762 kernel: Memory: 2422672K/2565800K available (14336K kernel code, 2428K rwdata, 9988K rodata, 54076K init, 2892K bss, 137196K reserved, 0K cma-reserved) Sep 4 17:49:56.816770 kernel: devtmpfs: initialized Sep 4 17:49:56.816778 kernel: x86/mm: Memory block size: 128MB Sep 4 17:49:56.816786 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x00800000-0x00807fff] (32768 bytes) Sep 4 17:49:56.816794 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x0080b000-0x0080bfff] (4096 bytes) Sep 4 17:49:56.816805 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x00811000-0x008fffff] (978944 bytes) Sep 4 17:49:56.816813 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x9cb7f000-0x9cbfefff] (524288 bytes) Sep 4 17:49:56.816821 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x9ce95000-0x9ce96fff] (8192 bytes) Sep 4 17:49:56.816829 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x9cf60000-0x9cffffff] (655360 bytes) Sep 4 17:49:56.816837 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns Sep 4 17:49:56.816845 kernel: futex hash table entries: 1024 (order: 4, 65536 bytes, linear) Sep 4 17:49:56.816853 kernel: pinctrl core: initialized pinctrl subsystem Sep 4 17:49:56.816861 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family Sep 4 17:49:56.816869 kernel: audit: initializing netlink subsys (disabled) Sep 4 17:49:56.816878 kernel: audit: type=2000 audit(1757008194.807:1): state=initialized audit_enabled=0 res=1 Sep 4 17:49:56.816886 kernel: thermal_sys: Registered thermal governor 'step_wise' Sep 4 17:49:56.816894 kernel: thermal_sys: Registered thermal governor 'user_space' Sep 4 17:49:56.816902 kernel: cpuidle: using governor menu Sep 4 17:49:56.816910 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 Sep 4 17:49:56.816918 kernel: dca service started, version 1.12.1 Sep 4 17:49:56.816926 kernel: PCI: ECAM [mem 0xe0000000-0xefffffff] (base 0xe0000000) for domain 0000 [bus 00-ff] Sep 4 17:49:56.816934 kernel: PCI: Using configuration type 1 for base access Sep 
4 17:49:56.816942 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible. Sep 4 17:49:56.816951 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages Sep 4 17:49:56.816959 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page Sep 4 17:49:56.816967 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages Sep 4 17:49:56.816975 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page Sep 4 17:49:56.816983 kernel: ACPI: Added _OSI(Module Device) Sep 4 17:49:56.816991 kernel: ACPI: Added _OSI(Processor Device) Sep 4 17:49:56.816999 kernel: ACPI: Added _OSI(Processor Aggregator Device) Sep 4 17:49:56.817007 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded Sep 4 17:49:56.817015 kernel: ACPI: Interpreter enabled Sep 4 17:49:56.817024 kernel: ACPI: PM: (supports S0 S3 S5) Sep 4 17:49:56.817032 kernel: ACPI: Using IOAPIC for interrupt routing Sep 4 17:49:56.817040 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug Sep 4 17:49:56.817061 kernel: PCI: Using E820 reservations for host bridge windows Sep 4 17:49:56.817069 kernel: ACPI: Enabled 2 GPEs in block 00 to 3F Sep 4 17:49:56.817077 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff]) Sep 4 17:49:56.817323 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3] Sep 4 17:49:56.817451 kernel: acpi PNP0A08:00: _OSC: platform does not support [PCIeHotplug LTR] Sep 4 17:49:56.817586 kernel: acpi PNP0A08:00: _OSC: OS now controls [PME AER PCIeCapability] Sep 4 17:49:56.817597 kernel: PCI host bridge to bus 0000:00 Sep 4 17:49:56.817740 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window] Sep 4 17:49:56.817853 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window] Sep 4 17:49:56.817961 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window] Sep 4 17:49:56.819210 kernel: pci_bus 0000:00: root bus resource [mem 0x9d000000-0xdfffffff window] Sep 4 17:49:56.819329 kernel: pci_bus 0000:00: root bus resource [mem 0xf0000000-0xfebfffff window] Sep 4 17:49:56.819444 kernel: pci_bus 0000:00: root bus resource [mem 0x380000000000-0x3807ffffffff window] Sep 4 17:49:56.819553 kernel: pci_bus 0000:00: root bus resource [bus 00-ff] Sep 4 17:49:56.819722 kernel: pci 0000:00:00.0: [8086:29c0] type 00 class 0x060000 conventional PCI endpoint Sep 4 17:49:56.819865 kernel: pci 0000:00:01.0: [1234:1111] type 00 class 0x030000 conventional PCI endpoint Sep 4 17:49:56.819990 kernel: pci 0000:00:01.0: BAR 0 [mem 0xc0000000-0xc0ffffff pref] Sep 4 17:49:56.820134 kernel: pci 0000:00:01.0: BAR 2 [mem 0xc1044000-0xc1044fff] Sep 4 17:49:56.820274 kernel: pci 0000:00:01.0: ROM [mem 0xffff0000-0xffffffff pref] Sep 4 17:49:56.820397 kernel: pci 0000:00:01.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff] Sep 4 17:49:56.820538 kernel: pci 0000:00:02.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint Sep 4 17:49:56.821914 kernel: pci 0000:00:02.0: BAR 0 [io 0x6100-0x611f] Sep 4 17:49:56.822065 kernel: pci 0000:00:02.0: BAR 1 [mem 0xc1043000-0xc1043fff] Sep 4 17:49:56.822191 kernel: pci 0000:00:02.0: BAR 4 [mem 0x380000000000-0x380000003fff 64bit pref] Sep 4 17:49:56.822333 kernel: pci 0000:00:03.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint Sep 4 17:49:56.822464 kernel: pci 0000:00:03.0: BAR 0 [io 0x6000-0x607f] Sep 4 17:49:56.822595 kernel: pci 0000:00:03.0: BAR 1 [mem 0xc1042000-0xc1042fff] 
Sep 4 17:49:56.822721 kernel: pci 0000:00:03.0: BAR 4 [mem 0x380000004000-0x380000007fff 64bit pref] Sep 4 17:49:56.822886 kernel: pci 0000:00:04.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint Sep 4 17:49:56.823009 kernel: pci 0000:00:04.0: BAR 0 [io 0x60e0-0x60ff] Sep 4 17:49:56.823149 kernel: pci 0000:00:04.0: BAR 1 [mem 0xc1041000-0xc1041fff] Sep 4 17:49:56.823274 kernel: pci 0000:00:04.0: BAR 4 [mem 0x380000008000-0x38000000bfff 64bit pref] Sep 4 17:49:56.823392 kernel: pci 0000:00:04.0: ROM [mem 0xfffc0000-0xffffffff pref] Sep 4 17:49:56.823529 kernel: pci 0000:00:1f.0: [8086:2918] type 00 class 0x060100 conventional PCI endpoint Sep 4 17:49:56.823670 kernel: pci 0000:00:1f.0: quirk: [io 0x0600-0x067f] claimed by ICH6 ACPI/GPIO/TCO Sep 4 17:49:56.823811 kernel: pci 0000:00:1f.2: [8086:2922] type 00 class 0x010601 conventional PCI endpoint Sep 4 17:49:56.824997 kernel: pci 0000:00:1f.2: BAR 4 [io 0x60c0-0x60df] Sep 4 17:49:56.825136 kernel: pci 0000:00:1f.2: BAR 5 [mem 0xc1040000-0xc1040fff] Sep 4 17:49:56.825280 kernel: pci 0000:00:1f.3: [8086:2930] type 00 class 0x0c0500 conventional PCI endpoint Sep 4 17:49:56.825410 kernel: pci 0000:00:1f.3: BAR 4 [io 0x6080-0x60bf] Sep 4 17:49:56.825420 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10 Sep 4 17:49:56.825429 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10 Sep 4 17:49:56.825437 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11 Sep 4 17:49:56.825445 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11 Sep 4 17:49:56.825453 kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 10 Sep 4 17:49:56.825460 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 10 Sep 4 17:49:56.825472 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 11 Sep 4 17:49:56.825479 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 11 Sep 4 17:49:56.825487 kernel: ACPI: PCI: Interrupt link GSIA configured for IRQ 16 Sep 4 17:49:56.825495 kernel: ACPI: PCI: Interrupt link GSIB configured for IRQ 17 Sep 4 17:49:56.825503 kernel: ACPI: PCI: Interrupt link GSIC configured for IRQ 18 Sep 4 17:49:56.825511 kernel: ACPI: PCI: Interrupt link GSID configured for IRQ 19 Sep 4 17:49:56.825519 kernel: ACPI: PCI: Interrupt link GSIE configured for IRQ 20 Sep 4 17:49:56.825527 kernel: ACPI: PCI: Interrupt link GSIF configured for IRQ 21 Sep 4 17:49:56.825535 kernel: ACPI: PCI: Interrupt link GSIG configured for IRQ 22 Sep 4 17:49:56.825544 kernel: ACPI: PCI: Interrupt link GSIH configured for IRQ 23 Sep 4 17:49:56.825552 kernel: iommu: Default domain type: Translated Sep 4 17:49:56.825560 kernel: iommu: DMA domain TLB invalidation policy: lazy mode Sep 4 17:49:56.825577 kernel: efivars: Registered efivars operations Sep 4 17:49:56.825585 kernel: PCI: Using ACPI for IRQ routing Sep 4 17:49:56.825593 kernel: PCI: pci_cache_line_size set to 64 bytes Sep 4 17:49:56.825601 kernel: e820: reserve RAM buffer [mem 0x0080b000-0x008fffff] Sep 4 17:49:56.825608 kernel: e820: reserve RAM buffer [mem 0x00811000-0x008fffff] Sep 4 17:49:56.825616 kernel: e820: reserve RAM buffer [mem 0x9b2e3018-0x9bffffff] Sep 4 17:49:56.825627 kernel: e820: reserve RAM buffer [mem 0x9b320018-0x9bffffff] Sep 4 17:49:56.825635 kernel: e820: reserve RAM buffer [mem 0x9bd3f000-0x9bffffff] Sep 4 17:49:56.825643 kernel: e820: reserve RAM buffer [mem 0x9c8ed000-0x9fffffff] Sep 4 17:49:56.825650 kernel: e820: reserve RAM buffer [mem 0x9ce91000-0x9fffffff] Sep 4 17:49:56.825659 kernel: e820: reserve RAM buffer [mem 
0x9cedc000-0x9fffffff] Sep 4 17:49:56.825783 kernel: pci 0000:00:01.0: vgaarb: setting as boot VGA device Sep 4 17:49:56.825911 kernel: pci 0000:00:01.0: vgaarb: bridge control possible Sep 4 17:49:56.826036 kernel: pci 0000:00:01.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none Sep 4 17:49:56.826066 kernel: vgaarb: loaded Sep 4 17:49:56.826074 kernel: hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0 Sep 4 17:49:56.826082 kernel: hpet0: 3 comparators, 64-bit 100.000000 MHz counter Sep 4 17:49:56.826090 kernel: clocksource: Switched to clocksource kvm-clock Sep 4 17:49:56.826098 kernel: VFS: Disk quotas dquot_6.6.0 Sep 4 17:49:56.826106 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Sep 4 17:49:56.826115 kernel: pnp: PnP ACPI init Sep 4 17:49:56.826285 kernel: system 00:05: [mem 0xe0000000-0xefffffff window] has been reserved Sep 4 17:49:56.826304 kernel: pnp: PnP ACPI: found 6 devices Sep 4 17:49:56.826312 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns Sep 4 17:49:56.826321 kernel: NET: Registered PF_INET protocol family Sep 4 17:49:56.826329 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear) Sep 4 17:49:56.826338 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear) Sep 4 17:49:56.826346 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Sep 4 17:49:56.826354 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear) Sep 4 17:49:56.826363 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear) Sep 4 17:49:56.826371 kernel: TCP: Hash tables configured (established 32768 bind 32768) Sep 4 17:49:56.826381 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear) Sep 4 17:49:56.826390 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear) Sep 4 17:49:56.826398 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Sep 4 17:49:56.826406 kernel: NET: Registered PF_XDP protocol family Sep 4 17:49:56.826530 kernel: pci 0000:00:04.0: ROM [mem 0xfffc0000-0xffffffff pref]: can't claim; no compatible bridge window Sep 4 17:49:56.826664 kernel: pci 0000:00:04.0: ROM [mem 0x9d000000-0x9d03ffff pref]: assigned Sep 4 17:49:56.826779 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window] Sep 4 17:49:56.826888 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window] Sep 4 17:49:56.827001 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window] Sep 4 17:49:56.827131 kernel: pci_bus 0000:00: resource 7 [mem 0x9d000000-0xdfffffff window] Sep 4 17:49:56.827251 kernel: pci_bus 0000:00: resource 8 [mem 0xf0000000-0xfebfffff window] Sep 4 17:49:56.827363 kernel: pci_bus 0000:00: resource 9 [mem 0x380000000000-0x3807ffffffff window] Sep 4 17:49:56.827373 kernel: PCI: CLS 0 bytes, default 64 Sep 4 17:49:56.827382 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x2848df6a9de, max_idle_ns: 440795280912 ns Sep 4 17:49:56.827391 kernel: Initialise system trusted keyrings Sep 4 17:49:56.827403 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0 Sep 4 17:49:56.827414 kernel: Key type asymmetric registered Sep 4 17:49:56.827422 kernel: Asymmetric key parser 'x509' registered Sep 4 17:49:56.827430 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250) Sep 4 17:49:56.827439 kernel: io scheduler mq-deadline registered Sep 4 17:49:56.827447 kernel: io scheduler kyber registered Sep 4 17:49:56.827455 
kernel: io scheduler bfq registered Sep 4 17:49:56.827465 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00 Sep 4 17:49:56.828585 kernel: ACPI: \_SB_.GSIG: Enabled at IRQ 22 Sep 4 17:49:56.828628 kernel: ACPI: \_SB_.GSIH: Enabled at IRQ 23 Sep 4 17:49:56.828638 kernel: ACPI: \_SB_.GSIE: Enabled at IRQ 20 Sep 4 17:49:56.828647 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Sep 4 17:49:56.828657 kernel: 00:03: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Sep 4 17:49:56.828666 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12 Sep 4 17:49:56.828674 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1 Sep 4 17:49:56.828683 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12 Sep 4 17:49:56.828905 kernel: rtc_cmos 00:04: RTC can wake from S4 Sep 4 17:49:56.829036 kernel: rtc_cmos 00:04: registered as rtc0 Sep 4 17:49:56.829204 kernel: rtc_cmos 00:04: setting system clock to 2025-09-04T17:49:56 UTC (1757008196) Sep 4 17:49:56.829320 kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram Sep 4 17:49:56.829332 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled Sep 4 17:49:56.829341 kernel: efifb: probing for efifb Sep 4 17:49:56.829350 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0 Sep 4 17:49:56.829358 kernel: efifb: framebuffer at 0xc0000000, using 4000k, total 4000k Sep 4 17:49:56.829373 kernel: efifb: mode is 1280x800x32, linelength=5120, pages=1 Sep 4 17:49:56.829381 kernel: efifb: scrolling: redraw Sep 4 17:49:56.829390 kernel: efifb: Truecolor: size=8:8:8:8, shift=24:16:8:0 Sep 4 17:49:56.829398 kernel: Console: switching to colour frame buffer device 160x50 Sep 4 17:49:56.829407 kernel: fb0: EFI VGA frame buffer device Sep 4 17:49:56.829416 kernel: pstore: Using crash dump compression: deflate Sep 4 17:49:56.829424 kernel: pstore: Registered efi_pstore as persistent store backend Sep 4 17:49:56.829433 kernel: NET: Registered PF_INET6 protocol family Sep 4 17:49:56.829441 kernel: Segment Routing with IPv6 Sep 4 17:49:56.829453 kernel: In-situ OAM (IOAM) with IPv6 Sep 4 17:49:56.829462 kernel: NET: Registered PF_PACKET protocol family Sep 4 17:49:56.829470 kernel: Key type dns_resolver registered Sep 4 17:49:56.829479 kernel: IPI shorthand broadcast: enabled Sep 4 17:49:56.829488 kernel: sched_clock: Marking stable (3366002250, 151538607)->(3534853464, -17312607) Sep 4 17:49:56.829496 kernel: registered taskstats version 1 Sep 4 17:49:56.829507 kernel: Loading compiled-in X.509 certificates Sep 4 17:49:56.829519 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.44-flatcar: 1106dff6b31a2cb943a47c73d0d8dff07e2a7490' Sep 4 17:49:56.829529 kernel: Demotion targets for Node 0: null Sep 4 17:49:56.829543 kernel: Key type .fscrypt registered Sep 4 17:49:56.829554 kernel: Key type fscrypt-provisioning registered Sep 4 17:49:56.829573 kernel: ima: No TPM chip found, activating TPM-bypass! Sep 4 17:49:56.829582 kernel: ima: Allocated hash algorithm: sha1 Sep 4 17:49:56.829590 kernel: ima: No architecture policies found Sep 4 17:49:56.829598 kernel: clk: Disabling unused clocks Sep 4 17:49:56.829607 kernel: Warning: unable to open an initial console. 
Sep 4 17:49:56.829616 kernel: Freeing unused kernel image (initmem) memory: 54076K Sep 4 17:49:56.829624 kernel: Write protecting the kernel read-only data: 24576k Sep 4 17:49:56.829637 kernel: Freeing unused kernel image (rodata/data gap) memory: 252K Sep 4 17:49:56.829645 kernel: Run /init as init process Sep 4 17:49:56.829654 kernel: with arguments: Sep 4 17:49:56.829663 kernel: /init Sep 4 17:49:56.829671 kernel: with environment: Sep 4 17:49:56.829679 kernel: HOME=/ Sep 4 17:49:56.829688 kernel: TERM=linux Sep 4 17:49:56.829696 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a Sep 4 17:49:56.829705 systemd[1]: Successfully made /usr/ read-only. Sep 4 17:49:56.829722 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Sep 4 17:49:56.829732 systemd[1]: Detected virtualization kvm. Sep 4 17:49:56.829741 systemd[1]: Detected architecture x86-64. Sep 4 17:49:56.829750 systemd[1]: Running in initrd. Sep 4 17:49:56.829758 systemd[1]: No hostname configured, using default hostname. Sep 4 17:49:56.829767 systemd[1]: Hostname set to . Sep 4 17:49:56.829776 systemd[1]: Initializing machine ID from VM UUID. Sep 4 17:49:56.829788 systemd[1]: Queued start job for default target initrd.target. Sep 4 17:49:56.829800 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Sep 4 17:49:56.829811 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Sep 4 17:49:56.829823 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Sep 4 17:49:56.829834 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Sep 4 17:49:56.829843 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Sep 4 17:49:56.829853 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Sep 4 17:49:56.829866 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Sep 4 17:49:56.829876 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Sep 4 17:49:56.829884 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 4 17:49:56.829893 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Sep 4 17:49:56.829902 systemd[1]: Reached target paths.target - Path Units. Sep 4 17:49:56.829911 systemd[1]: Reached target slices.target - Slice Units. Sep 4 17:49:56.829920 systemd[1]: Reached target swap.target - Swaps. Sep 4 17:49:56.829929 systemd[1]: Reached target timers.target - Timer Units. Sep 4 17:49:56.829940 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Sep 4 17:49:56.829949 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Sep 4 17:49:56.829958 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Sep 4 17:49:56.829967 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Sep 4 17:49:56.829976 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. 
Sep 4 17:49:56.829985 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Sep 4 17:49:56.829994 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Sep 4 17:49:56.830003 systemd[1]: Reached target sockets.target - Socket Units. Sep 4 17:49:56.830012 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Sep 4 17:49:56.830024 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Sep 4 17:49:56.830033 systemd[1]: Finished network-cleanup.service - Network Cleanup. Sep 4 17:49:56.830042 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). Sep 4 17:49:56.830072 systemd[1]: Starting systemd-fsck-usr.service... Sep 4 17:49:56.830081 systemd[1]: Starting systemd-journald.service - Journal Service... Sep 4 17:49:56.830090 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Sep 4 17:49:56.830099 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 4 17:49:56.830108 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Sep 4 17:49:56.830121 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Sep 4 17:49:56.830130 systemd[1]: Finished systemd-fsck-usr.service. Sep 4 17:49:56.830139 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Sep 4 17:49:56.830176 systemd-journald[218]: Collecting audit messages is disabled. Sep 4 17:49:56.830204 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 4 17:49:56.830214 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Sep 4 17:49:56.830224 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Sep 4 17:49:56.830233 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Sep 4 17:49:56.830245 systemd-journald[218]: Journal started Sep 4 17:49:56.830265 systemd-journald[218]: Runtime Journal (/run/log/journal/23ce83d2c05e484d99b02de9101999b3) is 6M, max 48.4M, 42.4M free. Sep 4 17:49:56.812653 systemd-modules-load[221]: Inserted module 'overlay' Sep 4 17:49:56.835658 systemd[1]: Started systemd-journald.service - Journal Service. Sep 4 17:49:56.840100 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Sep 4 17:49:56.842973 kernel: Bridge firewalling registered Sep 4 17:49:56.841230 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Sep 4 17:49:56.842769 systemd-modules-load[221]: Inserted module 'br_netfilter' Sep 4 17:49:56.845033 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Sep 4 17:49:56.845877 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Sep 4 17:49:56.848773 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Sep 4 17:49:56.857086 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Sep 4 17:49:56.857204 systemd-tmpfiles[242]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. Sep 4 17:49:56.858182 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... 
Sep 4 17:49:56.862148 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 4 17:49:56.871228 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Sep 4 17:49:56.875633 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Sep 4 17:49:56.886084 dracut-cmdline[257]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected verity.usrhash=36c924095f449c8931a6685ec70d72df97f8ad57d1c78208ae0ead8cae8f5127 Sep 4 17:49:56.925157 systemd-resolved[264]: Positive Trust Anchors: Sep 4 17:49:56.925176 systemd-resolved[264]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Sep 4 17:49:56.925206 systemd-resolved[264]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Sep 4 17:49:56.927724 systemd-resolved[264]: Defaulting to hostname 'linux'. Sep 4 17:49:56.929135 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Sep 4 17:49:56.934313 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Sep 4 17:49:57.007096 kernel: SCSI subsystem initialized Sep 4 17:49:57.016096 kernel: Loading iSCSI transport class v2.0-870. Sep 4 17:49:57.027082 kernel: iscsi: registered transport (tcp) Sep 4 17:49:57.050085 kernel: iscsi: registered transport (qla4xxx) Sep 4 17:49:57.050118 kernel: QLogic iSCSI HBA Driver Sep 4 17:49:57.073159 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Sep 4 17:49:57.090898 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Sep 4 17:49:57.091704 systemd[1]: Reached target network-pre.target - Preparation for Network. Sep 4 17:49:57.160249 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Sep 4 17:49:57.162312 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Sep 4 17:49:57.222075 kernel: raid6: avx2x4 gen() 30562 MB/s Sep 4 17:49:57.239068 kernel: raid6: avx2x2 gen() 31177 MB/s Sep 4 17:49:57.256108 kernel: raid6: avx2x1 gen() 25943 MB/s Sep 4 17:49:57.256125 kernel: raid6: using algorithm avx2x2 gen() 31177 MB/s Sep 4 17:49:57.274106 kernel: raid6: .... xor() 19927 MB/s, rmw enabled Sep 4 17:49:57.274130 kernel: raid6: using avx2x2 recovery algorithm Sep 4 17:49:57.296079 kernel: xor: automatically using best checksumming function avx Sep 4 17:49:57.465075 kernel: Btrfs loaded, zoned=no, fsverity=no Sep 4 17:49:57.475390 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Sep 4 17:49:57.478488 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 4 17:49:57.509782 systemd-udevd[472]: Using default interface naming scheme 'v255'. 
Sep 4 17:49:57.515514 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 4 17:49:57.516676 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Sep 4 17:49:57.541279 dracut-pre-trigger[474]: rd.md=0: removing MD RAID activation Sep 4 17:49:57.571859 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Sep 4 17:49:57.574389 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Sep 4 17:49:57.659719 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Sep 4 17:49:57.661228 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Sep 4 17:49:57.725163 kernel: cryptd: max_cpu_qlen set to 1000 Sep 4 17:49:57.726111 kernel: libata version 3.00 loaded. Sep 4 17:49:57.729109 kernel: virtio_blk virtio1: 4/0/0 default/read/poll queues Sep 4 17:49:57.731316 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 4 17:49:57.734625 kernel: virtio_blk virtio1: [vda] 19775488 512-byte logical blocks (10.1 GB/9.43 GiB) Sep 4 17:49:57.731445 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Sep 4 17:49:57.736081 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Sep 4 17:49:57.741358 kernel: AES CTR mode by8 optimization enabled Sep 4 17:49:57.746645 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 4 17:49:57.756618 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Sep 4 17:49:57.756641 kernel: GPT:9289727 != 19775487 Sep 4 17:49:57.756651 kernel: GPT:Alternate GPT header not at the end of the disk. Sep 4 17:49:57.756661 kernel: GPT:9289727 != 19775487 Sep 4 17:49:57.756671 kernel: GPT: Use GNU Parted to correct GPT errors. Sep 4 17:49:57.756681 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Sep 4 17:49:57.763616 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input2 Sep 4 17:49:57.762870 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 4 17:49:57.762991 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Sep 4 17:49:57.767840 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 4 17:49:57.774223 kernel: ahci 0000:00:1f.2: version 3.0 Sep 4 17:49:57.774491 kernel: ACPI: \_SB_.GSIA: Enabled at IRQ 16 Sep 4 17:49:57.777075 kernel: ahci 0000:00:1f.2: AHCI vers 0001.0000, 32 command slots, 1.5 Gbps, SATA mode Sep 4 17:49:57.777245 kernel: ahci 0000:00:1f.2: 6/6 ports implemented (port mask 0x3f) Sep 4 17:49:57.777387 kernel: ahci 0000:00:1f.2: flags: 64bit ncq only Sep 4 17:49:57.787077 kernel: scsi host0: ahci Sep 4 17:49:57.791125 kernel: scsi host1: ahci Sep 4 17:49:57.798119 kernel: scsi host2: ahci Sep 4 17:49:57.799070 kernel: scsi host3: ahci Sep 4 17:49:57.800101 kernel: scsi host4: ahci Sep 4 17:49:57.804084 kernel: scsi host5: ahci Sep 4 17:49:57.804330 kernel: ata1: SATA max UDMA/133 abar m4096@0xc1040000 port 0xc1040100 irq 34 lpm-pol 1 Sep 4 17:49:57.804344 kernel: ata2: SATA max UDMA/133 abar m4096@0xc1040000 port 0xc1040180 irq 34 lpm-pol 1 Sep 4 17:49:57.804363 kernel: ata3: SATA max UDMA/133 abar m4096@0xc1040000 port 0xc1040200 irq 34 lpm-pol 1 Sep 4 17:49:57.802504 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT. 
Sep 4 17:49:57.808559 kernel: ata4: SATA max UDMA/133 abar m4096@0xc1040000 port 0xc1040280 irq 34 lpm-pol 1 Sep 4 17:49:57.808574 kernel: ata5: SATA max UDMA/133 abar m4096@0xc1040000 port 0xc1040300 irq 34 lpm-pol 1 Sep 4 17:49:57.808586 kernel: ata6: SATA max UDMA/133 abar m4096@0xc1040000 port 0xc1040380 irq 34 lpm-pol 1 Sep 4 17:49:57.806991 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 4 17:49:57.824409 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM. Sep 4 17:49:57.845734 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Sep 4 17:49:57.852955 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A. Sep 4 17:49:57.853229 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132. Sep 4 17:49:57.854439 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Sep 4 17:49:58.118112 kernel: ata1: SATA link down (SStatus 0 SControl 300) Sep 4 17:49:58.124689 kernel: ata5: SATA link down (SStatus 0 SControl 300) Sep 4 17:49:58.124721 kernel: ata3: SATA link up 1.5 Gbps (SStatus 113 SControl 300) Sep 4 17:49:58.125062 kernel: ata2: SATA link down (SStatus 0 SControl 300) Sep 4 17:49:58.126102 kernel: ata6: SATA link down (SStatus 0 SControl 300) Sep 4 17:49:58.127078 kernel: ata3.00: LPM support broken, forcing max_power Sep 4 17:49:58.127097 kernel: ata3.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100 Sep 4 17:49:58.127455 kernel: ata3.00: applying bridge limits Sep 4 17:49:58.128076 kernel: ata4: SATA link down (SStatus 0 SControl 300) Sep 4 17:49:58.129082 kernel: ata3.00: LPM support broken, forcing max_power Sep 4 17:49:58.130077 kernel: ata3.00: configured for UDMA/100 Sep 4 17:49:58.130094 kernel: scsi 2:0:0:0: CD-ROM QEMU QEMU DVD-ROM 2.5+ PQ: 0 ANSI: 5 Sep 4 17:49:58.185066 kernel: sr 2:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray Sep 4 17:49:58.185289 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20 Sep 4 17:49:58.205893 disk-uuid[634]: Primary Header is updated. Sep 4 17:49:58.205893 disk-uuid[634]: Secondary Entries is updated. Sep 4 17:49:58.205893 disk-uuid[634]: Secondary Header is updated. Sep 4 17:49:58.208966 kernel: sr 2:0:0:0: Attached scsi CD-ROM sr0 Sep 4 17:49:58.209208 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Sep 4 17:49:58.539000 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Sep 4 17:49:58.540770 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Sep 4 17:49:58.542297 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 4 17:49:58.543445 systemd[1]: Reached target remote-fs.target - Remote File Systems. Sep 4 17:49:58.544721 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Sep 4 17:49:58.569710 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Sep 4 17:49:59.217716 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Sep 4 17:49:59.217779 disk-uuid[635]: The operation has completed successfully. Sep 4 17:49:59.256859 systemd[1]: disk-uuid.service: Deactivated successfully. Sep 4 17:49:59.256987 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Sep 4 17:49:59.290434 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... 
Sep 4 17:49:59.315849 sh[663]: Success Sep 4 17:49:59.335918 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Sep 4 17:49:59.335985 kernel: device-mapper: uevent: version 1.0.3 Sep 4 17:49:59.336013 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Sep 4 17:49:59.347147 kernel: device-mapper: verity: sha256 using shash "sha256-ni" Sep 4 17:49:59.379878 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Sep 4 17:49:59.384900 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Sep 4 17:49:59.399091 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. Sep 4 17:49:59.405070 kernel: BTRFS: device fsid 03d586f6-54f4-4e78-a040-c693154b15e4 devid 1 transid 38 /dev/mapper/usr (253:0) scanned by mount (675) Sep 4 17:49:59.407105 kernel: BTRFS info (device dm-0): first mount of filesystem 03d586f6-54f4-4e78-a040-c693154b15e4 Sep 4 17:49:59.407129 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Sep 4 17:49:59.411308 kernel: BTRFS info (device dm-0): disabling log replay at mount time Sep 4 17:49:59.411327 kernel: BTRFS info (device dm-0): enabling free space tree Sep 4 17:49:59.412935 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Sep 4 17:49:59.415526 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Sep 4 17:49:59.418141 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Sep 4 17:49:59.420947 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Sep 4 17:49:59.424020 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Sep 4 17:49:59.446464 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (707) Sep 4 17:49:59.446508 kernel: BTRFS info (device vda6): first mount of filesystem acf6a5d7-9c2b-468d-9430-e8b3ed6a78f4 Sep 4 17:49:59.446520 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Sep 4 17:49:59.450207 kernel: BTRFS info (device vda6): turning on async discard Sep 4 17:49:59.450230 kernel: BTRFS info (device vda6): enabling free space tree Sep 4 17:49:59.455077 kernel: BTRFS info (device vda6): last unmount of filesystem acf6a5d7-9c2b-468d-9430-e8b3ed6a78f4 Sep 4 17:49:59.456508 systemd[1]: Finished ignition-setup.service - Ignition (setup). Sep 4 17:49:59.459415 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Sep 4 17:49:59.547963 ignition[745]: Ignition 2.22.0 Sep 4 17:49:59.548593 ignition[745]: Stage: fetch-offline Sep 4 17:49:59.548641 ignition[745]: no configs at "/usr/lib/ignition/base.d" Sep 4 17:49:59.548653 ignition[745]: no config dir at "/usr/lib/ignition/base.platform.d/qemu" Sep 4 17:49:59.548755 ignition[745]: parsed url from cmdline: "" Sep 4 17:49:59.548759 ignition[745]: no config URL provided Sep 4 17:49:59.548764 ignition[745]: reading system config file "/usr/lib/ignition/user.ign" Sep 4 17:49:59.548774 ignition[745]: no config at "/usr/lib/ignition/user.ign" Sep 4 17:49:59.548797 ignition[745]: op(1): [started] loading QEMU firmware config module Sep 4 17:49:59.548802 ignition[745]: op(1): executing: "modprobe" "qemu_fw_cfg" Sep 4 17:49:59.560681 ignition[745]: op(1): [finished] loading QEMU firmware config module Sep 4 17:49:59.560707 ignition[745]: QEMU firmware config was not found. 
Ignoring... Sep 4 17:49:59.571729 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Sep 4 17:49:59.576414 systemd[1]: Starting systemd-networkd.service - Network Configuration... Sep 4 17:49:59.603474 ignition[745]: parsing config with SHA512: 31e740c9b9f199abc289117b7c60ae0f508c031ff72b78e2bce4a3be9dc75993b7959fcef70b0434666575e0411561ac9259de2e19de630424a8104204526a4d Sep 4 17:49:59.609399 unknown[745]: fetched base config from "system" Sep 4 17:49:59.609413 unknown[745]: fetched user config from "qemu" Sep 4 17:49:59.609827 ignition[745]: fetch-offline: fetch-offline passed Sep 4 17:49:59.609899 ignition[745]: Ignition finished successfully Sep 4 17:49:59.613136 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Sep 4 17:49:59.619894 systemd-networkd[853]: lo: Link UP Sep 4 17:49:59.619906 systemd-networkd[853]: lo: Gained carrier Sep 4 17:49:59.621483 systemd-networkd[853]: Enumeration completed Sep 4 17:49:59.621587 systemd[1]: Started systemd-networkd.service - Network Configuration. Sep 4 17:49:59.621854 systemd-networkd[853]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 4 17:49:59.621859 systemd-networkd[853]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Sep 4 17:49:59.622082 systemd[1]: Reached target network.target - Network. Sep 4 17:49:59.622714 systemd[1]: ignition-fetch.service - Ignition (fetch) was skipped because of an unmet condition check (ConditionPathExists=!/run/ignition.json). Sep 4 17:49:59.623478 systemd-networkd[853]: eth0: Link UP Sep 4 17:49:59.623541 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Sep 4 17:49:59.623623 systemd-networkd[853]: eth0: Gained carrier Sep 4 17:49:59.623632 systemd-networkd[853]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 4 17:49:59.639111 systemd-networkd[853]: eth0: DHCPv4 address 10.0.0.60/16, gateway 10.0.0.1 acquired from 10.0.0.1 Sep 4 17:49:59.656462 ignition[857]: Ignition 2.22.0 Sep 4 17:49:59.656475 ignition[857]: Stage: kargs Sep 4 17:49:59.656611 ignition[857]: no configs at "/usr/lib/ignition/base.d" Sep 4 17:49:59.656623 ignition[857]: no config dir at "/usr/lib/ignition/base.platform.d/qemu" Sep 4 17:49:59.657363 ignition[857]: kargs: kargs passed Sep 4 17:49:59.657409 ignition[857]: Ignition finished successfully Sep 4 17:49:59.662624 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Sep 4 17:49:59.664193 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Sep 4 17:49:59.702939 ignition[867]: Ignition 2.22.0 Sep 4 17:49:59.702953 ignition[867]: Stage: disks Sep 4 17:49:59.703100 ignition[867]: no configs at "/usr/lib/ignition/base.d" Sep 4 17:49:59.703111 ignition[867]: no config dir at "/usr/lib/ignition/base.platform.d/qemu" Sep 4 17:49:59.703856 ignition[867]: disks: disks passed Sep 4 17:49:59.703903 ignition[867]: Ignition finished successfully Sep 4 17:49:59.708502 systemd[1]: Finished ignition-disks.service - Ignition (disks). Sep 4 17:49:59.710728 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Sep 4 17:49:59.711017 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Sep 4 17:49:59.713261 systemd[1]: Reached target local-fs.target - Local File Systems. Sep 4 17:49:59.713586 systemd[1]: Reached target sysinit.target - System Initialization. 
Sep 4 17:49:59.713904 systemd[1]: Reached target basic.target - Basic System. Sep 4 17:49:59.720697 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Sep 4 17:49:59.758963 systemd-fsck[877]: ROOT: clean, 15/553520 files, 52789/553472 blocks Sep 4 17:49:59.814261 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Sep 4 17:49:59.818376 systemd[1]: Mounting sysroot.mount - /sysroot... Sep 4 17:49:59.928074 kernel: EXT4-fs (vda9): mounted filesystem b9579306-9cef-42ea-893b-17169f1ea8af r/w with ordered data mode. Quota mode: none. Sep 4 17:49:59.928629 systemd[1]: Mounted sysroot.mount - /sysroot. Sep 4 17:49:59.929491 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Sep 4 17:49:59.932445 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Sep 4 17:49:59.934825 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Sep 4 17:49:59.935489 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. Sep 4 17:49:59.935530 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Sep 4 17:49:59.935553 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Sep 4 17:49:59.949066 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Sep 4 17:49:59.952079 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Sep 4 17:49:59.957010 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (885) Sep 4 17:49:59.957040 kernel: BTRFS info (device vda6): first mount of filesystem acf6a5d7-9c2b-468d-9430-e8b3ed6a78f4 Sep 4 17:49:59.957067 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Sep 4 17:49:59.961072 kernel: BTRFS info (device vda6): turning on async discard Sep 4 17:49:59.961125 kernel: BTRFS info (device vda6): enabling free space tree Sep 4 17:49:59.962952 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Sep 4 17:49:59.988996 initrd-setup-root[909]: cut: /sysroot/etc/passwd: No such file or directory Sep 4 17:49:59.993148 initrd-setup-root[916]: cut: /sysroot/etc/group: No such file or directory Sep 4 17:49:59.997294 initrd-setup-root[923]: cut: /sysroot/etc/shadow: No such file or directory Sep 4 17:50:00.001221 initrd-setup-root[930]: cut: /sysroot/etc/gshadow: No such file or directory Sep 4 17:50:00.089060 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Sep 4 17:50:00.092381 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Sep 4 17:50:00.094956 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Sep 4 17:50:00.117113 kernel: BTRFS info (device vda6): last unmount of filesystem acf6a5d7-9c2b-468d-9430-e8b3ed6a78f4 Sep 4 17:50:00.129243 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Sep 4 17:50:00.146083 ignition[998]: INFO : Ignition 2.22.0 Sep 4 17:50:00.146083 ignition[998]: INFO : Stage: mount Sep 4 17:50:00.147996 ignition[998]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 4 17:50:00.147996 ignition[998]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu" Sep 4 17:50:00.147996 ignition[998]: INFO : mount: mount passed Sep 4 17:50:00.147996 ignition[998]: INFO : Ignition finished successfully Sep 4 17:50:00.155172 systemd[1]: Finished ignition-mount.service - Ignition (mount). 
Sep 4 17:50:00.156981 systemd[1]: Starting ignition-files.service - Ignition (files)... Sep 4 17:50:00.405215 systemd[1]: sysroot-oem.mount: Deactivated successfully. Sep 4 17:50:00.407717 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Sep 4 17:50:00.432941 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (1011) Sep 4 17:50:00.432968 kernel: BTRFS info (device vda6): first mount of filesystem acf6a5d7-9c2b-468d-9430-e8b3ed6a78f4 Sep 4 17:50:00.432980 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Sep 4 17:50:00.437071 kernel: BTRFS info (device vda6): turning on async discard Sep 4 17:50:00.437091 kernel: BTRFS info (device vda6): enabling free space tree Sep 4 17:50:00.438591 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Sep 4 17:50:00.481528 ignition[1028]: INFO : Ignition 2.22.0 Sep 4 17:50:00.481528 ignition[1028]: INFO : Stage: files Sep 4 17:50:00.483325 ignition[1028]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 4 17:50:00.483325 ignition[1028]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu" Sep 4 17:50:00.485956 ignition[1028]: DEBUG : files: compiled without relabeling support, skipping Sep 4 17:50:00.487338 ignition[1028]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Sep 4 17:50:00.487338 ignition[1028]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Sep 4 17:50:00.492304 ignition[1028]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Sep 4 17:50:00.493691 ignition[1028]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Sep 4 17:50:00.493691 ignition[1028]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Sep 4 17:50:00.493100 unknown[1028]: wrote ssh authorized keys file for user: core Sep 4 17:50:00.498164 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz" Sep 4 17:50:00.500407 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.13.2-linux-amd64.tar.gz: attempt #1 Sep 4 17:50:00.556500 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Sep 4 17:50:01.256147 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz" Sep 4 17:50:01.258206 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Sep 4 17:50:01.259830 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Sep 4 17:50:01.259830 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Sep 4 17:50:01.263106 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Sep 4 17:50:01.264700 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Sep 4 17:50:01.266409 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Sep 4 17:50:01.268018 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file 
"/sysroot/home/core/nfs-pvc.yaml" Sep 4 17:50:01.268310 systemd-networkd[853]: eth0: Gained IPv6LL Sep 4 17:50:01.270771 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Sep 4 17:50:01.274730 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Sep 4 17:50:01.276579 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Sep 4 17:50:01.278231 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw" Sep 4 17:50:01.282663 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw" Sep 4 17:50:01.285187 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw" Sep 4 17:50:01.285187 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.31.8-x86-64.raw: attempt #1 Sep 4 17:50:01.735625 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Sep 4 17:50:02.172959 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw" Sep 4 17:50:02.172959 ignition[1028]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Sep 4 17:50:02.176715 ignition[1028]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Sep 4 17:50:02.183226 ignition[1028]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Sep 4 17:50:02.183226 ignition[1028]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Sep 4 17:50:02.183226 ignition[1028]: INFO : files: op(d): [started] processing unit "coreos-metadata.service" Sep 4 17:50:02.187452 ignition[1028]: INFO : files: op(d): op(e): [started] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service" Sep 4 17:50:02.189272 ignition[1028]: INFO : files: op(d): op(e): [finished] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service" Sep 4 17:50:02.189272 ignition[1028]: INFO : files: op(d): [finished] processing unit "coreos-metadata.service" Sep 4 17:50:02.189272 ignition[1028]: INFO : files: op(f): [started] setting preset to disabled for "coreos-metadata.service" Sep 4 17:50:02.211987 ignition[1028]: INFO : files: op(f): op(10): [started] removing enablement symlink(s) for "coreos-metadata.service" Sep 4 17:50:02.219219 ignition[1028]: INFO : files: op(f): op(10): [finished] removing enablement symlink(s) for "coreos-metadata.service" Sep 4 17:50:02.221014 ignition[1028]: INFO : files: op(f): [finished] setting preset to disabled for "coreos-metadata.service" Sep 4 17:50:02.221014 ignition[1028]: INFO : files: op(11): [started] setting preset to enabled for "prepare-helm.service" Sep 4 17:50:02.223750 ignition[1028]: INFO : files: op(11): [finished] setting preset to enabled for 
"prepare-helm.service" Sep 4 17:50:02.223750 ignition[1028]: INFO : files: createResultFile: createFiles: op(12): [started] writing file "/sysroot/etc/.ignition-result.json" Sep 4 17:50:02.223750 ignition[1028]: INFO : files: createResultFile: createFiles: op(12): [finished] writing file "/sysroot/etc/.ignition-result.json" Sep 4 17:50:02.223750 ignition[1028]: INFO : files: files passed Sep 4 17:50:02.223750 ignition[1028]: INFO : Ignition finished successfully Sep 4 17:50:02.229991 systemd[1]: Finished ignition-files.service - Ignition (files). Sep 4 17:50:02.231782 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Sep 4 17:50:02.234141 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Sep 4 17:50:02.251378 systemd[1]: ignition-quench.service: Deactivated successfully. Sep 4 17:50:02.251516 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Sep 4 17:50:02.255644 initrd-setup-root-after-ignition[1058]: grep: /sysroot/oem/oem-release: No such file or directory Sep 4 17:50:02.259771 initrd-setup-root-after-ignition[1060]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Sep 4 17:50:02.259771 initrd-setup-root-after-ignition[1060]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Sep 4 17:50:02.262926 initrd-setup-root-after-ignition[1064]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Sep 4 17:50:02.264571 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Sep 4 17:50:02.267437 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Sep 4 17:50:02.270282 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Sep 4 17:50:02.317896 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Sep 4 17:50:02.318092 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Sep 4 17:50:02.319558 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Sep 4 17:50:02.321436 systemd[1]: Reached target initrd.target - Initrd Default Target. Sep 4 17:50:02.323481 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Sep 4 17:50:02.325007 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Sep 4 17:50:02.432943 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Sep 4 17:50:02.435632 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Sep 4 17:50:02.454225 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Sep 4 17:50:02.455459 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 4 17:50:02.457577 systemd[1]: Stopped target timers.target - Timer Units. Sep 4 17:50:02.459505 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Sep 4 17:50:02.459622 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Sep 4 17:50:02.461663 systemd[1]: Stopped target initrd.target - Initrd Default Target. Sep 4 17:50:02.463304 systemd[1]: Stopped target basic.target - Basic System. Sep 4 17:50:02.465245 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Sep 4 17:50:02.467179 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. 
Sep 4 17:50:02.469105 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Sep 4 17:50:02.471129 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. Sep 4 17:50:02.473242 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Sep 4 17:50:02.475214 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Sep 4 17:50:02.477371 systemd[1]: Stopped target sysinit.target - System Initialization. Sep 4 17:50:02.479270 systemd[1]: Stopped target local-fs.target - Local File Systems. Sep 4 17:50:02.481347 systemd[1]: Stopped target swap.target - Swaps. Sep 4 17:50:02.483043 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Sep 4 17:50:02.483170 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Sep 4 17:50:02.485189 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Sep 4 17:50:02.486719 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 4 17:50:02.488689 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Sep 4 17:50:02.488827 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Sep 4 17:50:02.490799 systemd[1]: dracut-initqueue.service: Deactivated successfully. Sep 4 17:50:02.490909 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Sep 4 17:50:02.492996 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Sep 4 17:50:02.493123 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Sep 4 17:50:02.495029 systemd[1]: Stopped target paths.target - Path Units. Sep 4 17:50:02.496685 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Sep 4 17:50:02.501191 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Sep 4 17:50:02.503247 systemd[1]: Stopped target slices.target - Slice Units. Sep 4 17:50:02.504825 systemd[1]: Stopped target sockets.target - Socket Units. Sep 4 17:50:02.506710 systemd[1]: iscsid.socket: Deactivated successfully. Sep 4 17:50:02.506822 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Sep 4 17:50:02.509014 systemd[1]: iscsiuio.socket: Deactivated successfully. Sep 4 17:50:02.509130 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Sep 4 17:50:02.510820 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Sep 4 17:50:02.510954 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Sep 4 17:50:02.512824 systemd[1]: ignition-files.service: Deactivated successfully. Sep 4 17:50:02.512934 systemd[1]: Stopped ignition-files.service - Ignition (files). Sep 4 17:50:02.515534 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Sep 4 17:50:02.517780 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Sep 4 17:50:02.518728 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Sep 4 17:50:02.518884 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Sep 4 17:50:02.520945 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Sep 4 17:50:02.521076 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Sep 4 17:50:02.527574 systemd[1]: initrd-cleanup.service: Deactivated successfully. Sep 4 17:50:02.527677 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. 
Sep 4 17:50:02.546894 systemd[1]: sysroot-boot.mount: Deactivated successfully. Sep 4 17:50:02.558861 ignition[1084]: INFO : Ignition 2.22.0 Sep 4 17:50:02.558861 ignition[1084]: INFO : Stage: umount Sep 4 17:50:02.560664 ignition[1084]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 4 17:50:02.560664 ignition[1084]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu" Sep 4 17:50:02.560664 ignition[1084]: INFO : umount: umount passed Sep 4 17:50:02.560664 ignition[1084]: INFO : Ignition finished successfully Sep 4 17:50:02.566072 systemd[1]: ignition-mount.service: Deactivated successfully. Sep 4 17:50:02.566222 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Sep 4 17:50:02.567129 systemd[1]: Stopped target network.target - Network. Sep 4 17:50:02.570376 systemd[1]: ignition-disks.service: Deactivated successfully. Sep 4 17:50:02.570452 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Sep 4 17:50:02.572335 systemd[1]: ignition-kargs.service: Deactivated successfully. Sep 4 17:50:02.572384 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Sep 4 17:50:02.572927 systemd[1]: ignition-setup.service: Deactivated successfully. Sep 4 17:50:02.572978 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Sep 4 17:50:02.573484 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Sep 4 17:50:02.573526 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Sep 4 17:50:02.573916 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Sep 4 17:50:02.574390 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Sep 4 17:50:02.589819 systemd[1]: systemd-resolved.service: Deactivated successfully. Sep 4 17:50:02.589964 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Sep 4 17:50:02.594390 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully. Sep 4 17:50:02.594645 systemd[1]: systemd-networkd.service: Deactivated successfully. Sep 4 17:50:02.594789 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Sep 4 17:50:02.598721 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully. Sep 4 17:50:02.599514 systemd[1]: Stopped target network-pre.target - Preparation for Network. Sep 4 17:50:02.601976 systemd[1]: systemd-networkd.socket: Deactivated successfully. Sep 4 17:50:02.602027 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Sep 4 17:50:02.605037 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Sep 4 17:50:02.606778 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Sep 4 17:50:02.606835 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Sep 4 17:50:02.607650 systemd[1]: systemd-sysctl.service: Deactivated successfully. Sep 4 17:50:02.607696 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Sep 4 17:50:02.611869 systemd[1]: systemd-modules-load.service: Deactivated successfully. Sep 4 17:50:02.611915 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Sep 4 17:50:02.612461 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Sep 4 17:50:02.612508 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 4 17:50:02.616911 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... 
Sep 4 17:50:02.618488 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. Sep 4 17:50:02.618555 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully. Sep 4 17:50:02.638080 systemd[1]: systemd-udevd.service: Deactivated successfully. Sep 4 17:50:02.643295 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 4 17:50:02.643922 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Sep 4 17:50:02.643971 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Sep 4 17:50:02.645945 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Sep 4 17:50:02.645984 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Sep 4 17:50:02.646249 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Sep 4 17:50:02.646295 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Sep 4 17:50:02.646909 systemd[1]: dracut-cmdline.service: Deactivated successfully. Sep 4 17:50:02.646958 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Sep 4 17:50:02.647701 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Sep 4 17:50:02.647750 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Sep 4 17:50:02.657122 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Sep 4 17:50:02.657726 systemd[1]: systemd-network-generator.service: Deactivated successfully. Sep 4 17:50:02.657786 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line. Sep 4 17:50:02.661494 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Sep 4 17:50:02.661546 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Sep 4 17:50:02.664604 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully. Sep 4 17:50:02.664654 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Sep 4 17:50:02.667916 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Sep 4 17:50:02.667964 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Sep 4 17:50:02.668468 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 4 17:50:02.668511 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Sep 4 17:50:02.674665 systemd[1]: run-credentials-systemd\x2dnetwork\x2dgenerator.service.mount: Deactivated successfully. Sep 4 17:50:02.674729 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev\x2dearly.service.mount: Deactivated successfully. Sep 4 17:50:02.674773 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully. Sep 4 17:50:02.674820 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Sep 4 17:50:02.675161 systemd[1]: network-cleanup.service: Deactivated successfully. Sep 4 17:50:02.678400 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Sep 4 17:50:02.686586 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Sep 4 17:50:02.686725 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Sep 4 17:50:02.742701 systemd[1]: sysroot-boot.service: Deactivated successfully. Sep 4 17:50:02.742857 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. 
Sep 4 17:50:02.743695 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Sep 4 17:50:02.747338 systemd[1]: initrd-setup-root.service: Deactivated successfully. Sep 4 17:50:02.747416 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Sep 4 17:50:02.748657 systemd[1]: Starting initrd-switch-root.service - Switch Root... Sep 4 17:50:02.772112 systemd[1]: Switching root. Sep 4 17:50:02.813503 systemd-journald[218]: Journal stopped Sep 4 17:50:04.053225 systemd-journald[218]: Received SIGTERM from PID 1 (systemd). Sep 4 17:50:04.053308 kernel: SELinux: policy capability network_peer_controls=1 Sep 4 17:50:04.053323 kernel: SELinux: policy capability open_perms=1 Sep 4 17:50:04.053335 kernel: SELinux: policy capability extended_socket_class=1 Sep 4 17:50:04.053346 kernel: SELinux: policy capability always_check_network=0 Sep 4 17:50:04.053357 kernel: SELinux: policy capability cgroup_seclabel=1 Sep 4 17:50:04.053382 kernel: SELinux: policy capability nnp_nosuid_transition=1 Sep 4 17:50:04.053400 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Sep 4 17:50:04.053411 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Sep 4 17:50:04.053422 kernel: SELinux: policy capability userspace_initial_context=0 Sep 4 17:50:04.053434 kernel: audit: type=1403 audit(1757008203.252:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 Sep 4 17:50:04.053447 systemd[1]: Successfully loaded SELinux policy in 64.715ms. Sep 4 17:50:04.053468 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 9.544ms. Sep 4 17:50:04.053487 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Sep 4 17:50:04.053500 systemd[1]: Detected virtualization kvm. Sep 4 17:50:04.053514 systemd[1]: Detected architecture x86-64. Sep 4 17:50:04.053527 systemd[1]: Detected first boot. Sep 4 17:50:04.053538 systemd[1]: Initializing machine ID from VM UUID. Sep 4 17:50:04.053550 zram_generator::config[1135]: No configuration found. Sep 4 17:50:04.053581 kernel: Guest personality initialized and is inactive Sep 4 17:50:04.053593 kernel: VMCI host device registered (name=vmci, major=10, minor=125) Sep 4 17:50:04.053605 kernel: Initialized host personality Sep 4 17:50:04.053621 kernel: NET: Registered PF_VSOCK protocol family Sep 4 17:50:04.053640 systemd[1]: Populated /etc with preset unit settings. Sep 4 17:50:04.053653 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully. Sep 4 17:50:04.053675 systemd[1]: initrd-switch-root.service: Deactivated successfully. Sep 4 17:50:04.053687 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Sep 4 17:50:04.053699 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Sep 4 17:50:04.053712 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Sep 4 17:50:04.053724 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Sep 4 17:50:04.053736 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Sep 4 17:50:04.053748 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Sep 4 17:50:04.053766 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. 
Sep 4 17:50:04.053778 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Sep 4 17:50:04.053790 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Sep 4 17:50:04.053802 systemd[1]: Created slice user.slice - User and Session Slice. Sep 4 17:50:04.053814 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Sep 4 17:50:04.053827 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Sep 4 17:50:04.053839 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Sep 4 17:50:04.053851 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Sep 4 17:50:04.053866 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Sep 4 17:50:04.053878 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Sep 4 17:50:04.053890 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0... Sep 4 17:50:04.053902 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 4 17:50:04.053914 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Sep 4 17:50:04.053926 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Sep 4 17:50:04.053945 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Sep 4 17:50:04.053957 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Sep 4 17:50:04.053972 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Sep 4 17:50:04.053985 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 4 17:50:04.053997 systemd[1]: Reached target remote-fs.target - Remote File Systems. Sep 4 17:50:04.054009 systemd[1]: Reached target slices.target - Slice Units. Sep 4 17:50:04.054020 systemd[1]: Reached target swap.target - Swaps. Sep 4 17:50:04.054032 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Sep 4 17:50:04.054059 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Sep 4 17:50:04.054071 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. Sep 4 17:50:04.054083 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Sep 4 17:50:04.054099 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Sep 4 17:50:04.054110 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Sep 4 17:50:04.054122 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Sep 4 17:50:04.054134 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Sep 4 17:50:04.054146 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Sep 4 17:50:04.054158 systemd[1]: Mounting media.mount - External Media Directory... Sep 4 17:50:04.054171 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 4 17:50:04.054187 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Sep 4 17:50:04.054199 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Sep 4 17:50:04.054214 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... 
Sep 4 17:50:04.054234 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Sep 4 17:50:04.054247 systemd[1]: Reached target machines.target - Containers. Sep 4 17:50:04.054262 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Sep 4 17:50:04.054277 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 4 17:50:04.054292 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Sep 4 17:50:04.054312 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Sep 4 17:50:04.054325 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Sep 4 17:50:04.054342 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Sep 4 17:50:04.054355 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Sep 4 17:50:04.054374 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Sep 4 17:50:04.054386 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Sep 4 17:50:04.054398 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Sep 4 17:50:04.054411 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Sep 4 17:50:04.054423 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Sep 4 17:50:04.054435 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Sep 4 17:50:04.054447 systemd[1]: Stopped systemd-fsck-usr.service. Sep 4 17:50:04.054464 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Sep 4 17:50:04.054477 systemd[1]: Starting systemd-journald.service - Journal Service... Sep 4 17:50:04.054489 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Sep 4 17:50:04.054501 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Sep 4 17:50:04.054512 kernel: loop: module loaded Sep 4 17:50:04.054537 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Sep 4 17:50:04.054549 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... Sep 4 17:50:04.054560 kernel: ACPI: bus type drm_connector registered Sep 4 17:50:04.054572 kernel: fuse: init (API version 7.41) Sep 4 17:50:04.054583 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Sep 4 17:50:04.054600 systemd[1]: verity-setup.service: Deactivated successfully. Sep 4 17:50:04.054612 systemd[1]: Stopped verity-setup.service. Sep 4 17:50:04.054624 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 4 17:50:04.054637 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Sep 4 17:50:04.054678 systemd-journald[1201]: Collecting audit messages is disabled. Sep 4 17:50:04.054709 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. 
Sep 4 17:50:04.054723 systemd-journald[1201]: Journal started Sep 4 17:50:04.054751 systemd-journald[1201]: Runtime Journal (/run/log/journal/23ce83d2c05e484d99b02de9101999b3) is 6M, max 48.4M, 42.4M free. Sep 4 17:50:04.059257 systemd[1]: Mounted media.mount - External Media Directory. Sep 4 17:50:04.059333 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Sep 4 17:50:03.820793 systemd[1]: Queued start job for default target multi-user.target. Sep 4 17:50:03.836267 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6. Sep 4 17:50:03.836771 systemd[1]: systemd-journald.service: Deactivated successfully. Sep 4 17:50:04.061241 systemd[1]: Started systemd-journald.service - Journal Service. Sep 4 17:50:04.062872 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Sep 4 17:50:04.064270 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Sep 4 17:50:04.065728 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Sep 4 17:50:04.067387 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Sep 4 17:50:04.068978 systemd[1]: modprobe@configfs.service: Deactivated successfully. Sep 4 17:50:04.069396 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Sep 4 17:50:04.070945 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Sep 4 17:50:04.071258 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Sep 4 17:50:04.072741 systemd[1]: modprobe@drm.service: Deactivated successfully. Sep 4 17:50:04.072986 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Sep 4 17:50:04.074392 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Sep 4 17:50:04.074697 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Sep 4 17:50:04.076285 systemd[1]: modprobe@fuse.service: Deactivated successfully. Sep 4 17:50:04.076544 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Sep 4 17:50:04.077913 systemd[1]: modprobe@loop.service: Deactivated successfully. Sep 4 17:50:04.078186 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Sep 4 17:50:04.079726 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Sep 4 17:50:04.081303 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Sep 4 17:50:04.082894 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Sep 4 17:50:04.084462 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. Sep 4 17:50:04.098552 systemd[1]: Reached target network-pre.target - Preparation for Network. Sep 4 17:50:04.101098 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Sep 4 17:50:04.103226 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Sep 4 17:50:04.104479 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Sep 4 17:50:04.104509 systemd[1]: Reached target local-fs.target - Local File Systems. Sep 4 17:50:04.105873 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. Sep 4 17:50:04.114148 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Sep 4 17:50:04.115497 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. 
Sep 4 17:50:04.116967 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Sep 4 17:50:04.121175 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Sep 4 17:50:04.122351 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Sep 4 17:50:04.124147 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Sep 4 17:50:04.125998 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Sep 4 17:50:04.134193 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Sep 4 17:50:04.136380 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Sep 4 17:50:04.151625 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Sep 4 17:50:04.161722 systemd-journald[1201]: Time spent on flushing to /var/log/journal/23ce83d2c05e484d99b02de9101999b3 is 15.097ms for 1073 entries. Sep 4 17:50:04.161722 systemd-journald[1201]: System Journal (/var/log/journal/23ce83d2c05e484d99b02de9101999b3) is 8M, max 195.6M, 187.6M free. Sep 4 17:50:04.186758 systemd-journald[1201]: Received client request to flush runtime journal. Sep 4 17:50:04.186794 kernel: loop0: detected capacity change from 0 to 110984 Sep 4 17:50:04.157151 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Sep 4 17:50:04.158703 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Sep 4 17:50:04.163134 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Sep 4 17:50:04.166694 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Sep 4 17:50:04.169507 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... Sep 4 17:50:04.171028 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Sep 4 17:50:04.176410 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Sep 4 17:50:04.189435 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Sep 4 17:50:04.204154 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Sep 4 17:50:04.211482 systemd-tmpfiles[1252]: ACLs are not supported, ignoring. Sep 4 17:50:04.211501 systemd-tmpfiles[1252]: ACLs are not supported, ignoring. Sep 4 17:50:04.216092 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. Sep 4 17:50:04.221123 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Sep 4 17:50:04.224092 systemd[1]: Starting systemd-sysusers.service - Create System Users... Sep 4 17:50:04.228124 kernel: loop1: detected capacity change from 0 to 221472 Sep 4 17:50:04.261087 kernel: loop2: detected capacity change from 0 to 128016 Sep 4 17:50:04.265719 systemd[1]: Finished systemd-sysusers.service - Create System Users. Sep 4 17:50:04.271885 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Sep 4 17:50:04.294083 kernel: loop3: detected capacity change from 0 to 110984 Sep 4 17:50:04.302369 systemd-tmpfiles[1274]: ACLs are not supported, ignoring. Sep 4 17:50:04.302686 systemd-tmpfiles[1274]: ACLs are not supported, ignoring. 
Sep 4 17:50:04.307072 kernel: loop4: detected capacity change from 0 to 221472 Sep 4 17:50:04.307253 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Sep 4 17:50:04.320913 kernel: loop5: detected capacity change from 0 to 128016 Sep 4 17:50:04.329448 (sd-merge)[1276]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes'. Sep 4 17:50:04.330098 (sd-merge)[1276]: Merged extensions into '/usr'. Sep 4 17:50:04.334764 systemd[1]: Reload requested from client PID 1250 ('systemd-sysext') (unit systemd-sysext.service)... Sep 4 17:50:04.334781 systemd[1]: Reloading... Sep 4 17:50:04.401125 zram_generator::config[1307]: No configuration found. Sep 4 17:50:04.555623 ldconfig[1245]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Sep 4 17:50:04.649035 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Sep 4 17:50:04.649168 systemd[1]: Reloading finished in 313 ms. Sep 4 17:50:04.679971 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Sep 4 17:50:04.681865 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Sep 4 17:50:04.705359 systemd[1]: Starting ensure-sysext.service... Sep 4 17:50:04.707665 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Sep 4 17:50:04.731260 systemd[1]: Reload requested from client PID 1341 ('systemctl') (unit ensure-sysext.service)... Sep 4 17:50:04.731279 systemd[1]: Reloading... Sep 4 17:50:04.741866 systemd-tmpfiles[1342]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. Sep 4 17:50:04.741949 systemd-tmpfiles[1342]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. Sep 4 17:50:04.742756 systemd-tmpfiles[1342]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Sep 4 17:50:04.743149 systemd-tmpfiles[1342]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. Sep 4 17:50:04.744437 systemd-tmpfiles[1342]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Sep 4 17:50:04.744727 systemd-tmpfiles[1342]: ACLs are not supported, ignoring. Sep 4 17:50:04.744796 systemd-tmpfiles[1342]: ACLs are not supported, ignoring. Sep 4 17:50:04.750437 systemd-tmpfiles[1342]: Detected autofs mount point /boot during canonicalization of boot. Sep 4 17:50:04.750449 systemd-tmpfiles[1342]: Skipping /boot Sep 4 17:50:04.763490 systemd-tmpfiles[1342]: Detected autofs mount point /boot during canonicalization of boot. Sep 4 17:50:04.763504 systemd-tmpfiles[1342]: Skipping /boot Sep 4 17:50:04.840430 zram_generator::config[1369]: No configuration found. Sep 4 17:50:05.024827 systemd[1]: Reloading finished in 293 ms. Sep 4 17:50:05.049732 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Sep 4 17:50:05.070351 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 4 17:50:05.079392 systemd[1]: Starting audit-rules.service - Load Audit Rules... Sep 4 17:50:05.082013 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Sep 4 17:50:05.084426 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Sep 4 17:50:05.102821 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... 
Sep 4 17:50:05.106229 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 4 17:50:05.109240 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Sep 4 17:50:05.114536 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 4 17:50:05.114708 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 4 17:50:05.119235 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Sep 4 17:50:05.122153 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Sep 4 17:50:05.125486 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Sep 4 17:50:05.128122 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 4 17:50:05.128246 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Sep 4 17:50:05.130027 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Sep 4 17:50:05.132284 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 4 17:50:05.134123 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Sep 4 17:50:05.135847 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Sep 4 17:50:05.141649 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Sep 4 17:50:05.147195 systemd[1]: modprobe@loop.service: Deactivated successfully. Sep 4 17:50:05.147433 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Sep 4 17:50:05.150666 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Sep 4 17:50:05.150919 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Sep 4 17:50:05.151387 systemd-udevd[1412]: Using default interface naming scheme 'v255'. Sep 4 17:50:05.160704 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 4 17:50:05.160936 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 4 17:50:05.163494 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Sep 4 17:50:05.167015 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Sep 4 17:50:05.175321 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Sep 4 17:50:05.178347 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Sep 4 17:50:05.179482 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 4 17:50:05.179593 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Sep 4 17:50:05.180442 augenrules[1445]: No rules Sep 4 17:50:05.181386 systemd[1]: Starting systemd-update-done.service - Update is Completed... 
Sep 4 17:50:05.182460 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 4 17:50:05.184628 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Sep 4 17:50:05.186577 systemd[1]: audit-rules.service: Deactivated successfully. Sep 4 17:50:05.197158 systemd[1]: Finished audit-rules.service - Load Audit Rules. Sep 4 17:50:05.198849 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 4 17:50:05.200440 systemd[1]: Started systemd-userdbd.service - User Database Manager. Sep 4 17:50:05.202326 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Sep 4 17:50:05.204038 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Sep 4 17:50:05.204277 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Sep 4 17:50:05.206062 systemd[1]: modprobe@drm.service: Deactivated successfully. Sep 4 17:50:05.206346 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Sep 4 17:50:05.210392 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Sep 4 17:50:05.210634 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Sep 4 17:50:05.212444 systemd[1]: modprobe@loop.service: Deactivated successfully. Sep 4 17:50:05.212660 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Sep 4 17:50:05.214298 systemd[1]: Finished systemd-update-done.service - Update is Completed. Sep 4 17:50:05.231212 systemd[1]: Finished ensure-sysext.service. Sep 4 17:50:05.239345 systemd[1]: Starting systemd-networkd.service - Network Configuration... Sep 4 17:50:05.240499 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Sep 4 17:50:05.240564 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Sep 4 17:50:05.242894 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization... Sep 4 17:50:05.244152 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Sep 4 17:50:05.293403 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped. Sep 4 17:50:05.334356 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Sep 4 17:50:05.337408 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Sep 4 17:50:05.343073 kernel: mousedev: PS/2 mouse device common for all mice Sep 4 17:50:05.361097 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input3 Sep 4 17:50:05.364532 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Sep 4 17:50:05.367065 kernel: ACPI: button: Power Button [PWRF] Sep 4 17:50:05.369091 systemd-resolved[1411]: Positive Trust Anchors: Sep 4 17:50:05.369104 systemd-resolved[1411]: . 
IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Sep 4 17:50:05.369134 systemd-resolved[1411]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Sep 4 17:50:05.373568 systemd-resolved[1411]: Defaulting to hostname 'linux'. Sep 4 17:50:05.375775 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Sep 4 17:50:05.377109 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Sep 4 17:50:05.417197 systemd-networkd[1492]: lo: Link UP Sep 4 17:50:05.417208 systemd-networkd[1492]: lo: Gained carrier Sep 4 17:50:05.421937 systemd-networkd[1492]: Enumeration completed Sep 4 17:50:05.422185 systemd[1]: Started systemd-networkd.service - Network Configuration. Sep 4 17:50:05.422373 systemd-networkd[1492]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 4 17:50:05.422378 systemd-networkd[1492]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Sep 4 17:50:05.423642 systemd-networkd[1492]: eth0: Link UP Sep 4 17:50:05.423821 systemd-networkd[1492]: eth0: Gained carrier Sep 4 17:50:05.423844 systemd-networkd[1492]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 4 17:50:05.424149 systemd[1]: Reached target network.target - Network. Sep 4 17:50:05.428218 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Sep 4 17:50:05.440139 systemd-networkd[1492]: eth0: DHCPv4 address 10.0.0.60/16, gateway 10.0.0.1 acquired from 10.0.0.1 Sep 4 17:50:06.964978 systemd-resolved[1411]: Clock change detected. Flushing caches. Sep 4 17:50:06.965633 systemd-timesyncd[1493]: Contacted time server 10.0.0.1:123 (10.0.0.1). Sep 4 17:50:06.965689 systemd-timesyncd[1493]: Initial clock synchronization to Thu 2025-09-04 17:50:06.964935 UTC. Sep 4 17:50:06.968464 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Sep 4 17:50:06.969726 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. Sep 4 17:50:06.971513 systemd[1]: Reached target sysinit.target - System Initialization. Sep 4 17:50:06.972663 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Sep 4 17:50:06.973887 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Sep 4 17:50:06.978044 kernel: i801_smbus 0000:00:1f.3: Enabling SMBus device Sep 4 17:50:06.978383 kernel: i801_smbus 0000:00:1f.3: SMBus using PCI interrupt Sep 4 17:50:06.978560 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD Sep 4 17:50:06.979012 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer. Sep 4 17:50:06.980154 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. 
Sep 4 17:50:06.981353 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Sep 4 17:50:06.981376 systemd[1]: Reached target paths.target - Path Units. Sep 4 17:50:06.982461 systemd[1]: Reached target time-set.target - System Time Set. Sep 4 17:50:06.984457 systemd[1]: Started logrotate.timer - Daily rotation of log files. Sep 4 17:50:06.987342 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Sep 4 17:50:06.988566 systemd[1]: Reached target timers.target - Timer Units. Sep 4 17:50:06.990531 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Sep 4 17:50:06.994971 systemd[1]: Starting docker.socket - Docker Socket for the API... Sep 4 17:50:07.000525 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Sep 4 17:50:07.002679 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Sep 4 17:50:07.003928 systemd[1]: Reached target ssh-access.target - SSH Access Available. Sep 4 17:50:07.013010 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Sep 4 17:50:07.014685 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Sep 4 17:50:07.017059 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Sep 4 17:50:07.019998 systemd[1]: Listening on docker.socket - Docker Socket for the API. Sep 4 17:50:07.027833 systemd[1]: Reached target sockets.target - Socket Units. Sep 4 17:50:07.030291 systemd[1]: Reached target basic.target - Basic System. Sep 4 17:50:07.031343 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Sep 4 17:50:07.031448 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Sep 4 17:50:07.032959 systemd[1]: Starting containerd.service - containerd container runtime... Sep 4 17:50:07.035395 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Sep 4 17:50:07.038432 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Sep 4 17:50:07.041393 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Sep 4 17:50:07.043911 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Sep 4 17:50:07.045190 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Sep 4 17:50:07.047414 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh... Sep 4 17:50:07.133056 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Sep 4 17:50:07.136672 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Sep 4 17:50:07.142554 oslogin_cache_refresh[1536]: Refreshing passwd entry cache Sep 4 17:50:07.143427 google_oslogin_nss_cache[1536]: oslogin_cache_refresh[1536]: Refreshing passwd entry cache Sep 4 17:50:07.145602 jq[1534]: false Sep 4 17:50:07.147184 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Sep 4 17:50:07.150578 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... 
Sep 4 17:50:07.151932 google_oslogin_nss_cache[1536]: oslogin_cache_refresh[1536]: Failure getting users, quitting Sep 4 17:50:07.151932 google_oslogin_nss_cache[1536]: oslogin_cache_refresh[1536]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Sep 4 17:50:07.151932 google_oslogin_nss_cache[1536]: oslogin_cache_refresh[1536]: Refreshing group entry cache Sep 4 17:50:07.151472 oslogin_cache_refresh[1536]: Failure getting users, quitting Sep 4 17:50:07.151488 oslogin_cache_refresh[1536]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Sep 4 17:50:07.151536 oslogin_cache_refresh[1536]: Refreshing group entry cache Sep 4 17:50:07.157553 systemd[1]: Starting systemd-logind.service - User Login Management... Sep 4 17:50:07.161185 google_oslogin_nss_cache[1536]: oslogin_cache_refresh[1536]: Failure getting groups, quitting Sep 4 17:50:07.161185 google_oslogin_nss_cache[1536]: oslogin_cache_refresh[1536]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Sep 4 17:50:07.158706 oslogin_cache_refresh[1536]: Failure getting groups, quitting Sep 4 17:50:07.158716 oslogin_cache_refresh[1536]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Sep 4 17:50:07.162648 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 4 17:50:07.165439 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Sep 4 17:50:07.168099 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Sep 4 17:50:07.170700 systemd[1]: Starting update-engine.service - Update Engine... Sep 4 17:50:07.173509 extend-filesystems[1535]: Found /dev/vda6 Sep 4 17:50:07.174700 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Sep 4 17:50:07.191192 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Sep 4 17:50:07.196273 jq[1551]: true Sep 4 17:50:07.192867 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Sep 4 17:50:07.193127 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Sep 4 17:50:07.193592 systemd[1]: google-oslogin-cache.service: Deactivated successfully. Sep 4 17:50:07.193831 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh. Sep 4 17:50:07.197770 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Sep 4 17:50:07.200235 extend-filesystems[1535]: Found /dev/vda9 Sep 4 17:50:07.202886 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Sep 4 17:50:07.205679 extend-filesystems[1535]: Checking size of /dev/vda9 Sep 4 17:50:07.216310 update_engine[1550]: I20250904 17:50:07.209568 1550 main.cc:92] Flatcar Update Engine starting Sep 4 17:50:07.222037 systemd[1]: motdgen.service: Deactivated successfully. Sep 4 17:50:07.222132 (ntainerd)[1564]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Sep 4 17:50:07.222544 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. 
Sep 4 17:50:07.234156 jq[1562]: true Sep 4 17:50:07.244865 kernel: kvm_amd: TSC scaling supported Sep 4 17:50:07.244955 kernel: kvm_amd: Nested Virtualization enabled Sep 4 17:50:07.244969 kernel: kvm_amd: Nested Paging enabled Sep 4 17:50:07.244982 kernel: kvm_amd: LBR virtualization supported Sep 4 17:50:07.246014 kernel: kvm_amd: Virtual VMLOAD VMSAVE supported Sep 4 17:50:07.246038 kernel: kvm_amd: Virtual GIF supported Sep 4 17:50:07.248188 tar[1560]: linux-amd64/helm Sep 4 17:50:07.248549 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 4 17:50:07.249049 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Sep 4 17:50:07.257860 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Sep 4 17:50:07.261264 extend-filesystems[1535]: Resized partition /dev/vda9 Sep 4 17:50:07.264184 extend-filesystems[1581]: resize2fs 1.47.3 (8-Jul-2025) Sep 4 17:50:07.267418 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 4 17:50:07.272529 kernel: EXT4-fs (vda9): resizing filesystem from 553472 to 1864699 blocks Sep 4 17:50:07.302197 dbus-daemon[1532]: [system] SELinux support is enabled Sep 4 17:50:07.302638 systemd[1]: Started dbus.service - D-Bus System Message Bus. Sep 4 17:50:07.309413 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Sep 4 17:50:07.309446 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Sep 4 17:50:07.311594 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Sep 4 17:50:07.311619 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Sep 4 17:50:07.314131 kernel: EXT4-fs (vda9): resized filesystem to 1864699 Sep 4 17:50:07.334305 systemd[1]: Started update-engine.service - Update Engine. Sep 4 17:50:07.393160 update_engine[1550]: I20250904 17:50:07.380410 1550 update_check_scheduler.cc:74] Next update check in 5m37s Sep 4 17:50:07.390302 systemd[1]: Started locksmithd.service - Cluster reboot manager. Sep 4 17:50:07.395366 extend-filesystems[1581]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required Sep 4 17:50:07.395366 extend-filesystems[1581]: old_desc_blocks = 1, new_desc_blocks = 1 Sep 4 17:50:07.395366 extend-filesystems[1581]: The filesystem on /dev/vda9 is now 1864699 (4k) blocks long. Sep 4 17:50:07.398826 extend-filesystems[1535]: Resized filesystem in /dev/vda9 Sep 4 17:50:07.399125 systemd[1]: extend-filesystems.service: Deactivated successfully. Sep 4 17:50:07.399417 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Sep 4 17:50:07.401909 systemd-logind[1545]: Watching system buttons on /dev/input/event2 (Power Button) Sep 4 17:50:07.401938 systemd-logind[1545]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) Sep 4 17:50:07.402366 systemd-logind[1545]: New seat seat0. Sep 4 17:50:07.406155 bash[1597]: Updated "/home/core/.ssh/authorized_keys" Sep 4 17:50:07.448133 kernel: EDAC MC: Ver: 3.0.0 Sep 4 17:50:07.451394 systemd[1]: Started systemd-logind.service - User Login Management. 
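The resize messages above count in 4k blocks: extend-filesystems grows /dev/vda9 from 553472 to 1864699 blocks. Converted to bytes (a quick illustrative calculation; block counts and block size are copied from the log):

    # Sketch: express the ext4 block counts logged for /dev/vda9 as sizes.
    BLOCK = 4096                        # "(4k) blocks" per the extend-filesystems output
    before, after = 553_472, 1_864_699

    def gib(blocks: int) -> float:
        return blocks * BLOCK / 2**30

    print(f"before: {gib(before):.2f} GiB")   # ~2.11 GiB
    print(f"after:  {gib(after):.2f} GiB")    # ~7.11 GiB

So the root filesystem grows from roughly 2.1 GiB to about 7.1 GiB; because /dev/vda9 is mounted on /, the resize happens on-line, exactly as the extend-filesystems lines note ("mounted on /; on-line resizing required").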
Sep 4 17:50:07.453858 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Sep 4 17:50:07.455448 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 4 17:50:07.469400 systemd[1]: sshkeys.service was skipped because no trigger condition checks were met. Sep 4 17:50:07.473449 locksmithd[1601]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Sep 4 17:50:07.577913 sshd_keygen[1554]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Sep 4 17:50:07.609969 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Sep 4 17:50:07.613217 systemd[1]: Starting issuegen.service - Generate /run/issue... Sep 4 17:50:07.634771 systemd[1]: issuegen.service: Deactivated successfully. Sep 4 17:50:07.635233 containerd[1564]: time="2025-09-04T17:50:07Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Sep 4 17:50:07.635489 systemd[1]: Finished issuegen.service - Generate /run/issue. Sep 4 17:50:07.635854 containerd[1564]: time="2025-09-04T17:50:07.635557123Z" level=info msg="starting containerd" revision=fb4c30d4ede3531652d86197bf3fc9515e5276d9 version=v2.0.5 Sep 4 17:50:07.643042 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Sep 4 17:50:07.649189 containerd[1564]: time="2025-09-04T17:50:07.648216629Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="13.675µs" Sep 4 17:50:07.649189 containerd[1564]: time="2025-09-04T17:50:07.648266813Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Sep 4 17:50:07.649189 containerd[1564]: time="2025-09-04T17:50:07.648309343Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Sep 4 17:50:07.649189 containerd[1564]: time="2025-09-04T17:50:07.648542780Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Sep 4 17:50:07.649189 containerd[1564]: time="2025-09-04T17:50:07.648564030Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Sep 4 17:50:07.649189 containerd[1564]: time="2025-09-04T17:50:07.648618092Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Sep 4 17:50:07.649189 containerd[1564]: time="2025-09-04T17:50:07.648701157Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Sep 4 17:50:07.649189 containerd[1564]: time="2025-09-04T17:50:07.648719552Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Sep 4 17:50:07.649513 containerd[1564]: time="2025-09-04T17:50:07.649478085Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Sep 4 17:50:07.649576 containerd[1564]: time="2025-09-04T17:50:07.649557474Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Sep 4 17:50:07.649644 containerd[1564]: time="2025-09-04T17:50:07.649626062Z" 
level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Sep 4 17:50:07.649693 containerd[1564]: time="2025-09-04T17:50:07.649680514Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Sep 4 17:50:07.649856 containerd[1564]: time="2025-09-04T17:50:07.649835505Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Sep 4 17:50:07.650433 containerd[1564]: time="2025-09-04T17:50:07.650406787Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Sep 4 17:50:07.650537 containerd[1564]: time="2025-09-04T17:50:07.650519147Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Sep 4 17:50:07.650587 containerd[1564]: time="2025-09-04T17:50:07.650575463Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Sep 4 17:50:07.650676 containerd[1564]: time="2025-09-04T17:50:07.650661575Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Sep 4 17:50:07.651013 containerd[1564]: time="2025-09-04T17:50:07.650993968Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Sep 4 17:50:07.651209 containerd[1564]: time="2025-09-04T17:50:07.651192611Z" level=info msg="metadata content store policy set" policy=shared Sep 4 17:50:07.656636 containerd[1564]: time="2025-09-04T17:50:07.656591898Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Sep 4 17:50:07.656685 containerd[1564]: time="2025-09-04T17:50:07.656655137Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Sep 4 17:50:07.656685 containerd[1564]: time="2025-09-04T17:50:07.656677479Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Sep 4 17:50:07.656723 containerd[1564]: time="2025-09-04T17:50:07.656690794Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Sep 4 17:50:07.656723 containerd[1564]: time="2025-09-04T17:50:07.656704990Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Sep 4 17:50:07.656723 containerd[1564]: time="2025-09-04T17:50:07.656717714Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Sep 4 17:50:07.656791 containerd[1564]: time="2025-09-04T17:50:07.656733083Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Sep 4 17:50:07.656791 containerd[1564]: time="2025-09-04T17:50:07.656748302Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Sep 4 17:50:07.656791 containerd[1564]: time="2025-09-04T17:50:07.656780252Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Sep 4 17:50:07.656845 containerd[1564]: time="2025-09-04T17:50:07.656793857Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Sep 4 17:50:07.656845 containerd[1564]: 
time="2025-09-04T17:50:07.656804828Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Sep 4 17:50:07.656845 containerd[1564]: time="2025-09-04T17:50:07.656831397Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Sep 4 17:50:07.657004 containerd[1564]: time="2025-09-04T17:50:07.656975147Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Sep 4 17:50:07.657035 containerd[1564]: time="2025-09-04T17:50:07.657018288Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Sep 4 17:50:07.657055 containerd[1564]: time="2025-09-04T17:50:07.657046240Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Sep 4 17:50:07.657075 containerd[1564]: time="2025-09-04T17:50:07.657067190Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Sep 4 17:50:07.657096 containerd[1564]: time="2025-09-04T17:50:07.657079393Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Sep 4 17:50:07.657096 containerd[1564]: time="2025-09-04T17:50:07.657091505Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Sep 4 17:50:07.657178 containerd[1564]: time="2025-09-04T17:50:07.657121321Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Sep 4 17:50:07.657178 containerd[1564]: time="2025-09-04T17:50:07.657135588Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Sep 4 17:50:07.657178 containerd[1564]: time="2025-09-04T17:50:07.657147180Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Sep 4 17:50:07.657178 containerd[1564]: time="2025-09-04T17:50:07.657158551Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Sep 4 17:50:07.657178 containerd[1564]: time="2025-09-04T17:50:07.657169041Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Sep 4 17:50:07.657288 containerd[1564]: time="2025-09-04T17:50:07.657249973Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Sep 4 17:50:07.657288 containerd[1564]: time="2025-09-04T17:50:07.657265341Z" level=info msg="Start snapshots syncer" Sep 4 17:50:07.657326 containerd[1564]: time="2025-09-04T17:50:07.657299556Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Sep 4 17:50:07.657630 containerd[1564]: time="2025-09-04T17:50:07.657575704Z" level=info msg="starting cri plugin" 
config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Sep 4 17:50:07.657757 containerd[1564]: time="2025-09-04T17:50:07.657642940Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Sep 4 17:50:07.659585 containerd[1564]: time="2025-09-04T17:50:07.659559925Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Sep 4 17:50:07.659703 containerd[1564]: time="2025-09-04T17:50:07.659681614Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Sep 4 17:50:07.659733 containerd[1564]: time="2025-09-04T17:50:07.659705899Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Sep 4 17:50:07.659733 containerd[1564]: time="2025-09-04T17:50:07.659717571Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Sep 4 17:50:07.659733 containerd[1564]: time="2025-09-04T17:50:07.659729353Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Sep 4 17:50:07.659799 containerd[1564]: time="2025-09-04T17:50:07.659741165Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Sep 4 17:50:07.659799 containerd[1564]: time="2025-09-04T17:50:07.659752687Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Sep 4 17:50:07.659799 containerd[1564]: time="2025-09-04T17:50:07.659762876Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Sep 4 17:50:07.659854 containerd[1564]: time="2025-09-04T17:50:07.659786951Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Sep 4 17:50:07.659944 containerd[1564]: 
time="2025-09-04T17:50:07.659918829Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Sep 4 17:50:07.659944 containerd[1564]: time="2025-09-04T17:50:07.659936692Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Sep 4 17:50:07.659992 containerd[1564]: time="2025-09-04T17:50:07.659972249Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Sep 4 17:50:07.659992 containerd[1564]: time="2025-09-04T17:50:07.659987497Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Sep 4 17:50:07.660032 containerd[1564]: time="2025-09-04T17:50:07.659996264Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Sep 4 17:50:07.660032 containerd[1564]: time="2025-09-04T17:50:07.660005641Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Sep 4 17:50:07.660032 containerd[1564]: time="2025-09-04T17:50:07.660013797Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Sep 4 17:50:07.660032 containerd[1564]: time="2025-09-04T17:50:07.660023735Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Sep 4 17:50:07.660124 containerd[1564]: time="2025-09-04T17:50:07.660040377Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Sep 4 17:50:07.660124 containerd[1564]: time="2025-09-04T17:50:07.660058701Z" level=info msg="runtime interface created" Sep 4 17:50:07.660124 containerd[1564]: time="2025-09-04T17:50:07.660064482Z" level=info msg="created NRI interface" Sep 4 17:50:07.660124 containerd[1564]: time="2025-09-04T17:50:07.660072978Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Sep 4 17:50:07.660124 containerd[1564]: time="2025-09-04T17:50:07.660084239Z" level=info msg="Connect containerd service" Sep 4 17:50:07.660124 containerd[1564]: time="2025-09-04T17:50:07.660121960Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Sep 4 17:50:07.660984 containerd[1564]: time="2025-09-04T17:50:07.660952788Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Sep 4 17:50:07.675060 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Sep 4 17:50:07.678987 systemd[1]: Started getty@tty1.service - Getty on tty1. Sep 4 17:50:07.683231 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. Sep 4 17:50:07.684526 systemd[1]: Reached target getty.target - Login Prompts. Sep 4 17:50:07.721802 tar[1560]: linux-amd64/LICENSE Sep 4 17:50:07.722049 tar[1560]: linux-amd64/README.md Sep 4 17:50:07.745505 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. 
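containerd's only error during startup, "no network config found in /etc/cni/net.d", is the expected state of a node where no CNI plugin has been installed yet: the CRI plugin finds the directory empty and defers pod networking. A small illustrative check of the same condition (the directory path comes from the error message; the file suffixes are the commonly used CNI config extensions and are an assumption here):

    # Sketch: check whether any CNI network config exists, mirroring the condition
    # behind containerd's "no network config found in /etc/cni/net.d" message.
    from pathlib import Path

    CNI_CONF_DIR = Path("/etc/cni/net.d")          # directory named in the log message
    patterns = ("*.conf", "*.conflist", "*.json")  # commonly accepted config suffixes

    found = [p for pat in patterns for p in CNI_CONF_DIR.glob(pat)] if CNI_CONF_DIR.is_dir() else []
    if not found:
        print("no CNI network config found; CRI pod networking stays uninitialized")
    else:
        print("CNI configs:", *sorted(found))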
Sep 4 17:50:07.781969 containerd[1564]: time="2025-09-04T17:50:07.781907414Z" level=info msg="Start subscribing containerd event" Sep 4 17:50:07.782100 containerd[1564]: time="2025-09-04T17:50:07.781975181Z" level=info msg="Start recovering state" Sep 4 17:50:07.782100 containerd[1564]: time="2025-09-04T17:50:07.782128860Z" level=info msg="Start event monitor" Sep 4 17:50:07.782188 containerd[1564]: time="2025-09-04T17:50:07.782144088Z" level=info msg="Start cni network conf syncer for default" Sep 4 17:50:07.782188 containerd[1564]: time="2025-09-04T17:50:07.782161571Z" level=info msg="Start streaming server" Sep 4 17:50:07.782188 containerd[1564]: time="2025-09-04T17:50:07.782180016Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Sep 4 17:50:07.782188 containerd[1564]: time="2025-09-04T17:50:07.782187059Z" level=info msg="runtime interface starting up..." Sep 4 17:50:07.783084 containerd[1564]: time="2025-09-04T17:50:07.782194172Z" level=info msg="starting plugins..." Sep 4 17:50:07.783084 containerd[1564]: time="2025-09-04T17:50:07.782209932Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Sep 4 17:50:07.783084 containerd[1564]: time="2025-09-04T17:50:07.782216454Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Sep 4 17:50:07.783084 containerd[1564]: time="2025-09-04T17:50:07.782286836Z" level=info msg=serving... address=/run/containerd/containerd.sock Sep 4 17:50:07.783084 containerd[1564]: time="2025-09-04T17:50:07.782418132Z" level=info msg="containerd successfully booted in 0.148012s" Sep 4 17:50:07.782519 systemd[1]: Started containerd.service - containerd container runtime. Sep 4 17:50:08.675470 systemd-networkd[1492]: eth0: Gained IPv6LL Sep 4 17:50:08.679218 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Sep 4 17:50:08.681090 systemd[1]: Reached target network-online.target - Network is Online. Sep 4 17:50:08.683746 systemd[1]: Starting coreos-metadata.service - QEMU metadata agent... Sep 4 17:50:08.686266 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 4 17:50:08.688578 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Sep 4 17:50:08.739327 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Sep 4 17:50:08.741220 systemd[1]: coreos-metadata.service: Deactivated successfully. Sep 4 17:50:08.741496 systemd[1]: Finished coreos-metadata.service - QEMU metadata agent. Sep 4 17:50:08.743822 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Sep 4 17:50:10.076257 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 4 17:50:10.078066 systemd[1]: Reached target multi-user.target - Multi-User System. Sep 4 17:50:10.079447 systemd[1]: Startup finished in 3.430s (kernel) + 6.601s (initrd) + 5.372s (userspace) = 15.404s. 
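The "Startup finished" line adds the three boot phases; re-deriving the total (values copied from the log) shows the last digit differs only because each phase is already rounded to milliseconds:

    # Sketch: re-add the boot phases reported by systemd above.
    kernel, initrd, userspace = 3.430, 6.601, 5.372   # seconds, from the log line
    total = kernel + initrd + userspace
    print(f"{total:.3f}s")   # 15.403s; systemd prints 15.404s from the unrounded values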
Sep 4 17:50:10.083513 (kubelet)[1678]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 4 17:50:10.663128 kubelet[1678]: E0904 17:50:10.663042 1678 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 4 17:50:10.667412 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 4 17:50:10.667617 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 4 17:50:10.668048 systemd[1]: kubelet.service: Consumed 1.785s CPU time, 264.9M memory peak. Sep 4 17:50:12.047743 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Sep 4 17:50:12.049184 systemd[1]: Started sshd@0-10.0.0.60:22-10.0.0.1:45372.service - OpenSSH per-connection server daemon (10.0.0.1:45372). Sep 4 17:50:12.123436 sshd[1691]: Accepted publickey for core from 10.0.0.1 port 45372 ssh2: RSA SHA256:OKBSeTvmyBtsjC7tfD9ZMfOXcYBSWTNVjZSy1tvLNgs Sep 4 17:50:12.125664 sshd-session[1691]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 17:50:12.132525 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Sep 4 17:50:12.133666 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Sep 4 17:50:12.140628 systemd-logind[1545]: New session 1 of user core. Sep 4 17:50:12.155277 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Sep 4 17:50:12.158281 systemd[1]: Starting user@500.service - User Manager for UID 500... Sep 4 17:50:12.179679 (systemd)[1696]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Sep 4 17:50:12.182211 systemd-logind[1545]: New session c1 of user core. Sep 4 17:50:12.334877 systemd[1696]: Queued start job for default target default.target. Sep 4 17:50:12.353478 systemd[1696]: Created slice app.slice - User Application Slice. Sep 4 17:50:12.353508 systemd[1696]: Reached target paths.target - Paths. Sep 4 17:50:12.353566 systemd[1696]: Reached target timers.target - Timers. Sep 4 17:50:12.355254 systemd[1696]: Starting dbus.socket - D-Bus User Message Bus Socket... Sep 4 17:50:12.368747 systemd[1696]: Listening on dbus.socket - D-Bus User Message Bus Socket. Sep 4 17:50:12.368910 systemd[1696]: Reached target sockets.target - Sockets. Sep 4 17:50:12.368966 systemd[1696]: Reached target basic.target - Basic System. Sep 4 17:50:12.369024 systemd[1696]: Reached target default.target - Main User Target. Sep 4 17:50:12.369071 systemd[1696]: Startup finished in 179ms. Sep 4 17:50:12.369546 systemd[1]: Started user@500.service - User Manager for UID 500. Sep 4 17:50:12.371362 systemd[1]: Started session-1.scope - Session 1 of User core. Sep 4 17:50:12.435739 systemd[1]: Started sshd@1-10.0.0.60:22-10.0.0.1:45374.service - OpenSSH per-connection server daemon (10.0.0.1:45374). Sep 4 17:50:12.484134 sshd[1707]: Accepted publickey for core from 10.0.0.1 port 45374 ssh2: RSA SHA256:OKBSeTvmyBtsjC7tfD9ZMfOXcYBSWTNVjZSy1tvLNgs Sep 4 17:50:12.486312 sshd-session[1707]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 17:50:12.491267 systemd-logind[1545]: New session 2 of user core. 
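The kubelet exit above, "failed to load Kubelet config file /var/lib/kubelet/config.yaml ... no such file or directory", is normal for a node that has not yet been initialized or joined with kubeadm, which is what usually writes that file. Purely as an illustration, the sketch below reproduces the check and includes a minimal hypothetical KubeletConfiguration stub; the path and error text come from the log, while the stub contents are an assumption, not what kubeadm would generate for this node.

    # Sketch: reproduce the missing-config check behind the kubelet error above.
    # MINIMAL_STUB is a hypothetical example, not the file kubeadm generates.
    from pathlib import Path

    CONFIG = Path("/var/lib/kubelet/config.yaml")   # path named in the error message

    MINIMAL_STUB = """\
    apiVersion: kubelet.config.k8s.io/v1beta1
    kind: KubeletConfiguration
    cgroupDriver: systemd   # matches cgroupDriver="systemd" reported later in this log
    """

    if not CONFIG.exists():
        print(f"{CONFIG}: missing -> kubelet exits and systemd schedules a restart")
        # CONFIG.parent.mkdir(parents=True, exist_ok=True)
        # CONFIG.write_text(MINIMAL_STUB)           # only for experimentation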
Sep 4 17:50:12.509340 systemd[1]: Started session-2.scope - Session 2 of User core. Sep 4 17:50:12.563249 sshd[1710]: Connection closed by 10.0.0.1 port 45374 Sep 4 17:50:12.563685 sshd-session[1707]: pam_unix(sshd:session): session closed for user core Sep 4 17:50:12.575831 systemd[1]: sshd@1-10.0.0.60:22-10.0.0.1:45374.service: Deactivated successfully. Sep 4 17:50:12.577656 systemd[1]: session-2.scope: Deactivated successfully. Sep 4 17:50:12.578544 systemd-logind[1545]: Session 2 logged out. Waiting for processes to exit. Sep 4 17:50:12.581265 systemd[1]: Started sshd@2-10.0.0.60:22-10.0.0.1:45380.service - OpenSSH per-connection server daemon (10.0.0.1:45380). Sep 4 17:50:12.581867 systemd-logind[1545]: Removed session 2. Sep 4 17:50:12.644210 sshd[1716]: Accepted publickey for core from 10.0.0.1 port 45380 ssh2: RSA SHA256:OKBSeTvmyBtsjC7tfD9ZMfOXcYBSWTNVjZSy1tvLNgs Sep 4 17:50:12.645944 sshd-session[1716]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 17:50:12.650757 systemd-logind[1545]: New session 3 of user core. Sep 4 17:50:12.658249 systemd[1]: Started session-3.scope - Session 3 of User core. Sep 4 17:50:12.708496 sshd[1719]: Connection closed by 10.0.0.1 port 45380 Sep 4 17:50:12.708860 sshd-session[1716]: pam_unix(sshd:session): session closed for user core Sep 4 17:50:12.719178 systemd[1]: sshd@2-10.0.0.60:22-10.0.0.1:45380.service: Deactivated successfully. Sep 4 17:50:12.721416 systemd[1]: session-3.scope: Deactivated successfully. Sep 4 17:50:12.722149 systemd-logind[1545]: Session 3 logged out. Waiting for processes to exit. Sep 4 17:50:12.724755 systemd[1]: Started sshd@3-10.0.0.60:22-10.0.0.1:45392.service - OpenSSH per-connection server daemon (10.0.0.1:45392). Sep 4 17:50:12.725387 systemd-logind[1545]: Removed session 3. Sep 4 17:50:12.783632 sshd[1725]: Accepted publickey for core from 10.0.0.1 port 45392 ssh2: RSA SHA256:OKBSeTvmyBtsjC7tfD9ZMfOXcYBSWTNVjZSy1tvLNgs Sep 4 17:50:12.785017 sshd-session[1725]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 17:50:12.789443 systemd-logind[1545]: New session 4 of user core. Sep 4 17:50:12.798253 systemd[1]: Started session-4.scope - Session 4 of User core. Sep 4 17:50:12.855850 sshd[1729]: Connection closed by 10.0.0.1 port 45392 Sep 4 17:50:12.856376 sshd-session[1725]: pam_unix(sshd:session): session closed for user core Sep 4 17:50:12.871047 systemd[1]: sshd@3-10.0.0.60:22-10.0.0.1:45392.service: Deactivated successfully. Sep 4 17:50:12.873186 systemd[1]: session-4.scope: Deactivated successfully. Sep 4 17:50:12.873997 systemd-logind[1545]: Session 4 logged out. Waiting for processes to exit. Sep 4 17:50:12.877338 systemd[1]: Started sshd@4-10.0.0.60:22-10.0.0.1:45408.service - OpenSSH per-connection server daemon (10.0.0.1:45408). Sep 4 17:50:12.877986 systemd-logind[1545]: Removed session 4. Sep 4 17:50:12.939082 sshd[1735]: Accepted publickey for core from 10.0.0.1 port 45408 ssh2: RSA SHA256:OKBSeTvmyBtsjC7tfD9ZMfOXcYBSWTNVjZSy1tvLNgs Sep 4 17:50:12.940470 sshd-session[1735]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 17:50:12.944907 systemd-logind[1545]: New session 5 of user core. Sep 4 17:50:12.960242 systemd[1]: Started session-5.scope - Session 5 of User core. 
Sep 4 17:50:13.017628 sudo[1739]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Sep 4 17:50:13.017947 sudo[1739]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 4 17:50:13.037633 sudo[1739]: pam_unix(sudo:session): session closed for user root Sep 4 17:50:13.039459 sshd[1738]: Connection closed by 10.0.0.1 port 45408 Sep 4 17:50:13.039856 sshd-session[1735]: pam_unix(sshd:session): session closed for user core Sep 4 17:50:13.048724 systemd[1]: sshd@4-10.0.0.60:22-10.0.0.1:45408.service: Deactivated successfully. Sep 4 17:50:13.050683 systemd[1]: session-5.scope: Deactivated successfully. Sep 4 17:50:13.051444 systemd-logind[1545]: Session 5 logged out. Waiting for processes to exit. Sep 4 17:50:13.054387 systemd[1]: Started sshd@5-10.0.0.60:22-10.0.0.1:45424.service - OpenSSH per-connection server daemon (10.0.0.1:45424). Sep 4 17:50:13.055031 systemd-logind[1545]: Removed session 5. Sep 4 17:50:13.106879 sshd[1745]: Accepted publickey for core from 10.0.0.1 port 45424 ssh2: RSA SHA256:OKBSeTvmyBtsjC7tfD9ZMfOXcYBSWTNVjZSy1tvLNgs Sep 4 17:50:13.108131 sshd-session[1745]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 17:50:13.112454 systemd-logind[1545]: New session 6 of user core. Sep 4 17:50:13.122262 systemd[1]: Started session-6.scope - Session 6 of User core. Sep 4 17:50:13.175408 sudo[1750]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Sep 4 17:50:13.175725 sudo[1750]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 4 17:50:13.182540 sudo[1750]: pam_unix(sudo:session): session closed for user root Sep 4 17:50:13.189476 sudo[1749]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Sep 4 17:50:13.189800 sudo[1749]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 4 17:50:13.199914 systemd[1]: Starting audit-rules.service - Load Audit Rules... Sep 4 17:50:13.258709 augenrules[1772]: No rules Sep 4 17:50:13.260611 systemd[1]: audit-rules.service: Deactivated successfully. Sep 4 17:50:13.260919 systemd[1]: Finished audit-rules.service - Load Audit Rules. Sep 4 17:50:13.262025 sudo[1749]: pam_unix(sudo:session): session closed for user root Sep 4 17:50:13.263562 sshd[1748]: Connection closed by 10.0.0.1 port 45424 Sep 4 17:50:13.263894 sshd-session[1745]: pam_unix(sshd:session): session closed for user core Sep 4 17:50:13.277132 systemd[1]: sshd@5-10.0.0.60:22-10.0.0.1:45424.service: Deactivated successfully. Sep 4 17:50:13.279083 systemd[1]: session-6.scope: Deactivated successfully. Sep 4 17:50:13.279870 systemd-logind[1545]: Session 6 logged out. Waiting for processes to exit. Sep 4 17:50:13.283004 systemd[1]: Started sshd@6-10.0.0.60:22-10.0.0.1:45428.service - OpenSSH per-connection server daemon (10.0.0.1:45428). Sep 4 17:50:13.283750 systemd-logind[1545]: Removed session 6. Sep 4 17:50:13.334838 sshd[1781]: Accepted publickey for core from 10.0.0.1 port 45428 ssh2: RSA SHA256:OKBSeTvmyBtsjC7tfD9ZMfOXcYBSWTNVjZSy1tvLNgs Sep 4 17:50:13.335984 sshd-session[1781]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 17:50:13.340485 systemd-logind[1545]: New session 7 of user core. Sep 4 17:50:13.354292 systemd[1]: Started session-7.scope - Session 7 of User core. 
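The stretch above is a run of short SSH logins for user core, sessions 1 through 7, all accepted with the same public key and closed again, interleaved with a few sudo invocations. An illustrative parser for exactly these journal lines (the regular expressions are assumptions tailored to the format shown here, not a general sshd log parser):

    # Sketch: summarize the sshd "Accepted publickey" / "Connection closed" lines above.
    import re

    accepted = re.compile(r"Accepted publickey for (\S+) from (\S+) port (\d+)")
    closed   = re.compile(r"Connection closed by (\S+) port (\d+)")

    def summarize(lines):
        logins = 0
        for line in lines:
            if m := accepted.search(line):
                logins += 1
                print(f"login : user={m.group(1)} peer={m.group(2)}:{m.group(3)}")
            elif m := closed.search(line):
                print(f"close : peer={m.group(1)}:{m.group(2)}")
        print(f"{logins} sessions accepted")

    summarize([
        "sshd[1716]: Accepted publickey for core from 10.0.0.1 port 45380 ssh2: RSA SHA256:...",
        "sshd[1719]: Connection closed by 10.0.0.1 port 45380",
    ])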
Sep 4 17:50:13.409167 sudo[1785]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Sep 4 17:50:13.409604 sudo[1785]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 4 17:50:13.715766 systemd[1]: Starting docker.service - Docker Application Container Engine... Sep 4 17:50:13.733418 (dockerd)[1806]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Sep 4 17:50:13.960035 dockerd[1806]: time="2025-09-04T17:50:13.959972388Z" level=info msg="Starting up" Sep 4 17:50:13.960758 dockerd[1806]: time="2025-09-04T17:50:13.960738395Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Sep 4 17:50:13.974485 dockerd[1806]: time="2025-09-04T17:50:13.974410690Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s Sep 4 17:50:14.333681 dockerd[1806]: time="2025-09-04T17:50:14.333559982Z" level=info msg="Loading containers: start." Sep 4 17:50:14.343129 kernel: Initializing XFRM netlink socket Sep 4 17:50:14.620140 systemd-networkd[1492]: docker0: Link UP Sep 4 17:50:14.625066 dockerd[1806]: time="2025-09-04T17:50:14.625018585Z" level=info msg="Loading containers: done." Sep 4 17:50:14.643456 dockerd[1806]: time="2025-09-04T17:50:14.643356802Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Sep 4 17:50:14.643702 dockerd[1806]: time="2025-09-04T17:50:14.643484612Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Sep 4 17:50:14.643702 dockerd[1806]: time="2025-09-04T17:50:14.643636727Z" level=info msg="Initializing buildkit" Sep 4 17:50:14.676251 dockerd[1806]: time="2025-09-04T17:50:14.676185890Z" level=info msg="Completed buildkit initialization" Sep 4 17:50:14.681660 dockerd[1806]: time="2025-09-04T17:50:14.681167174Z" level=info msg="Daemon has completed initialization" Sep 4 17:50:14.681660 dockerd[1806]: time="2025-09-04T17:50:14.681273202Z" level=info msg="API listen on /run/docker.sock" Sep 4 17:50:14.681938 systemd[1]: Started docker.service - Docker Application Container Engine. Sep 4 17:50:15.437800 containerd[1564]: time="2025-09-04T17:50:15.437753784Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.12\"" Sep 4 17:50:15.982265 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2045674899.mount: Deactivated successfully. 
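Once dockerd reports "API listen on /run/docker.sock" above, the Engine API is reachable on that Unix socket. A stdlib-only illustration of querying it follows; /version is a standard Engine API endpoint, but this is only a sketch: timeouts, error handling, and API-version pinning are omitted, and it needs permission to read the socket.

    # Sketch: query the Docker Engine API on the Unix socket named in the log.
    import http.client
    import json
    import socket

    class UnixHTTPConnection(http.client.HTTPConnection):
        """HTTPConnection variant that connects to an AF_UNIX socket instead of TCP."""
        def __init__(self, socket_path):
            super().__init__("localhost")
            self.socket_path = socket_path

        def connect(self):
            self.sock = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
            self.sock.connect(self.socket_path)

    conn = UnixHTTPConnection("/run/docker.sock")
    conn.request("GET", "/version")
    info = json.loads(conn.getresponse().read())
    print(info.get("Version"), info.get("ApiVersion"))   # the daemon logs version=28.0.4 above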
Sep 4 17:50:16.956521 containerd[1564]: time="2025-09-04T17:50:16.956456576Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.31.12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 17:50:16.957299 containerd[1564]: time="2025-09-04T17:50:16.957242691Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.31.12: active requests=0, bytes read=28079631" Sep 4 17:50:16.958408 containerd[1564]: time="2025-09-04T17:50:16.958359956Z" level=info msg="ImageCreate event name:\"sha256:b1963c5b49c1722b8f408deaf83aafca7f48f47fed0ed14e5c10e93cc55974a7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 17:50:16.960723 containerd[1564]: time="2025-09-04T17:50:16.960683605Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:e9011c3bee8c06ecabd7816e119dca4e448c92f7a78acd891de3d2db1dc6c234\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 17:50:16.961684 containerd[1564]: time="2025-09-04T17:50:16.961650739Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.31.12\" with image id \"sha256:b1963c5b49c1722b8f408deaf83aafca7f48f47fed0ed14e5c10e93cc55974a7\", repo tag \"registry.k8s.io/kube-apiserver:v1.31.12\", repo digest \"registry.k8s.io/kube-apiserver@sha256:e9011c3bee8c06ecabd7816e119dca4e448c92f7a78acd891de3d2db1dc6c234\", size \"28076431\" in 1.523852231s" Sep 4 17:50:16.961684 containerd[1564]: time="2025-09-04T17:50:16.961683791Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.12\" returns image reference \"sha256:b1963c5b49c1722b8f408deaf83aafca7f48f47fed0ed14e5c10e93cc55974a7\"" Sep 4 17:50:16.962440 containerd[1564]: time="2025-09-04T17:50:16.962239333Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.12\"" Sep 4 17:50:18.129150 containerd[1564]: time="2025-09-04T17:50:18.129062450Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.31.12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 17:50:18.129919 containerd[1564]: time="2025-09-04T17:50:18.129885974Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.31.12: active requests=0, bytes read=24714681" Sep 4 17:50:18.130964 containerd[1564]: time="2025-09-04T17:50:18.130935603Z" level=info msg="ImageCreate event name:\"sha256:200c1a99a6f2b9d3b0a6e9b7362663513589341e0e58bc3b953a373efa735dfd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 17:50:18.133407 containerd[1564]: time="2025-09-04T17:50:18.133371061Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:d2862f94d87320267fddbd55db26556a267aa802e51d6b60f25786b4c428afc8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 17:50:18.134247 containerd[1564]: time="2025-09-04T17:50:18.134198203Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.31.12\" with image id \"sha256:200c1a99a6f2b9d3b0a6e9b7362663513589341e0e58bc3b953a373efa735dfd\", repo tag \"registry.k8s.io/kube-controller-manager:v1.31.12\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:d2862f94d87320267fddbd55db26556a267aa802e51d6b60f25786b4c428afc8\", size \"26317875\" in 1.171921941s" Sep 4 17:50:18.134247 containerd[1564]: time="2025-09-04T17:50:18.134235352Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.12\" returns image reference \"sha256:200c1a99a6f2b9d3b0a6e9b7362663513589341e0e58bc3b953a373efa735dfd\"" Sep 4 17:50:18.134659 
containerd[1564]: time="2025-09-04T17:50:18.134629041Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.12\"" Sep 4 17:50:19.492784 containerd[1564]: time="2025-09-04T17:50:19.492704893Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.31.12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 17:50:19.493396 containerd[1564]: time="2025-09-04T17:50:19.493366244Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.31.12: active requests=0, bytes read=18782427" Sep 4 17:50:19.494422 containerd[1564]: time="2025-09-04T17:50:19.494391857Z" level=info msg="ImageCreate event name:\"sha256:bcdd9599681a9460a5539177a986dbdaf880ac56eeb117ab94adb8f37889efba\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 17:50:19.496845 containerd[1564]: time="2025-09-04T17:50:19.496772703Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:152943b7e30244f4415fd0a5860a2dccd91660fe983d30a28a10edb0cc8f6756\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 17:50:19.497924 containerd[1564]: time="2025-09-04T17:50:19.497893555Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.31.12\" with image id \"sha256:bcdd9599681a9460a5539177a986dbdaf880ac56eeb117ab94adb8f37889efba\", repo tag \"registry.k8s.io/kube-scheduler:v1.31.12\", repo digest \"registry.k8s.io/kube-scheduler@sha256:152943b7e30244f4415fd0a5860a2dccd91660fe983d30a28a10edb0cc8f6756\", size \"20385639\" in 1.363232034s" Sep 4 17:50:19.497985 containerd[1564]: time="2025-09-04T17:50:19.497940093Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.12\" returns image reference \"sha256:bcdd9599681a9460a5539177a986dbdaf880ac56eeb117ab94adb8f37889efba\"" Sep 4 17:50:19.498609 containerd[1564]: time="2025-09-04T17:50:19.498569594Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.12\"" Sep 4 17:50:20.600621 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3960365120.mount: Deactivated successfully. Sep 4 17:50:20.918073 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Sep 4 17:50:20.920448 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 4 17:50:21.661915 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
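Each "Pulled image ... in ..." line above pairs the image size in bytes with the wall-clock pull time, so the effective rate falls straight out. Using the kube-scheduler figures from the log (the other pulls work the same way):

    # Sketch: effective pull rate for the kube-scheduler image reported above.
    size_bytes = 20_385_639     # size "20385639" in the log
    seconds    = 1.363232034    # "in 1.363232034s"
    print(f"{size_bytes / seconds / 2**20:.1f} MiB/s")   # roughly 14.3 MiB/s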
Sep 4 17:50:21.689705 (kubelet)[2103]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 4 17:50:21.799905 containerd[1564]: time="2025-09-04T17:50:21.799811666Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.31.12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 17:50:21.800721 containerd[1564]: time="2025-09-04T17:50:21.800663083Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.31.12: active requests=0, bytes read=30384255" Sep 4 17:50:21.801741 containerd[1564]: time="2025-09-04T17:50:21.801711960Z" level=info msg="ImageCreate event name:\"sha256:507cc52f5f78c0cff25e904c76c18e6bfc90982e9cc2aa4dcb19033f21c3f679\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 17:50:21.805022 containerd[1564]: time="2025-09-04T17:50:21.804153730Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:90aa6b5f4065937521ff8438bc705317485d0be3f8b00a07145e697d92cc2cc6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 17:50:21.805022 containerd[1564]: time="2025-09-04T17:50:21.804575862Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.31.12\" with image id \"sha256:507cc52f5f78c0cff25e904c76c18e6bfc90982e9cc2aa4dcb19033f21c3f679\", repo tag \"registry.k8s.io/kube-proxy:v1.31.12\", repo digest \"registry.k8s.io/kube-proxy@sha256:90aa6b5f4065937521ff8438bc705317485d0be3f8b00a07145e697d92cc2cc6\", size \"30383274\" in 2.305970532s" Sep 4 17:50:21.805022 containerd[1564]: time="2025-09-04T17:50:21.804606810Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.12\" returns image reference \"sha256:507cc52f5f78c0cff25e904c76c18e6bfc90982e9cc2aa4dcb19033f21c3f679\"" Sep 4 17:50:21.805443 containerd[1564]: time="2025-09-04T17:50:21.805402613Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\"" Sep 4 17:50:21.832183 kubelet[2103]: E0904 17:50:21.832084 2103 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 4 17:50:21.838973 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 4 17:50:21.839240 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 4 17:50:21.839710 systemd[1]: kubelet.service: Consumed 323ms CPU time, 110.9M memory peak. Sep 4 17:50:22.483519 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1237133810.mount: Deactivated successfully. 
Sep 4 17:50:23.174196 containerd[1564]: time="2025-09-04T17:50:23.174094868Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 17:50:23.174886 containerd[1564]: time="2025-09-04T17:50:23.174814588Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=18565241" Sep 4 17:50:23.175934 containerd[1564]: time="2025-09-04T17:50:23.175896498Z" level=info msg="ImageCreate event name:\"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 17:50:23.178506 containerd[1564]: time="2025-09-04T17:50:23.178459205Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 17:50:23.179295 containerd[1564]: time="2025-09-04T17:50:23.179263593Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"18562039\" in 1.373830443s" Sep 4 17:50:23.179338 containerd[1564]: time="2025-09-04T17:50:23.179297267Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\"" Sep 4 17:50:23.179943 containerd[1564]: time="2025-09-04T17:50:23.179760245Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Sep 4 17:50:23.773188 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1205902630.mount: Deactivated successfully. 
Sep 4 17:50:23.779246 containerd[1564]: time="2025-09-04T17:50:23.779192499Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 4 17:50:23.779922 containerd[1564]: time="2025-09-04T17:50:23.779815838Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321138" Sep 4 17:50:23.781023 containerd[1564]: time="2025-09-04T17:50:23.780988477Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 4 17:50:23.782888 containerd[1564]: time="2025-09-04T17:50:23.782856191Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 4 17:50:23.783409 containerd[1564]: time="2025-09-04T17:50:23.783360286Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 603.570435ms" Sep 4 17:50:23.783409 containerd[1564]: time="2025-09-04T17:50:23.783405010Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\"" Sep 4 17:50:23.783960 containerd[1564]: time="2025-09-04T17:50:23.783911400Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\"" Sep 4 17:50:24.387163 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2238759680.mount: Deactivated successfully. 
Sep 4 17:50:26.661312 containerd[1564]: time="2025-09-04T17:50:26.661233133Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.15-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 17:50:26.661901 containerd[1564]: time="2025-09-04T17:50:26.661838679Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.15-0: active requests=0, bytes read=56910709" Sep 4 17:50:26.663070 containerd[1564]: time="2025-09-04T17:50:26.663012080Z" level=info msg="ImageCreate event name:\"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 17:50:26.665743 containerd[1564]: time="2025-09-04T17:50:26.665687398Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 17:50:26.666801 containerd[1564]: time="2025-09-04T17:50:26.666748969Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.15-0\" with image id \"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\", repo tag \"registry.k8s.io/etcd:3.5.15-0\", repo digest \"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\", size \"56909194\" in 2.882799087s" Sep 4 17:50:26.666801 containerd[1564]: time="2025-09-04T17:50:26.666791028Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\" returns image reference \"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\"" Sep 4 17:50:29.071321 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 4 17:50:29.071554 systemd[1]: kubelet.service: Consumed 323ms CPU time, 110.9M memory peak. Sep 4 17:50:29.074280 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 4 17:50:29.385930 systemd[1]: Reload requested from client PID 2252 ('systemctl') (unit session-7.scope)... Sep 4 17:50:29.385950 systemd[1]: Reloading... Sep 4 17:50:29.488159 zram_generator::config[2298]: No configuration found. Sep 4 17:50:29.794119 systemd[1]: Reloading finished in 407 ms. Sep 4 17:50:29.857843 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Sep 4 17:50:29.857946 systemd[1]: kubelet.service: Failed with result 'signal'. Sep 4 17:50:29.858326 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 4 17:50:29.858372 systemd[1]: kubelet.service: Consumed 233ms CPU time, 98.2M memory peak. Sep 4 17:50:29.860289 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 4 17:50:30.039082 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 4 17:50:30.044531 (kubelet)[2343]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Sep 4 17:50:30.091863 kubelet[2343]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 4 17:50:30.091863 kubelet[2343]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Sep 4 17:50:30.091863 kubelet[2343]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. 
See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 4 17:50:30.092401 kubelet[2343]: I0904 17:50:30.091916 2343 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Sep 4 17:50:30.686950 kubelet[2343]: I0904 17:50:30.686861 2343 server.go:491] "Kubelet version" kubeletVersion="v1.31.8" Sep 4 17:50:30.686950 kubelet[2343]: I0904 17:50:30.686916 2343 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Sep 4 17:50:30.687360 kubelet[2343]: I0904 17:50:30.687316 2343 server.go:934] "Client rotation is on, will bootstrap in background" Sep 4 17:50:30.712587 kubelet[2343]: E0904 17:50:30.712516 2343 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://10.0.0.60:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.0.0.60:6443: connect: connection refused" logger="UnhandledError" Sep 4 17:50:30.716711 kubelet[2343]: I0904 17:50:30.716667 2343 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Sep 4 17:50:30.725287 kubelet[2343]: I0904 17:50:30.725246 2343 server.go:1431] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Sep 4 17:50:30.734789 kubelet[2343]: I0904 17:50:30.734726 2343 server.go:749] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" Sep 4 17:50:30.735586 kubelet[2343]: I0904 17:50:30.735547 2343 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Sep 4 17:50:30.735825 kubelet[2343]: I0904 17:50:30.735753 2343 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Sep 4 17:50:30.736095 kubelet[2343]: I0904 17:50:30.735805 2343 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Sep 4 17:50:30.736391 kubelet[2343]: I0904 17:50:30.736144 2343 topology_manager.go:138] "Creating topology manager with none policy" Sep 4 17:50:30.736391 kubelet[2343]: I0904 17:50:30.736169 2343 container_manager_linux.go:300] "Creating device plugin manager" Sep 4 17:50:30.736479 kubelet[2343]: I0904 17:50:30.736439 2343 state_mem.go:36] "Initialized new in-memory state store" Sep 4 17:50:30.738652 kubelet[2343]: I0904 17:50:30.738578 2343 kubelet.go:408] "Attempting to sync node with API server" Sep 4 17:50:30.738652 kubelet[2343]: I0904 17:50:30.738622 2343 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests" Sep 4 17:50:30.738875 kubelet[2343]: I0904 17:50:30.738678 2343 kubelet.go:314] "Adding apiserver pod source" Sep 4 17:50:30.738875 kubelet[2343]: I0904 17:50:30.738719 2343 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Sep 4 17:50:30.742881 kubelet[2343]: I0904 17:50:30.742845 2343 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1" Sep 4 17:50:30.743497 kubelet[2343]: I0904 17:50:30.743400 2343 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Sep 4 17:50:30.744951 kubelet[2343]: W0904 17:50:30.744623 2343 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. 
Sep 4 17:50:30.747560 kubelet[2343]: I0904 17:50:30.747490 2343 server.go:1274] "Started kubelet" Sep 4 17:50:30.749083 kubelet[2343]: I0904 17:50:30.748419 2343 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Sep 4 17:50:30.753469 kubelet[2343]: W0904 17:50:30.753325 2343 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.0.0.60:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 10.0.0.60:6443: connect: connection refused Sep 4 17:50:30.755534 kubelet[2343]: W0904 17:50:30.754803 2343 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.0.0.60:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.0.0.60:6443: connect: connection refused Sep 4 17:50:30.755534 kubelet[2343]: E0904 17:50:30.754898 2343 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://10.0.0.60:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.0.60:6443: connect: connection refused" logger="UnhandledError" Sep 4 17:50:30.756087 kubelet[2343]: I0904 17:50:30.756024 2343 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Sep 4 17:50:30.756524 kubelet[2343]: I0904 17:50:30.756496 2343 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Sep 4 17:50:30.756684 kubelet[2343]: E0904 17:50:30.756650 2343 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://10.0.0.60:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 10.0.0.60:6443: connect: connection refused" logger="UnhandledError" Sep 4 17:50:30.757154 kubelet[2343]: E0904 17:50:30.755088 2343 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.0.0.60:6443/api/v1/namespaces/default/events\": dial tcp 10.0.0.60:6443: connect: connection refused" event="&Event{ObjectMeta:{localhost.186225ad45229c48 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:localhost,UID:localhost,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:localhost,},FirstTimestamp:2025-09-04 17:50:30.747380808 +0000 UTC m=+0.696598931,LastTimestamp:2025-09-04 17:50:30.747380808 +0000 UTC m=+0.696598931,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:localhost,}" Sep 4 17:50:30.757986 kubelet[2343]: I0904 17:50:30.757874 2343 server.go:449] "Adding debug handlers to kubelet server" Sep 4 17:50:30.758772 kubelet[2343]: I0904 17:50:30.758599 2343 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Sep 4 17:50:30.759312 kubelet[2343]: I0904 17:50:30.759273 2343 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Sep 4 17:50:30.760409 kubelet[2343]: E0904 17:50:30.760388 2343 kubelet.go:1478] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Sep 4 17:50:30.760773 kubelet[2343]: I0904 17:50:30.760755 2343 volume_manager.go:289] "Starting Kubelet Volume Manager" Sep 4 17:50:30.761178 kubelet[2343]: I0904 17:50:30.761160 2343 desired_state_of_world_populator.go:147] "Desired state populator starts to run" Sep 4 17:50:30.761347 kubelet[2343]: I0904 17:50:30.761336 2343 reconciler.go:26] "Reconciler: start to sync state" Sep 4 17:50:30.761851 kubelet[2343]: I0904 17:50:30.761820 2343 factory.go:221] Registration of the systemd container factory successfully Sep 4 17:50:30.762007 kubelet[2343]: I0904 17:50:30.761970 2343 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Sep 4 17:50:30.762097 kubelet[2343]: W0904 17:50:30.761987 2343 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.0.0.60:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.0.0.60:6443: connect: connection refused Sep 4 17:50:30.762853 kubelet[2343]: E0904 17:50:30.762208 2343 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://10.0.0.60:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.0.60:6443: connect: connection refused" logger="UnhandledError" Sep 4 17:50:30.762853 kubelet[2343]: E0904 17:50:30.762431 2343 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 4 17:50:30.762853 kubelet[2343]: E0904 17:50:30.762491 2343 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.60:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.60:6443: connect: connection refused" interval="200ms" Sep 4 17:50:30.764002 kubelet[2343]: I0904 17:50:30.763972 2343 factory.go:221] Registration of the containerd container factory successfully Sep 4 17:50:30.779668 kubelet[2343]: I0904 17:50:30.779394 2343 cpu_manager.go:214] "Starting CPU manager" policy="none" Sep 4 17:50:30.779668 kubelet[2343]: I0904 17:50:30.779413 2343 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Sep 4 17:50:30.779668 kubelet[2343]: I0904 17:50:30.779449 2343 state_mem.go:36] "Initialized new in-memory state store" Sep 4 17:50:30.782329 kubelet[2343]: I0904 17:50:30.782265 2343 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Sep 4 17:50:30.784394 kubelet[2343]: I0904 17:50:30.784345 2343 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Sep 4 17:50:30.784533 kubelet[2343]: I0904 17:50:30.784423 2343 status_manager.go:217] "Starting to sync pod status with apiserver" Sep 4 17:50:30.784533 kubelet[2343]: I0904 17:50:30.784473 2343 kubelet.go:2321] "Starting kubelet main sync loop" Sep 4 17:50:30.784589 kubelet[2343]: E0904 17:50:30.784544 2343 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 4 17:50:30.862758 kubelet[2343]: E0904 17:50:30.862673 2343 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 4 17:50:30.885067 kubelet[2343]: E0904 17:50:30.884985 2343 kubelet.go:2345] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Sep 4 17:50:30.963537 kubelet[2343]: E0904 17:50:30.963335 2343 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 4 17:50:30.964011 kubelet[2343]: E0904 17:50:30.963951 2343 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.60:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.60:6443: connect: connection refused" interval="400ms" Sep 4 17:50:31.063879 kubelet[2343]: E0904 17:50:31.063823 2343 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 4 17:50:31.086132 kubelet[2343]: E0904 17:50:31.086022 2343 kubelet.go:2345] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Sep 4 17:50:31.164561 kubelet[2343]: E0904 17:50:31.164485 2343 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 4 17:50:31.265593 kubelet[2343]: E0904 17:50:31.265433 2343 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 4 17:50:31.365562 kubelet[2343]: E0904 17:50:31.365483 2343 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.60:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.60:6443: connect: connection refused" interval="800ms" Sep 4 17:50:31.365562 kubelet[2343]: E0904 17:50:31.365537 2343 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 4 17:50:31.418417 kubelet[2343]: W0904 17:50:31.418305 2343 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.0.0.60:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.0.0.60:6443: connect: connection refused Sep 4 17:50:31.418417 kubelet[2343]: E0904 17:50:31.418411 2343 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://10.0.0.60:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.0.60:6443: connect: connection refused" logger="UnhandledError" Sep 4 17:50:31.420265 kubelet[2343]: I0904 17:50:31.420219 2343 policy_none.go:49] "None policy: Start" Sep 4 17:50:31.421444 kubelet[2343]: I0904 17:50:31.421414 2343 memory_manager.go:170] "Starting memorymanager" policy="None" Sep 4 17:50:31.421444 kubelet[2343]: I0904 17:50:31.421440 2343 state_mem.go:35] "Initializing new in-memory state 
store" Sep 4 17:50:31.430376 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Sep 4 17:50:31.462616 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Sep 4 17:50:31.466001 kubelet[2343]: E0904 17:50:31.465707 2343 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 4 17:50:31.467394 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Sep 4 17:50:31.486907 kubelet[2343]: E0904 17:50:31.486833 2343 kubelet.go:2345] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Sep 4 17:50:31.489889 kubelet[2343]: I0904 17:50:31.489817 2343 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Sep 4 17:50:31.490279 kubelet[2343]: I0904 17:50:31.490259 2343 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 4 17:50:31.490340 kubelet[2343]: I0904 17:50:31.490291 2343 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 4 17:50:31.490938 kubelet[2343]: I0904 17:50:31.490690 2343 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 4 17:50:31.492598 kubelet[2343]: E0904 17:50:31.492574 2343 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"localhost\" not found" Sep 4 17:50:31.592644 kubelet[2343]: I0904 17:50:31.592501 2343 kubelet_node_status.go:72] "Attempting to register node" node="localhost" Sep 4 17:50:31.592932 kubelet[2343]: E0904 17:50:31.592906 2343 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://10.0.0.60:6443/api/v1/nodes\": dial tcp 10.0.0.60:6443: connect: connection refused" node="localhost" Sep 4 17:50:31.794737 kubelet[2343]: I0904 17:50:31.794683 2343 kubelet_node_status.go:72] "Attempting to register node" node="localhost" Sep 4 17:50:31.795233 kubelet[2343]: E0904 17:50:31.795179 2343 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://10.0.0.60:6443/api/v1/nodes\": dial tcp 10.0.0.60:6443: connect: connection refused" node="localhost" Sep 4 17:50:31.946572 kubelet[2343]: W0904 17:50:31.946385 2343 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.0.0.60:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.0.0.60:6443: connect: connection refused Sep 4 17:50:31.946572 kubelet[2343]: E0904 17:50:31.946452 2343 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://10.0.0.60:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.0.60:6443: connect: connection refused" logger="UnhandledError" Sep 4 17:50:32.166483 kubelet[2343]: E0904 17:50:32.166394 2343 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.60:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.60:6443: connect: connection refused" interval="1.6s" Sep 4 17:50:32.168239 kubelet[2343]: W0904 17:50:32.168199 2343 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get 
"https://10.0.0.60:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.0.0.60:6443: connect: connection refused Sep 4 17:50:32.168299 kubelet[2343]: E0904 17:50:32.168241 2343 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://10.0.0.60:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.0.60:6443: connect: connection refused" logger="UnhandledError" Sep 4 17:50:32.197939 kubelet[2343]: I0904 17:50:32.197778 2343 kubelet_node_status.go:72] "Attempting to register node" node="localhost" Sep 4 17:50:32.198184 kubelet[2343]: E0904 17:50:32.198128 2343 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://10.0.0.60:6443/api/v1/nodes\": dial tcp 10.0.0.60:6443: connect: connection refused" node="localhost" Sep 4 17:50:32.305879 systemd[1]: Created slice kubepods-burstable-pod287ffdc808009eb4b157bbcec5660198.slice - libcontainer container kubepods-burstable-pod287ffdc808009eb4b157bbcec5660198.slice. Sep 4 17:50:32.324456 systemd[1]: Created slice kubepods-burstable-podfec3f691a145cb26ff55e4af388500b7.slice - libcontainer container kubepods-burstable-podfec3f691a145cb26ff55e4af388500b7.slice. Sep 4 17:50:32.343273 kubelet[2343]: W0904 17:50:32.343204 2343 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.0.0.60:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 10.0.0.60:6443: connect: connection refused Sep 4 17:50:32.343273 kubelet[2343]: E0904 17:50:32.343280 2343 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://10.0.0.60:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 10.0.0.60:6443: connect: connection refused" logger="UnhandledError" Sep 4 17:50:32.351311 systemd[1]: Created slice kubepods-burstable-pod5dc878868de11c6196259ae42039f4ff.slice - libcontainer container kubepods-burstable-pod5dc878868de11c6196259ae42039f4ff.slice. 
Sep 4 17:50:32.370972 kubelet[2343]: I0904 17:50:32.370899 2343 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/fec3f691a145cb26ff55e4af388500b7-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"fec3f691a145cb26ff55e4af388500b7\") " pod="kube-system/kube-controller-manager-localhost" Sep 4 17:50:32.370972 kubelet[2343]: I0904 17:50:32.370955 2343 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/287ffdc808009eb4b157bbcec5660198-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"287ffdc808009eb4b157bbcec5660198\") " pod="kube-system/kube-apiserver-localhost" Sep 4 17:50:32.370972 kubelet[2343]: I0904 17:50:32.370982 2343 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/287ffdc808009eb4b157bbcec5660198-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"287ffdc808009eb4b157bbcec5660198\") " pod="kube-system/kube-apiserver-localhost" Sep 4 17:50:32.371215 kubelet[2343]: I0904 17:50:32.371007 2343 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/fec3f691a145cb26ff55e4af388500b7-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"fec3f691a145cb26ff55e4af388500b7\") " pod="kube-system/kube-controller-manager-localhost" Sep 4 17:50:32.371215 kubelet[2343]: I0904 17:50:32.371062 2343 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/fec3f691a145cb26ff55e4af388500b7-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"fec3f691a145cb26ff55e4af388500b7\") " pod="kube-system/kube-controller-manager-localhost" Sep 4 17:50:32.371215 kubelet[2343]: I0904 17:50:32.371189 2343 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/fec3f691a145cb26ff55e4af388500b7-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"fec3f691a145cb26ff55e4af388500b7\") " pod="kube-system/kube-controller-manager-localhost" Sep 4 17:50:32.371296 kubelet[2343]: I0904 17:50:32.371237 2343 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/fec3f691a145cb26ff55e4af388500b7-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"fec3f691a145cb26ff55e4af388500b7\") " pod="kube-system/kube-controller-manager-localhost" Sep 4 17:50:32.371296 kubelet[2343]: I0904 17:50:32.371277 2343 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/5dc878868de11c6196259ae42039f4ff-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"5dc878868de11c6196259ae42039f4ff\") " pod="kube-system/kube-scheduler-localhost" Sep 4 17:50:32.371344 kubelet[2343]: I0904 17:50:32.371306 2343 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/287ffdc808009eb4b157bbcec5660198-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"287ffdc808009eb4b157bbcec5660198\") " 
pod="kube-system/kube-apiserver-localhost" Sep 4 17:50:32.427639 kubelet[2343]: W0904 17:50:32.427557 2343 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.0.0.60:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.0.0.60:6443: connect: connection refused Sep 4 17:50:32.427639 kubelet[2343]: E0904 17:50:32.427638 2343 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://10.0.0.60:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.0.60:6443: connect: connection refused" logger="UnhandledError" Sep 4 17:50:32.621655 containerd[1564]: time="2025-09-04T17:50:32.621470114Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:287ffdc808009eb4b157bbcec5660198,Namespace:kube-system,Attempt:0,}" Sep 4 17:50:32.648586 containerd[1564]: time="2025-09-04T17:50:32.648510384Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:fec3f691a145cb26ff55e4af388500b7,Namespace:kube-system,Attempt:0,}" Sep 4 17:50:32.655245 containerd[1564]: time="2025-09-04T17:50:32.655168783Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:5dc878868de11c6196259ae42039f4ff,Namespace:kube-system,Attempt:0,}" Sep 4 17:50:32.770015 kubelet[2343]: E0904 17:50:32.769952 2343 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://10.0.0.60:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.0.0.60:6443: connect: connection refused" logger="UnhandledError" Sep 4 17:50:32.903526 containerd[1564]: time="2025-09-04T17:50:32.903400655Z" level=info msg="connecting to shim 6d97738bd49563791717dfa8825c7c56eaf272576de0ae131acad41ad9256052" address="unix:///run/containerd/s/da0fcb377563b691bf1551100769e3a2585c1bbfd4e612a6edb47b31390f2f2c" namespace=k8s.io protocol=ttrpc version=3 Sep 4 17:50:32.912255 containerd[1564]: time="2025-09-04T17:50:32.912181335Z" level=info msg="connecting to shim 0a038813a4b92c8b9b55fb16f03c281272910b287a64b574d9be3e846d1a8cb4" address="unix:///run/containerd/s/a24b0248553d7c7717b4426e4c2dc3603ff38588b0f7200bfbe8c0cf107e3a4c" namespace=k8s.io protocol=ttrpc version=3 Sep 4 17:50:32.930056 containerd[1564]: time="2025-09-04T17:50:32.930017491Z" level=info msg="connecting to shim 522a279e8326519b1da60523df65facb086985c43a74e6cf09fca71b6ea476cf" address="unix:///run/containerd/s/5976f0dbb20e00ee852e35535115ac1468e7d1718c8fde4094b5f8ac85adc182" namespace=k8s.io protocol=ttrpc version=3 Sep 4 17:50:32.950429 systemd[1]: Started cri-containerd-6d97738bd49563791717dfa8825c7c56eaf272576de0ae131acad41ad9256052.scope - libcontainer container 6d97738bd49563791717dfa8825c7c56eaf272576de0ae131acad41ad9256052. Sep 4 17:50:32.956607 systemd[1]: Started cri-containerd-0a038813a4b92c8b9b55fb16f03c281272910b287a64b574d9be3e846d1a8cb4.scope - libcontainer container 0a038813a4b92c8b9b55fb16f03c281272910b287a64b574d9be3e846d1a8cb4. Sep 4 17:50:32.972328 systemd[1]: Started cri-containerd-522a279e8326519b1da60523df65facb086985c43a74e6cf09fca71b6ea476cf.scope - libcontainer container 522a279e8326519b1da60523df65facb086985c43a74e6cf09fca71b6ea476cf. 
Sep 4 17:50:33.000307 kubelet[2343]: I0904 17:50:33.000265 2343 kubelet_node_status.go:72] "Attempting to register node" node="localhost" Sep 4 17:50:33.000728 kubelet[2343]: E0904 17:50:33.000663 2343 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://10.0.0.60:6443/api/v1/nodes\": dial tcp 10.0.0.60:6443: connect: connection refused" node="localhost" Sep 4 17:50:33.024187 containerd[1564]: time="2025-09-04T17:50:33.021861313Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:fec3f691a145cb26ff55e4af388500b7,Namespace:kube-system,Attempt:0,} returns sandbox id \"0a038813a4b92c8b9b55fb16f03c281272910b287a64b574d9be3e846d1a8cb4\"" Sep 4 17:50:33.028407 containerd[1564]: time="2025-09-04T17:50:33.028363149Z" level=info msg="CreateContainer within sandbox \"0a038813a4b92c8b9b55fb16f03c281272910b287a64b574d9be3e846d1a8cb4\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Sep 4 17:50:33.034832 containerd[1564]: time="2025-09-04T17:50:33.034794332Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:287ffdc808009eb4b157bbcec5660198,Namespace:kube-system,Attempt:0,} returns sandbox id \"6d97738bd49563791717dfa8825c7c56eaf272576de0ae131acad41ad9256052\"" Sep 4 17:50:33.037616 containerd[1564]: time="2025-09-04T17:50:33.037588954Z" level=info msg="CreateContainer within sandbox \"6d97738bd49563791717dfa8825c7c56eaf272576de0ae131acad41ad9256052\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Sep 4 17:50:33.056821 containerd[1564]: time="2025-09-04T17:50:33.055737065Z" level=info msg="Container 96fb9cd2463024ec8930d8bdd89c9fbce700b160f6d1b5bf3f5b465c63ef6a40: CDI devices from CRI Config.CDIDevices: []" Sep 4 17:50:33.056821 containerd[1564]: time="2025-09-04T17:50:33.056009035Z" level=info msg="Container 1dc3cb7871587f49b639ecda61c3a88ce99216781707db21bc67c4934e43f90b: CDI devices from CRI Config.CDIDevices: []" Sep 4 17:50:33.063574 containerd[1564]: time="2025-09-04T17:50:33.063534490Z" level=info msg="CreateContainer within sandbox \"6d97738bd49563791717dfa8825c7c56eaf272576de0ae131acad41ad9256052\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"96fb9cd2463024ec8930d8bdd89c9fbce700b160f6d1b5bf3f5b465c63ef6a40\"" Sep 4 17:50:33.064697 containerd[1564]: time="2025-09-04T17:50:33.064383062Z" level=info msg="StartContainer for \"96fb9cd2463024ec8930d8bdd89c9fbce700b160f6d1b5bf3f5b465c63ef6a40\"" Sep 4 17:50:33.065500 containerd[1564]: time="2025-09-04T17:50:33.065445735Z" level=info msg="connecting to shim 96fb9cd2463024ec8930d8bdd89c9fbce700b160f6d1b5bf3f5b465c63ef6a40" address="unix:///run/containerd/s/da0fcb377563b691bf1551100769e3a2585c1bbfd4e612a6edb47b31390f2f2c" protocol=ttrpc version=3 Sep 4 17:50:33.068267 containerd[1564]: time="2025-09-04T17:50:33.068233164Z" level=info msg="CreateContainer within sandbox \"0a038813a4b92c8b9b55fb16f03c281272910b287a64b574d9be3e846d1a8cb4\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"1dc3cb7871587f49b639ecda61c3a88ce99216781707db21bc67c4934e43f90b\"" Sep 4 17:50:33.069699 containerd[1564]: time="2025-09-04T17:50:33.069651865Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:5dc878868de11c6196259ae42039f4ff,Namespace:kube-system,Attempt:0,} returns sandbox id \"522a279e8326519b1da60523df65facb086985c43a74e6cf09fca71b6ea476cf\"" Sep 4 17:50:33.069810 containerd[1564]: 
time="2025-09-04T17:50:33.069658878Z" level=info msg="StartContainer for \"1dc3cb7871587f49b639ecda61c3a88ce99216781707db21bc67c4934e43f90b\"" Sep 4 17:50:33.070829 containerd[1564]: time="2025-09-04T17:50:33.070799818Z" level=info msg="connecting to shim 1dc3cb7871587f49b639ecda61c3a88ce99216781707db21bc67c4934e43f90b" address="unix:///run/containerd/s/a24b0248553d7c7717b4426e4c2dc3603ff38588b0f7200bfbe8c0cf107e3a4c" protocol=ttrpc version=3 Sep 4 17:50:33.072164 containerd[1564]: time="2025-09-04T17:50:33.072127338Z" level=info msg="CreateContainer within sandbox \"522a279e8326519b1da60523df65facb086985c43a74e6cf09fca71b6ea476cf\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Sep 4 17:50:33.082299 containerd[1564]: time="2025-09-04T17:50:33.081636645Z" level=info msg="Container c1e9a9e4d094f87451e81b99a9723b00b0ef99515a3768de05320b0308716ce5: CDI devices from CRI Config.CDIDevices: []" Sep 4 17:50:33.086284 systemd[1]: Started cri-containerd-96fb9cd2463024ec8930d8bdd89c9fbce700b160f6d1b5bf3f5b465c63ef6a40.scope - libcontainer container 96fb9cd2463024ec8930d8bdd89c9fbce700b160f6d1b5bf3f5b465c63ef6a40. Sep 4 17:50:33.089199 containerd[1564]: time="2025-09-04T17:50:33.089161469Z" level=info msg="CreateContainer within sandbox \"522a279e8326519b1da60523df65facb086985c43a74e6cf09fca71b6ea476cf\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"c1e9a9e4d094f87451e81b99a9723b00b0ef99515a3768de05320b0308716ce5\"" Sep 4 17:50:33.090853 containerd[1564]: time="2025-09-04T17:50:33.090828276Z" level=info msg="StartContainer for \"c1e9a9e4d094f87451e81b99a9723b00b0ef99515a3768de05320b0308716ce5\"" Sep 4 17:50:33.092071 containerd[1564]: time="2025-09-04T17:50:33.092044367Z" level=info msg="connecting to shim c1e9a9e4d094f87451e81b99a9723b00b0ef99515a3768de05320b0308716ce5" address="unix:///run/containerd/s/5976f0dbb20e00ee852e35535115ac1468e7d1718c8fde4094b5f8ac85adc182" protocol=ttrpc version=3 Sep 4 17:50:33.094394 systemd[1]: Started cri-containerd-1dc3cb7871587f49b639ecda61c3a88ce99216781707db21bc67c4934e43f90b.scope - libcontainer container 1dc3cb7871587f49b639ecda61c3a88ce99216781707db21bc67c4934e43f90b. Sep 4 17:50:33.116249 systemd[1]: Started cri-containerd-c1e9a9e4d094f87451e81b99a9723b00b0ef99515a3768de05320b0308716ce5.scope - libcontainer container c1e9a9e4d094f87451e81b99a9723b00b0ef99515a3768de05320b0308716ce5. 
Sep 4 17:50:33.177546 containerd[1564]: time="2025-09-04T17:50:33.177397445Z" level=info msg="StartContainer for \"1dc3cb7871587f49b639ecda61c3a88ce99216781707db21bc67c4934e43f90b\" returns successfully" Sep 4 17:50:33.192926 containerd[1564]: time="2025-09-04T17:50:33.192880146Z" level=info msg="StartContainer for \"c1e9a9e4d094f87451e81b99a9723b00b0ef99515a3768de05320b0308716ce5\" returns successfully" Sep 4 17:50:33.211378 containerd[1564]: time="2025-09-04T17:50:33.211324051Z" level=info msg="StartContainer for \"96fb9cd2463024ec8930d8bdd89c9fbce700b160f6d1b5bf3f5b465c63ef6a40\" returns successfully" Sep 4 17:50:34.551542 kubelet[2343]: E0904 17:50:34.551484 2343 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"localhost\" not found" node="localhost" Sep 4 17:50:34.602712 kubelet[2343]: I0904 17:50:34.602666 2343 kubelet_node_status.go:72] "Attempting to register node" node="localhost" Sep 4 17:50:34.613250 kubelet[2343]: I0904 17:50:34.613209 2343 kubelet_node_status.go:75] "Successfully registered node" node="localhost" Sep 4 17:50:34.613250 kubelet[2343]: E0904 17:50:34.613243 2343 kubelet_node_status.go:535] "Error updating node status, will retry" err="error getting node \"localhost\": node \"localhost\" not found" Sep 4 17:50:34.620673 kubelet[2343]: E0904 17:50:34.620591 2343 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 4 17:50:34.721280 kubelet[2343]: E0904 17:50:34.721220 2343 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 4 17:50:34.822430 kubelet[2343]: E0904 17:50:34.822308 2343 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 4 17:50:34.923296 kubelet[2343]: E0904 17:50:34.923250 2343 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 4 17:50:35.023995 kubelet[2343]: E0904 17:50:35.023946 2343 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 4 17:50:35.124892 kubelet[2343]: E0904 17:50:35.124790 2343 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 4 17:50:35.225777 kubelet[2343]: E0904 17:50:35.225726 2343 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 4 17:50:35.326767 kubelet[2343]: E0904 17:50:35.326720 2343 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 4 17:50:35.427463 kubelet[2343]: E0904 17:50:35.427359 2343 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 4 17:50:35.743469 kubelet[2343]: I0904 17:50:35.743167 2343 apiserver.go:52] "Watching apiserver" Sep 4 17:50:35.762367 kubelet[2343]: I0904 17:50:35.762297 2343 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world" Sep 4 17:50:36.530090 systemd[1]: Reload requested from client PID 2616 ('systemctl') (unit session-7.scope)... Sep 4 17:50:36.530134 systemd[1]: Reloading... Sep 4 17:50:36.614207 zram_generator::config[2665]: No configuration found. Sep 4 17:50:36.837203 systemd[1]: Reloading finished in 306 ms. Sep 4 17:50:36.866857 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... 
Sep 4 17:50:36.887129 systemd[1]: kubelet.service: Deactivated successfully. Sep 4 17:50:36.887493 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 4 17:50:36.887572 systemd[1]: kubelet.service: Consumed 1.216s CPU time, 131.7M memory peak. Sep 4 17:50:36.889790 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 4 17:50:37.133041 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 4 17:50:37.138342 (kubelet)[2704]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Sep 4 17:50:37.183139 kubelet[2704]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 4 17:50:37.183139 kubelet[2704]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Sep 4 17:50:37.183139 kubelet[2704]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 4 17:50:37.183139 kubelet[2704]: I0904 17:50:37.183061 2704 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Sep 4 17:50:37.191337 kubelet[2704]: I0904 17:50:37.191290 2704 server.go:491] "Kubelet version" kubeletVersion="v1.31.8" Sep 4 17:50:37.191337 kubelet[2704]: I0904 17:50:37.191313 2704 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Sep 4 17:50:37.191537 kubelet[2704]: I0904 17:50:37.191529 2704 server.go:934] "Client rotation is on, will bootstrap in background" Sep 4 17:50:37.192785 kubelet[2704]: I0904 17:50:37.192758 2704 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Sep 4 17:50:37.194795 kubelet[2704]: I0904 17:50:37.194744 2704 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Sep 4 17:50:37.205050 kubelet[2704]: I0904 17:50:37.205020 2704 server.go:1431] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Sep 4 17:50:37.210481 kubelet[2704]: I0904 17:50:37.210443 2704 server.go:749] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Sep 4 17:50:37.210631 kubelet[2704]: I0904 17:50:37.210612 2704 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Sep 4 17:50:37.210785 kubelet[2704]: I0904 17:50:37.210755 2704 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Sep 4 17:50:37.210945 kubelet[2704]: I0904 17:50:37.210780 2704 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Sep 4 17:50:37.211045 kubelet[2704]: I0904 17:50:37.210953 2704 topology_manager.go:138] "Creating topology manager with none policy" Sep 4 17:50:37.211045 kubelet[2704]: I0904 17:50:37.210962 2704 container_manager_linux.go:300] "Creating device plugin manager" Sep 4 17:50:37.211045 kubelet[2704]: I0904 17:50:37.210993 2704 state_mem.go:36] "Initialized new in-memory state store" Sep 4 17:50:37.211135 kubelet[2704]: I0904 17:50:37.211127 2704 kubelet.go:408] "Attempting to sync node with API server" Sep 4 17:50:37.211159 kubelet[2704]: I0904 17:50:37.211141 2704 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests" Sep 4 17:50:37.211184 kubelet[2704]: I0904 17:50:37.211173 2704 kubelet.go:314] "Adding apiserver pod source" Sep 4 17:50:37.211211 kubelet[2704]: I0904 17:50:37.211184 2704 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Sep 4 17:50:37.211988 kubelet[2704]: I0904 17:50:37.211957 2704 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1" Sep 4 17:50:37.212417 kubelet[2704]: I0904 17:50:37.212393 2704 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Sep 4 17:50:37.212870 kubelet[2704]: I0904 17:50:37.212844 2704 server.go:1274] "Started kubelet" Sep 4 17:50:37.213449 kubelet[2704]: I0904 17:50:37.213283 2704 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Sep 4 17:50:37.213678 
kubelet[2704]: I0904 17:50:37.213604 2704 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Sep 4 17:50:37.213678 kubelet[2704]: I0904 17:50:37.213671 2704 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Sep 4 17:50:37.214750 kubelet[2704]: I0904 17:50:37.214615 2704 server.go:449] "Adding debug handlers to kubelet server" Sep 4 17:50:37.215706 kubelet[2704]: I0904 17:50:37.215551 2704 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Sep 4 17:50:37.219936 kubelet[2704]: I0904 17:50:37.216864 2704 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Sep 4 17:50:37.219936 kubelet[2704]: E0904 17:50:37.217775 2704 kubelet.go:1478] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Sep 4 17:50:37.219936 kubelet[2704]: E0904 17:50:37.218528 2704 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" Sep 4 17:50:37.219936 kubelet[2704]: I0904 17:50:37.218586 2704 volume_manager.go:289] "Starting Kubelet Volume Manager" Sep 4 17:50:37.219936 kubelet[2704]: I0904 17:50:37.218751 2704 desired_state_of_world_populator.go:147] "Desired state populator starts to run" Sep 4 17:50:37.219936 kubelet[2704]: I0904 17:50:37.218895 2704 reconciler.go:26] "Reconciler: start to sync state" Sep 4 17:50:37.220593 kubelet[2704]: I0904 17:50:37.220542 2704 factory.go:221] Registration of the systemd container factory successfully Sep 4 17:50:37.220708 kubelet[2704]: I0904 17:50:37.220684 2704 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Sep 4 17:50:37.225721 kubelet[2704]: I0904 17:50:37.225260 2704 factory.go:221] Registration of the containerd container factory successfully Sep 4 17:50:37.237856 kubelet[2704]: I0904 17:50:37.237786 2704 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Sep 4 17:50:37.239851 kubelet[2704]: I0904 17:50:37.239809 2704 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Sep 4 17:50:37.239851 kubelet[2704]: I0904 17:50:37.239853 2704 status_manager.go:217] "Starting to sync pod status with apiserver" Sep 4 17:50:37.239947 kubelet[2704]: I0904 17:50:37.239876 2704 kubelet.go:2321] "Starting kubelet main sync loop" Sep 4 17:50:37.240077 kubelet[2704]: E0904 17:50:37.240045 2704 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 4 17:50:37.266910 kubelet[2704]: I0904 17:50:37.266863 2704 cpu_manager.go:214] "Starting CPU manager" policy="none" Sep 4 17:50:37.266910 kubelet[2704]: I0904 17:50:37.266892 2704 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Sep 4 17:50:37.267088 kubelet[2704]: I0904 17:50:37.266933 2704 state_mem.go:36] "Initialized new in-memory state store" Sep 4 17:50:37.267258 kubelet[2704]: I0904 17:50:37.267232 2704 state_mem.go:88] "Updated default CPUSet" cpuSet="" Sep 4 17:50:37.267310 kubelet[2704]: I0904 17:50:37.267252 2704 state_mem.go:96] "Updated CPUSet assignments" assignments={} Sep 4 17:50:37.267310 kubelet[2704]: I0904 17:50:37.267274 2704 policy_none.go:49] "None policy: Start" Sep 4 17:50:37.268091 kubelet[2704]: I0904 17:50:37.268072 2704 memory_manager.go:170] "Starting memorymanager" policy="None" Sep 4 17:50:37.268091 kubelet[2704]: I0904 17:50:37.268094 2704 state_mem.go:35] "Initializing new in-memory state store" Sep 4 17:50:37.268245 kubelet[2704]: I0904 17:50:37.268226 2704 state_mem.go:75] "Updated machine memory state" Sep 4 17:50:37.273085 kubelet[2704]: I0904 17:50:37.273058 2704 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Sep 4 17:50:37.273318 kubelet[2704]: I0904 17:50:37.273292 2704 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 4 17:50:37.273366 kubelet[2704]: I0904 17:50:37.273309 2704 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 4 17:50:37.274090 kubelet[2704]: I0904 17:50:37.273930 2704 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 4 17:50:37.348595 kubelet[2704]: E0904 17:50:37.348495 2704 kubelet.go:1915] "Failed creating a mirror pod for" err="pods \"kube-scheduler-localhost\" already exists" pod="kube-system/kube-scheduler-localhost" Sep 4 17:50:37.375117 kubelet[2704]: I0904 17:50:37.375058 2704 kubelet_node_status.go:72] "Attempting to register node" node="localhost" Sep 4 17:50:37.420367 kubelet[2704]: I0904 17:50:37.420227 2704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/287ffdc808009eb4b157bbcec5660198-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"287ffdc808009eb4b157bbcec5660198\") " pod="kube-system/kube-apiserver-localhost" Sep 4 17:50:37.420367 kubelet[2704]: I0904 17:50:37.420269 2704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/fec3f691a145cb26ff55e4af388500b7-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"fec3f691a145cb26ff55e4af388500b7\") " pod="kube-system/kube-controller-manager-localhost" Sep 4 17:50:37.420367 kubelet[2704]: I0904 17:50:37.420290 2704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: 
\"kubernetes.io/host-path/fec3f691a145cb26ff55e4af388500b7-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"fec3f691a145cb26ff55e4af388500b7\") " pod="kube-system/kube-controller-manager-localhost" Sep 4 17:50:37.420367 kubelet[2704]: I0904 17:50:37.420312 2704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/fec3f691a145cb26ff55e4af388500b7-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"fec3f691a145cb26ff55e4af388500b7\") " pod="kube-system/kube-controller-manager-localhost" Sep 4 17:50:37.420367 kubelet[2704]: I0904 17:50:37.420328 2704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/287ffdc808009eb4b157bbcec5660198-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"287ffdc808009eb4b157bbcec5660198\") " pod="kube-system/kube-apiserver-localhost" Sep 4 17:50:37.420602 kubelet[2704]: I0904 17:50:37.420404 2704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/fec3f691a145cb26ff55e4af388500b7-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"fec3f691a145cb26ff55e4af388500b7\") " pod="kube-system/kube-controller-manager-localhost" Sep 4 17:50:37.420602 kubelet[2704]: I0904 17:50:37.420449 2704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/fec3f691a145cb26ff55e4af388500b7-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"fec3f691a145cb26ff55e4af388500b7\") " pod="kube-system/kube-controller-manager-localhost" Sep 4 17:50:37.420602 kubelet[2704]: I0904 17:50:37.420470 2704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/5dc878868de11c6196259ae42039f4ff-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"5dc878868de11c6196259ae42039f4ff\") " pod="kube-system/kube-scheduler-localhost" Sep 4 17:50:37.420602 kubelet[2704]: I0904 17:50:37.420484 2704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/287ffdc808009eb4b157bbcec5660198-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"287ffdc808009eb4b157bbcec5660198\") " pod="kube-system/kube-apiserver-localhost" Sep 4 17:50:37.645921 kubelet[2704]: I0904 17:50:37.645876 2704 kubelet_node_status.go:111] "Node was previously registered" node="localhost" Sep 4 17:50:37.646097 kubelet[2704]: I0904 17:50:37.645990 2704 kubelet_node_status.go:75] "Successfully registered node" node="localhost" Sep 4 17:50:38.212100 kubelet[2704]: I0904 17:50:38.212033 2704 apiserver.go:52] "Watching apiserver" Sep 4 17:50:38.219628 kubelet[2704]: I0904 17:50:38.219565 2704 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world" Sep 4 17:50:38.288970 kubelet[2704]: I0904 17:50:38.288883 2704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-localhost" podStartSLOduration=1.288858224 podStartE2EDuration="1.288858224s" podCreationTimestamp="2025-09-04 17:50:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-04 
17:50:38.288561838 +0000 UTC m=+1.145725695" watchObservedRunningTime="2025-09-04 17:50:38.288858224 +0000 UTC m=+1.146022081" Sep 4 17:50:38.289360 kubelet[2704]: I0904 17:50:38.289014 2704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-localhost" podStartSLOduration=2.289008947 podStartE2EDuration="2.289008947s" podCreationTimestamp="2025-09-04 17:50:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-04 17:50:38.27973412 +0000 UTC m=+1.136897987" watchObservedRunningTime="2025-09-04 17:50:38.289008947 +0000 UTC m=+1.146172804" Sep 4 17:50:38.296379 kubelet[2704]: I0904 17:50:38.296183 2704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-localhost" podStartSLOduration=1.296164158 podStartE2EDuration="1.296164158s" podCreationTimestamp="2025-09-04 17:50:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-04 17:50:38.295128906 +0000 UTC m=+1.152292763" watchObservedRunningTime="2025-09-04 17:50:38.296164158 +0000 UTC m=+1.153328015" Sep 4 17:50:43.040801 kubelet[2704]: I0904 17:50:43.040757 2704 kuberuntime_manager.go:1635] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Sep 4 17:50:43.041276 kubelet[2704]: I0904 17:50:43.041252 2704 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Sep 4 17:50:43.041312 containerd[1564]: time="2025-09-04T17:50:43.041059281Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Sep 4 17:50:43.724856 systemd[1]: Created slice kubepods-besteffort-pod43464bfb_808b_49e3_b3c0_bda970c4604c.slice - libcontainer container kubepods-besteffort-pod43464bfb_808b_49e3_b3c0_bda970c4604c.slice. 
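For these three static pods, the pod_startup_latency_tracker records above report a podStartE2EDuration that, at least for these entries, equals the gap between podCreationTimestamp and watchObservedRunningTime (the pulling timestamps are the zero value). A quick check of that arithmetic for the kube-controller-manager entry, using only timestamps printed in the record:

    from datetime import datetime, timezone

    # Timestamps copied from the kube-controller-manager-localhost record above
    # (Python datetimes keep microseconds, so the nanosecond tail is truncated).
    pod_created   = datetime(2025, 9, 4, 17, 50, 37, tzinfo=timezone.utc)            # podCreationTimestamp
    watch_running = datetime(2025, 9, 4, 17, 50, 38, 296164, tzinfo=timezone.utc)    # watchObservedRunningTime

    # Prints 1.296164, agreeing (up to truncation) with the logged
    # podStartE2EDuration="1.296164158s" for this pod.
    print((watch_running - pod_created).total_seconds())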
Sep 4 17:50:43.790461 kubelet[2704]: I0904 17:50:43.790403 2704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/43464bfb-808b-49e3-b3c0-bda970c4604c-kube-proxy\") pod \"kube-proxy-gxnkm\" (UID: \"43464bfb-808b-49e3-b3c0-bda970c4604c\") " pod="kube-system/kube-proxy-gxnkm" Sep 4 17:50:43.790849 kubelet[2704]: I0904 17:50:43.790723 2704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/43464bfb-808b-49e3-b3c0-bda970c4604c-xtables-lock\") pod \"kube-proxy-gxnkm\" (UID: \"43464bfb-808b-49e3-b3c0-bda970c4604c\") " pod="kube-system/kube-proxy-gxnkm" Sep 4 17:50:43.790849 kubelet[2704]: I0904 17:50:43.790750 2704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/43464bfb-808b-49e3-b3c0-bda970c4604c-lib-modules\") pod \"kube-proxy-gxnkm\" (UID: \"43464bfb-808b-49e3-b3c0-bda970c4604c\") " pod="kube-system/kube-proxy-gxnkm" Sep 4 17:50:43.790849 kubelet[2704]: I0904 17:50:43.790770 2704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6phw\" (UniqueName: \"kubernetes.io/projected/43464bfb-808b-49e3-b3c0-bda970c4604c-kube-api-access-f6phw\") pod \"kube-proxy-gxnkm\" (UID: \"43464bfb-808b-49e3-b3c0-bda970c4604c\") " pod="kube-system/kube-proxy-gxnkm" Sep 4 17:50:43.989952 systemd[1]: Created slice kubepods-besteffort-pod8b01b0c7_a9a0_4cab_a476_821284268d64.slice - libcontainer container kubepods-besteffort-pod8b01b0c7_a9a0_4cab_a476_821284268d64.slice. Sep 4 17:50:44.036456 containerd[1564]: time="2025-09-04T17:50:44.036422658Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-gxnkm,Uid:43464bfb-808b-49e3-b3c0-bda970c4604c,Namespace:kube-system,Attempt:0,}" Sep 4 17:50:44.092742 kubelet[2704]: I0904 17:50:44.092665 2704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zrrn\" (UniqueName: \"kubernetes.io/projected/8b01b0c7-a9a0-4cab-a476-821284268d64-kube-api-access-4zrrn\") pod \"tigera-operator-58fc44c59b-mz4hn\" (UID: \"8b01b0c7-a9a0-4cab-a476-821284268d64\") " pod="tigera-operator/tigera-operator-58fc44c59b-mz4hn" Sep 4 17:50:44.093227 kubelet[2704]: I0904 17:50:44.092819 2704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/8b01b0c7-a9a0-4cab-a476-821284268d64-var-lib-calico\") pod \"tigera-operator-58fc44c59b-mz4hn\" (UID: \"8b01b0c7-a9a0-4cab-a476-821284268d64\") " pod="tigera-operator/tigera-operator-58fc44c59b-mz4hn" Sep 4 17:50:44.134882 containerd[1564]: time="2025-09-04T17:50:44.134825507Z" level=info msg="connecting to shim 2423f319b6e357be2a77fbb304e4fef1491f90462ad9040834772edd50343434" address="unix:///run/containerd/s/4f52ea965b0b811a87282aeb2239c7bac26f100195f39a40f69f035d295f1c0c" namespace=k8s.io protocol=ttrpc version=3 Sep 4 17:50:44.173258 systemd[1]: Started cri-containerd-2423f319b6e357be2a77fbb304e4fef1491f90462ad9040834772edd50343434.scope - libcontainer container 2423f319b6e357be2a77fbb304e4fef1491f90462ad9040834772edd50343434. 
Sep 4 17:50:44.200113 containerd[1564]: time="2025-09-04T17:50:44.200052321Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-gxnkm,Uid:43464bfb-808b-49e3-b3c0-bda970c4604c,Namespace:kube-system,Attempt:0,} returns sandbox id \"2423f319b6e357be2a77fbb304e4fef1491f90462ad9040834772edd50343434\""
Sep 4 17:50:44.204199 containerd[1564]: time="2025-09-04T17:50:44.204165412Z" level=info msg="CreateContainer within sandbox \"2423f319b6e357be2a77fbb304e4fef1491f90462ad9040834772edd50343434\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}"
Sep 4 17:50:44.216996 containerd[1564]: time="2025-09-04T17:50:44.216939287Z" level=info msg="Container 6d97d1c82e5afc85a5d65722552b5375dcc9022c5e7bd7161090feba1e0b1a28: CDI devices from CRI Config.CDIDevices: []"
Sep 4 17:50:44.225262 containerd[1564]: time="2025-09-04T17:50:44.225230933Z" level=info msg="CreateContainer within sandbox \"2423f319b6e357be2a77fbb304e4fef1491f90462ad9040834772edd50343434\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"6d97d1c82e5afc85a5d65722552b5375dcc9022c5e7bd7161090feba1e0b1a28\""
Sep 4 17:50:44.225732 containerd[1564]: time="2025-09-04T17:50:44.225712316Z" level=info msg="StartContainer for \"6d97d1c82e5afc85a5d65722552b5375dcc9022c5e7bd7161090feba1e0b1a28\""
Sep 4 17:50:44.229945 containerd[1564]: time="2025-09-04T17:50:44.229907033Z" level=info msg="connecting to shim 6d97d1c82e5afc85a5d65722552b5375dcc9022c5e7bd7161090feba1e0b1a28" address="unix:///run/containerd/s/4f52ea965b0b811a87282aeb2239c7bac26f100195f39a40f69f035d295f1c0c" protocol=ttrpc version=3
Sep 4 17:50:44.251246 systemd[1]: Started cri-containerd-6d97d1c82e5afc85a5d65722552b5375dcc9022c5e7bd7161090feba1e0b1a28.scope - libcontainer container 6d97d1c82e5afc85a5d65722552b5375dcc9022c5e7bd7161090feba1e0b1a28.
Sep 4 17:50:44.295216 containerd[1564]: time="2025-09-04T17:50:44.295176158Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-58fc44c59b-mz4hn,Uid:8b01b0c7-a9a0-4cab-a476-821284268d64,Namespace:tigera-operator,Attempt:0,}"
Sep 4 17:50:44.300558 containerd[1564]: time="2025-09-04T17:50:44.300530396Z" level=info msg="StartContainer for \"6d97d1c82e5afc85a5d65722552b5375dcc9022c5e7bd7161090feba1e0b1a28\" returns successfully"
Sep 4 17:50:44.329257 containerd[1564]: time="2025-09-04T17:50:44.329193345Z" level=info msg="connecting to shim db1fbc1d9a49fd008b993e9e3429b828794709b84081d87f6a99c71884143b71" address="unix:///run/containerd/s/085d5c8a0a30570b2779887414eb3264c9d9c62ae3888a28afb557affb43095c" namespace=k8s.io protocol=ttrpc version=3
Sep 4 17:50:44.356253 systemd[1]: Started cri-containerd-db1fbc1d9a49fd008b993e9e3429b828794709b84081d87f6a99c71884143b71.scope - libcontainer container db1fbc1d9a49fd008b993e9e3429b828794709b84081d87f6a99c71884143b71.
Sep 4 17:50:44.406738 containerd[1564]: time="2025-09-04T17:50:44.406685539Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-58fc44c59b-mz4hn,Uid:8b01b0c7-a9a0-4cab-a476-821284268d64,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"db1fbc1d9a49fd008b993e9e3429b828794709b84081d87f6a99c71884143b71\""
Sep 4 17:50:44.408419 containerd[1564]: time="2025-09-04T17:50:44.408383201Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\""
Sep 4 17:50:44.903078 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount568661046.mount: Deactivated successfully.
Sep 4 17:50:45.848262 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3963744876.mount: Deactivated successfully.
Sep 4 17:50:46.294052 containerd[1564]: time="2025-09-04T17:50:46.293909583Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 4 17:50:46.294709 containerd[1564]: time="2025-09-04T17:50:46.294648945Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.6: active requests=0, bytes read=25062609"
Sep 4 17:50:46.295793 containerd[1564]: time="2025-09-04T17:50:46.295746563Z" level=info msg="ImageCreate event name:\"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 4 17:50:46.297896 containerd[1564]: time="2025-09-04T17:50:46.297858790Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 4 17:50:46.298500 containerd[1564]: time="2025-09-04T17:50:46.298465069Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.6\" with image id \"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\", repo tag \"quay.io/tigera/operator:v1.38.6\", repo digest \"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\", size \"25058604\" in 1.890038174s"
Sep 4 17:50:46.298500 containerd[1564]: time="2025-09-04T17:50:46.298497190Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\" returns image reference \"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\""
Sep 4 17:50:46.300396 containerd[1564]: time="2025-09-04T17:50:46.300359008Z" level=info msg="CreateContainer within sandbox \"db1fbc1d9a49fd008b993e9e3429b828794709b84081d87f6a99c71884143b71\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}"
Sep 4 17:50:46.309449 containerd[1564]: time="2025-09-04T17:50:46.309408102Z" level=info msg="Container b3ca802ddd3a8c0d837585ebf3f821f4862d491d3f232eab0002bbceec24f335: CDI devices from CRI Config.CDIDevices: []"
Sep 4 17:50:46.314473 containerd[1564]: time="2025-09-04T17:50:46.314434498Z" level=info msg="CreateContainer within sandbox \"db1fbc1d9a49fd008b993e9e3429b828794709b84081d87f6a99c71884143b71\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"b3ca802ddd3a8c0d837585ebf3f821f4862d491d3f232eab0002bbceec24f335\""
Sep 4 17:50:46.315021 containerd[1564]: time="2025-09-04T17:50:46.314876573Z" level=info msg="StartContainer for \"b3ca802ddd3a8c0d837585ebf3f821f4862d491d3f232eab0002bbceec24f335\""
Sep 4 17:50:46.315751 containerd[1564]: time="2025-09-04T17:50:46.315730655Z" level=info msg="connecting to shim b3ca802ddd3a8c0d837585ebf3f821f4862d491d3f232eab0002bbceec24f335" address="unix:///run/containerd/s/085d5c8a0a30570b2779887414eb3264c9d9c62ae3888a28afb557affb43095c" protocol=ttrpc version=3
Sep 4 17:50:46.368281 systemd[1]: Started cri-containerd-b3ca802ddd3a8c0d837585ebf3f821f4862d491d3f232eab0002bbceec24f335.scope - libcontainer container b3ca802ddd3a8c0d837585ebf3f821f4862d491d3f232eab0002bbceec24f335.
Sep 4 17:50:46.400841 containerd[1564]: time="2025-09-04T17:50:46.400795434Z" level=info msg="StartContainer for \"b3ca802ddd3a8c0d837585ebf3f821f4862d491d3f232eab0002bbceec24f335\" returns successfully"
Sep 4 17:50:47.287081 kubelet[2704]: I0904 17:50:47.286844 2704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-gxnkm" podStartSLOduration=4.286822852 podStartE2EDuration="4.286822852s" podCreationTimestamp="2025-09-04 17:50:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-04 17:50:45.278747134 +0000 UTC m=+8.135910991" watchObservedRunningTime="2025-09-04 17:50:47.286822852 +0000 UTC m=+10.143986709"
Sep 4 17:50:47.287081 kubelet[2704]: I0904 17:50:47.286951 2704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-58fc44c59b-mz4hn" podStartSLOduration=2.395693822 podStartE2EDuration="4.286946278s" podCreationTimestamp="2025-09-04 17:50:43 +0000 UTC" firstStartedPulling="2025-09-04 17:50:44.407923431 +0000 UTC m=+7.265087278" lastFinishedPulling="2025-09-04 17:50:46.299175877 +0000 UTC m=+9.156339734" observedRunningTime="2025-09-04 17:50:47.28691049 +0000 UTC m=+10.144074347" watchObservedRunningTime="2025-09-04 17:50:47.286946278 +0000 UTC m=+10.144110135"
Sep 4 17:50:51.456398 sudo[1785]: pam_unix(sudo:session): session closed for user root
Sep 4 17:50:51.462335 sshd[1784]: Connection closed by 10.0.0.1 port 45428
Sep 4 17:50:51.462006 sshd-session[1781]: pam_unix(sshd:session): session closed for user core
Sep 4 17:50:51.469224 systemd-logind[1545]: Session 7 logged out. Waiting for processes to exit.
Sep 4 17:50:51.469940 systemd[1]: sshd@6-10.0.0.60:22-10.0.0.1:45428.service: Deactivated successfully.
Sep 4 17:50:51.474055 systemd[1]: session-7.scope: Deactivated successfully.
Sep 4 17:50:51.475447 systemd[1]: session-7.scope: Consumed 5.235s CPU time, 224.7M memory peak.
Sep 4 17:50:51.481706 systemd-logind[1545]: Removed session 7.
Sep 4 17:50:52.676430 update_engine[1550]: I20250904 17:50:52.676259 1550 update_attempter.cc:509] Updating boot flags...
Sep 4 17:50:53.995850 systemd[1]: Created slice kubepods-besteffort-poda421b732_480d_41fa_9809_d836f3d662df.slice - libcontainer container kubepods-besteffort-poda421b732_480d_41fa_9809_d836f3d662df.slice.
Sep 4 17:50:54.163077 kubelet[2704]: I0904 17:50:54.163017 2704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wwk5\" (UniqueName: \"kubernetes.io/projected/a421b732-480d-41fa-9809-d836f3d662df-kube-api-access-6wwk5\") pod \"calico-typha-797b89db9c-7zkkl\" (UID: \"a421b732-480d-41fa-9809-d836f3d662df\") " pod="calico-system/calico-typha-797b89db9c-7zkkl"
Sep 4 17:50:54.163077 kubelet[2704]: I0904 17:50:54.163067 2704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a421b732-480d-41fa-9809-d836f3d662df-tigera-ca-bundle\") pod \"calico-typha-797b89db9c-7zkkl\" (UID: \"a421b732-480d-41fa-9809-d836f3d662df\") " pod="calico-system/calico-typha-797b89db9c-7zkkl"
Sep 4 17:50:54.163617 kubelet[2704]: I0904 17:50:54.163146 2704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/a421b732-480d-41fa-9809-d836f3d662df-typha-certs\") pod \"calico-typha-797b89db9c-7zkkl\" (UID: \"a421b732-480d-41fa-9809-d836f3d662df\") " pod="calico-system/calico-typha-797b89db9c-7zkkl"
Sep 4 17:50:54.304979 containerd[1564]: time="2025-09-04T17:50:54.304861889Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-797b89db9c-7zkkl,Uid:a421b732-480d-41fa-9809-d836f3d662df,Namespace:calico-system,Attempt:0,}"
Sep 4 17:50:54.349010 containerd[1564]: time="2025-09-04T17:50:54.348495706Z" level=info msg="connecting to shim ec378020dbf4f9bd9c6c49a6178d6abddd11494883c9c171142a3f0de2f18f4f" address="unix:///run/containerd/s/5cbdfd46b36a0c98d7df8de017093cafac7d10af7af80e415b9816819cf5d7a9" namespace=k8s.io protocol=ttrpc version=3
Sep 4 17:50:54.387281 systemd[1]: Started cri-containerd-ec378020dbf4f9bd9c6c49a6178d6abddd11494883c9c171142a3f0de2f18f4f.scope - libcontainer container ec378020dbf4f9bd9c6c49a6178d6abddd11494883c9c171142a3f0de2f18f4f.
Sep 4 17:50:54.397769 systemd[1]: Created slice kubepods-besteffort-pod95bb4b8f_568d_4d78_93c9_b37d42ec5842.slice - libcontainer container kubepods-besteffort-pod95bb4b8f_568d_4d78_93c9_b37d42ec5842.slice.
Sep 4 17:50:54.451042 containerd[1564]: time="2025-09-04T17:50:54.450986645Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-797b89db9c-7zkkl,Uid:a421b732-480d-41fa-9809-d836f3d662df,Namespace:calico-system,Attempt:0,} returns sandbox id \"ec378020dbf4f9bd9c6c49a6178d6abddd11494883c9c171142a3f0de2f18f4f\""
Sep 4 17:50:54.455088 containerd[1564]: time="2025-09-04T17:50:54.453656409Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\""
Sep 4 17:50:54.565276 kubelet[2704]: I0904 17:50:54.564906 2704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/95bb4b8f-568d-4d78-93c9-b37d42ec5842-var-run-calico\") pod \"calico-node-cxlkt\" (UID: \"95bb4b8f-568d-4d78-93c9-b37d42ec5842\") " pod="calico-system/calico-node-cxlkt"
Sep 4 17:50:54.565276 kubelet[2704]: I0904 17:50:54.564950 2704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/95bb4b8f-568d-4d78-93c9-b37d42ec5842-cni-bin-dir\") pod \"calico-node-cxlkt\" (UID: \"95bb4b8f-568d-4d78-93c9-b37d42ec5842\") " pod="calico-system/calico-node-cxlkt"
Sep 4 17:50:54.565276 kubelet[2704]: I0904 17:50:54.564966 2704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/95bb4b8f-568d-4d78-93c9-b37d42ec5842-cni-net-dir\") pod \"calico-node-cxlkt\" (UID: \"95bb4b8f-568d-4d78-93c9-b37d42ec5842\") " pod="calico-system/calico-node-cxlkt"
Sep 4 17:50:54.565276 kubelet[2704]: I0904 17:50:54.564981 2704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/95bb4b8f-568d-4d78-93c9-b37d42ec5842-xtables-lock\") pod \"calico-node-cxlkt\" (UID: \"95bb4b8f-568d-4d78-93c9-b37d42ec5842\") " pod="calico-system/calico-node-cxlkt"
Sep 4 17:50:54.565276 kubelet[2704]: I0904 17:50:54.564996 2704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/95bb4b8f-568d-4d78-93c9-b37d42ec5842-lib-modules\") pod \"calico-node-cxlkt\" (UID: \"95bb4b8f-568d-4d78-93c9-b37d42ec5842\") " pod="calico-system/calico-node-cxlkt"
Sep 4 17:50:54.565527 kubelet[2704]: I0904 17:50:54.565011 2704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/95bb4b8f-568d-4d78-93c9-b37d42ec5842-var-lib-calico\") pod \"calico-node-cxlkt\" (UID: \"95bb4b8f-568d-4d78-93c9-b37d42ec5842\") " pod="calico-system/calico-node-cxlkt"
Sep 4 17:50:54.565527 kubelet[2704]: I0904 17:50:54.565030 2704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/95bb4b8f-568d-4d78-93c9-b37d42ec5842-flexvol-driver-host\") pod \"calico-node-cxlkt\" (UID: \"95bb4b8f-568d-4d78-93c9-b37d42ec5842\") " pod="calico-system/calico-node-cxlkt"
Sep 4 17:50:54.565527 kubelet[2704]: I0904 17:50:54.565044 2704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msblk\" (UniqueName: \"kubernetes.io/projected/95bb4b8f-568d-4d78-93c9-b37d42ec5842-kube-api-access-msblk\") pod \"calico-node-cxlkt\" (UID: \"95bb4b8f-568d-4d78-93c9-b37d42ec5842\") " pod="calico-system/calico-node-cxlkt"
Sep 4 17:50:54.565527 kubelet[2704]: I0904 17:50:54.565086 2704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/95bb4b8f-568d-4d78-93c9-b37d42ec5842-policysync\") pod \"calico-node-cxlkt\" (UID: \"95bb4b8f-568d-4d78-93c9-b37d42ec5842\") " pod="calico-system/calico-node-cxlkt"
Sep 4 17:50:54.565527 kubelet[2704]: I0904 17:50:54.565122 2704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/95bb4b8f-568d-4d78-93c9-b37d42ec5842-tigera-ca-bundle\") pod \"calico-node-cxlkt\" (UID: \"95bb4b8f-568d-4d78-93c9-b37d42ec5842\") " pod="calico-system/calico-node-cxlkt"
Sep 4 17:50:54.565652 kubelet[2704]: I0904 17:50:54.565138 2704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/95bb4b8f-568d-4d78-93c9-b37d42ec5842-cni-log-dir\") pod \"calico-node-cxlkt\" (UID: \"95bb4b8f-568d-4d78-93c9-b37d42ec5842\") " pod="calico-system/calico-node-cxlkt"
Sep 4 17:50:54.565652 kubelet[2704]: I0904 17:50:54.565169 2704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/95bb4b8f-568d-4d78-93c9-b37d42ec5842-node-certs\") pod \"calico-node-cxlkt\" (UID: \"95bb4b8f-568d-4d78-93c9-b37d42ec5842\") " pod="calico-system/calico-node-cxlkt"
Sep 4 17:50:54.625047 kubelet[2704]: E0904 17:50:54.624981 2704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8mncz" podUID="81fd4b06-1bc3-423c-9205-8768e1e4b44a"
Sep 4 17:50:54.665486 kubelet[2704]: I0904 17:50:54.665431 2704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/81fd4b06-1bc3-423c-9205-8768e1e4b44a-kubelet-dir\") pod \"csi-node-driver-8mncz\" (UID: \"81fd4b06-1bc3-423c-9205-8768e1e4b44a\") " pod="calico-system/csi-node-driver-8mncz"
Sep 4 17:50:54.665486 kubelet[2704]: I0904 17:50:54.665464 2704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/81fd4b06-1bc3-423c-9205-8768e1e4b44a-registration-dir\") pod \"csi-node-driver-8mncz\" (UID: \"81fd4b06-1bc3-423c-9205-8768e1e4b44a\") " pod="calico-system/csi-node-driver-8mncz"
Sep 4 17:50:54.665486 kubelet[2704]: I0904 17:50:54.665495 2704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/81fd4b06-1bc3-423c-9205-8768e1e4b44a-socket-dir\") pod \"csi-node-driver-8mncz\" (UID: \"81fd4b06-1bc3-423c-9205-8768e1e4b44a\") " pod="calico-system/csi-node-driver-8mncz"
Sep 4 17:50:54.665730 kubelet[2704]: I0904 17:50:54.665568 2704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/81fd4b06-1bc3-423c-9205-8768e1e4b44a-varrun\") pod \"csi-node-driver-8mncz\" (UID: \"81fd4b06-1bc3-423c-9205-8768e1e4b44a\") " pod="calico-system/csi-node-driver-8mncz"
Sep 4 17:50:54.665730 kubelet[2704]: I0904 17:50:54.665590
2704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2b4l\" (UniqueName: \"kubernetes.io/projected/81fd4b06-1bc3-423c-9205-8768e1e4b44a-kube-api-access-h2b4l\") pod \"csi-node-driver-8mncz\" (UID: \"81fd4b06-1bc3-423c-9205-8768e1e4b44a\") " pod="calico-system/csi-node-driver-8mncz" Sep 4 17:50:54.667303 kubelet[2704]: E0904 17:50:54.667270 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:50:54.667371 kubelet[2704]: W0904 17:50:54.667301 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:50:54.667371 kubelet[2704]: E0904 17:50:54.667362 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:50:54.668303 kubelet[2704]: E0904 17:50:54.668261 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:50:54.668303 kubelet[2704]: W0904 17:50:54.668279 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:50:54.668544 kubelet[2704]: E0904 17:50:54.668430 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:50:54.669331 kubelet[2704]: E0904 17:50:54.669313 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:50:54.669331 kubelet[2704]: W0904 17:50:54.669328 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:50:54.669522 kubelet[2704]: E0904 17:50:54.669472 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:50:54.669615 kubelet[2704]: E0904 17:50:54.669590 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:50:54.669662 kubelet[2704]: W0904 17:50:54.669614 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:50:54.669713 kubelet[2704]: E0904 17:50:54.669681 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 17:50:54.670098 kubelet[2704]: E0904 17:50:54.670033 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:50:54.670098 kubelet[2704]: W0904 17:50:54.670052 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:50:54.670347 kubelet[2704]: E0904 17:50:54.670277 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:50:54.670817 kubelet[2704]: E0904 17:50:54.670795 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:50:54.670817 kubelet[2704]: W0904 17:50:54.670816 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:50:54.670903 kubelet[2704]: E0904 17:50:54.670841 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:50:54.671095 kubelet[2704]: E0904 17:50:54.671068 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:50:54.671095 kubelet[2704]: W0904 17:50:54.671082 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:50:54.671231 kubelet[2704]: E0904 17:50:54.671151 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:50:54.673248 kubelet[2704]: E0904 17:50:54.672719 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:50:54.673248 kubelet[2704]: W0904 17:50:54.672734 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:50:54.673248 kubelet[2704]: E0904 17:50:54.672754 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:50:54.673248 kubelet[2704]: E0904 17:50:54.672983 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:50:54.673248 kubelet[2704]: W0904 17:50:54.672992 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:50:54.673248 kubelet[2704]: E0904 17:50:54.673034 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 17:50:54.673939 kubelet[2704]: E0904 17:50:54.673818 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:50:54.673939 kubelet[2704]: W0904 17:50:54.673834 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:50:54.673939 kubelet[2704]: E0904 17:50:54.673846 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:50:54.676988 kubelet[2704]: E0904 17:50:54.676963 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:50:54.676988 kubelet[2704]: W0904 17:50:54.676979 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:50:54.676988 kubelet[2704]: E0904 17:50:54.676990 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:50:54.702596 containerd[1564]: time="2025-09-04T17:50:54.702545361Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-cxlkt,Uid:95bb4b8f-568d-4d78-93c9-b37d42ec5842,Namespace:calico-system,Attempt:0,}" Sep 4 17:50:54.732058 containerd[1564]: time="2025-09-04T17:50:54.731980825Z" level=info msg="connecting to shim a4c3203b0bc346b65c5172fe08bc6c39128a323fcad9c4db3d14666873eb4779" address="unix:///run/containerd/s/18ee2edc6e34367e550ed84cf11996513bb697fe0502ab5e43b2f609e904d663" namespace=k8s.io protocol=ttrpc version=3 Sep 4 17:50:54.763284 systemd[1]: Started cri-containerd-a4c3203b0bc346b65c5172fe08bc6c39128a323fcad9c4db3d14666873eb4779.scope - libcontainer container a4c3203b0bc346b65c5172fe08bc6c39128a323fcad9c4db3d14666873eb4779. Sep 4 17:50:54.767129 kubelet[2704]: E0904 17:50:54.767049 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:50:54.767129 kubelet[2704]: W0904 17:50:54.767070 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:50:54.767431 kubelet[2704]: E0904 17:50:54.767091 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:50:54.767661 kubelet[2704]: E0904 17:50:54.767633 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:50:54.767661 kubelet[2704]: W0904 17:50:54.767646 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:50:54.767804 kubelet[2704]: E0904 17:50:54.767752 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 17:50:54.768026 kubelet[2704]: E0904 17:50:54.768001 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:50:54.768026 kubelet[2704]: W0904 17:50:54.768012 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:50:54.768221 kubelet[2704]: E0904 17:50:54.768208 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:50:54.768335 kubelet[2704]: E0904 17:50:54.768324 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:50:54.768394 kubelet[2704]: W0904 17:50:54.768382 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:50:54.768501 kubelet[2704]: E0904 17:50:54.768448 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:50:54.768739 kubelet[2704]: E0904 17:50:54.768726 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:50:54.768801 kubelet[2704]: W0904 17:50:54.768790 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:50:54.768865 kubelet[2704]: E0904 17:50:54.768855 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:50:54.770243 kubelet[2704]: E0904 17:50:54.770229 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:50:54.770403 kubelet[2704]: W0904 17:50:54.770321 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:50:54.770403 kubelet[2704]: E0904 17:50:54.770350 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:50:54.770702 kubelet[2704]: E0904 17:50:54.770688 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:50:54.770789 kubelet[2704]: W0904 17:50:54.770747 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:50:54.770853 kubelet[2704]: E0904 17:50:54.770838 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 17:50:54.771308 kubelet[2704]: E0904 17:50:54.771047 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:50:54.771428 kubelet[2704]: W0904 17:50:54.771371 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:50:54.771510 kubelet[2704]: E0904 17:50:54.771464 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:50:54.771835 kubelet[2704]: E0904 17:50:54.771809 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:50:54.771835 kubelet[2704]: W0904 17:50:54.771821 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:50:54.772128 kubelet[2704]: E0904 17:50:54.772080 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:50:54.772626 kubelet[2704]: E0904 17:50:54.772510 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:50:54.772737 kubelet[2704]: W0904 17:50:54.772522 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:50:54.773023 kubelet[2704]: E0904 17:50:54.773007 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:50:54.773884 kubelet[2704]: E0904 17:50:54.773777 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:50:54.773884 kubelet[2704]: W0904 17:50:54.773790 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:50:54.773986 kubelet[2704]: E0904 17:50:54.773973 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:50:54.774119 kubelet[2704]: E0904 17:50:54.774086 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:50:54.774119 kubelet[2704]: W0904 17:50:54.774096 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:50:54.774265 kubelet[2704]: E0904 17:50:54.774243 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 17:50:54.775071 kubelet[2704]: E0904 17:50:54.775032 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:50:54.775071 kubelet[2704]: W0904 17:50:54.775066 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:50:54.775173 kubelet[2704]: E0904 17:50:54.775144 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:50:54.775451 kubelet[2704]: E0904 17:50:54.775400 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:50:54.775451 kubelet[2704]: W0904 17:50:54.775416 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:50:54.775523 kubelet[2704]: E0904 17:50:54.775493 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:50:54.775691 kubelet[2704]: E0904 17:50:54.775669 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:50:54.775691 kubelet[2704]: W0904 17:50:54.775684 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:50:54.775748 kubelet[2704]: E0904 17:50:54.775742 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:50:54.775927 kubelet[2704]: E0904 17:50:54.775907 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:50:54.775927 kubelet[2704]: W0904 17:50:54.775921 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:50:54.776166 kubelet[2704]: E0904 17:50:54.775972 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:50:54.776199 kubelet[2704]: E0904 17:50:54.776170 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:50:54.776199 kubelet[2704]: W0904 17:50:54.776178 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:50:54.776240 kubelet[2704]: E0904 17:50:54.776199 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 17:50:54.776456 kubelet[2704]: E0904 17:50:54.776435 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:50:54.776456 kubelet[2704]: W0904 17:50:54.776448 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:50:54.776533 kubelet[2704]: E0904 17:50:54.776487 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:50:54.776776 kubelet[2704]: E0904 17:50:54.776758 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:50:54.776776 kubelet[2704]: W0904 17:50:54.776771 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:50:54.776835 kubelet[2704]: E0904 17:50:54.776785 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:50:54.777045 kubelet[2704]: E0904 17:50:54.777028 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:50:54.777045 kubelet[2704]: W0904 17:50:54.777039 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:50:54.777155 kubelet[2704]: E0904 17:50:54.777136 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:50:54.777344 kubelet[2704]: E0904 17:50:54.777328 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:50:54.777344 kubelet[2704]: W0904 17:50:54.777340 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:50:54.777408 kubelet[2704]: E0904 17:50:54.777389 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:50:54.777857 kubelet[2704]: E0904 17:50:54.777837 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:50:54.777857 kubelet[2704]: W0904 17:50:54.777851 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:50:54.777951 kubelet[2704]: E0904 17:50:54.777934 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 17:50:54.778282 kubelet[2704]: E0904 17:50:54.778138 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:50:54.778282 kubelet[2704]: W0904 17:50:54.778151 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:50:54.778282 kubelet[2704]: E0904 17:50:54.778171 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:50:54.778431 kubelet[2704]: E0904 17:50:54.778406 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:50:54.778431 kubelet[2704]: W0904 17:50:54.778426 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:50:54.778542 kubelet[2704]: E0904 17:50:54.778436 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:50:54.779123 kubelet[2704]: E0904 17:50:54.778914 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:50:54.779123 kubelet[2704]: W0904 17:50:54.778931 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:50:54.779123 kubelet[2704]: E0904 17:50:54.778943 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:50:54.790394 kubelet[2704]: E0904 17:50:54.790039 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:50:54.790394 kubelet[2704]: W0904 17:50:54.790386 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:50:54.790634 kubelet[2704]: E0904 17:50:54.790408 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Sep 4 17:50:54.807386 containerd[1564]: time="2025-09-04T17:50:54.807344158Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-cxlkt,Uid:95bb4b8f-568d-4d78-93c9-b37d42ec5842,Namespace:calico-system,Attempt:0,} returns sandbox id \"a4c3203b0bc346b65c5172fe08bc6c39128a323fcad9c4db3d14666873eb4779\""
Sep 4 17:50:56.242326 kubelet[2704]: E0904 17:50:56.242244 2704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8mncz" podUID="81fd4b06-1bc3-423c-9205-8768e1e4b44a"
Sep 4 17:50:56.348703 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount66239142.mount: Deactivated successfully.
Sep 4 17:50:57.015798 containerd[1564]: time="2025-09-04T17:50:57.015710221Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 4 17:50:57.016599 containerd[1564]: time="2025-09-04T17:50:57.016542978Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.3: active requests=0, bytes read=35237389"
Sep 4 17:50:57.018021 containerd[1564]: time="2025-09-04T17:50:57.017998753Z" level=info msg="ImageCreate event name:\"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 4 17:50:57.020092 containerd[1564]: time="2025-09-04T17:50:57.020033416Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 4 17:50:57.020531 containerd[1564]: time="2025-09-04T17:50:57.020477385Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.3\" with image id \"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\", size \"35237243\" in 2.565702695s"
Sep 4 17:50:57.020531 containerd[1564]: time="2025-09-04T17:50:57.020508966Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\" returns image reference \"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\""
Sep 4 17:50:57.022274 containerd[1564]: time="2025-09-04T17:50:57.022243770Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\""
Sep 4 17:50:57.030727 containerd[1564]: time="2025-09-04T17:50:57.030681042Z" level=info msg="CreateContainer within sandbox \"ec378020dbf4f9bd9c6c49a6178d6abddd11494883c9c171142a3f0de2f18f4f\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}"
Sep 4 17:50:57.042320 containerd[1564]: time="2025-09-04T17:50:57.042264942Z" level=info msg="Container 6b18da4cc8d2d1fbdcbbf0faf13ac6f5b5d33c9acd4296bc8faec5d97f681594: CDI devices from CRI Config.CDIDevices: []"
Sep 4 17:50:57.053948 containerd[1564]: time="2025-09-04T17:50:57.053899878Z" level=info msg="CreateContainer within sandbox \"ec378020dbf4f9bd9c6c49a6178d6abddd11494883c9c171142a3f0de2f18f4f\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"6b18da4cc8d2d1fbdcbbf0faf13ac6f5b5d33c9acd4296bc8faec5d97f681594\""
Sep 4 17:50:57.054327 containerd[1564]: time="2025-09-04T17:50:57.054306578Z" level=info
msg="StartContainer for \"6b18da4cc8d2d1fbdcbbf0faf13ac6f5b5d33c9acd4296bc8faec5d97f681594\"" Sep 4 17:50:57.055452 containerd[1564]: time="2025-09-04T17:50:57.055430036Z" level=info msg="connecting to shim 6b18da4cc8d2d1fbdcbbf0faf13ac6f5b5d33c9acd4296bc8faec5d97f681594" address="unix:///run/containerd/s/5cbdfd46b36a0c98d7df8de017093cafac7d10af7af80e415b9816819cf5d7a9" protocol=ttrpc version=3 Sep 4 17:50:57.080247 systemd[1]: Started cri-containerd-6b18da4cc8d2d1fbdcbbf0faf13ac6f5b5d33c9acd4296bc8faec5d97f681594.scope - libcontainer container 6b18da4cc8d2d1fbdcbbf0faf13ac6f5b5d33c9acd4296bc8faec5d97f681594. Sep 4 17:50:57.137827 containerd[1564]: time="2025-09-04T17:50:57.137758383Z" level=info msg="StartContainer for \"6b18da4cc8d2d1fbdcbbf0faf13ac6f5b5d33c9acd4296bc8faec5d97f681594\" returns successfully" Sep 4 17:50:57.325150 kubelet[2704]: I0904 17:50:57.324863 2704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-797b89db9c-7zkkl" podStartSLOduration=1.756550472 podStartE2EDuration="4.324835753s" podCreationTimestamp="2025-09-04 17:50:53 +0000 UTC" firstStartedPulling="2025-09-04 17:50:54.453070017 +0000 UTC m=+17.310233874" lastFinishedPulling="2025-09-04 17:50:57.021355298 +0000 UTC m=+19.878519155" observedRunningTime="2025-09-04 17:50:57.323727315 +0000 UTC m=+20.180891172" watchObservedRunningTime="2025-09-04 17:50:57.324835753 +0000 UTC m=+20.181999610" Sep 4 17:50:57.385072 kubelet[2704]: E0904 17:50:57.385006 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:50:57.385072 kubelet[2704]: W0904 17:50:57.385035 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:50:57.385072 kubelet[2704]: E0904 17:50:57.385060 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:50:57.385398 kubelet[2704]: E0904 17:50:57.385379 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:50:57.385398 kubelet[2704]: W0904 17:50:57.385392 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:50:57.385456 kubelet[2704]: E0904 17:50:57.385401 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:50:57.385635 kubelet[2704]: E0904 17:50:57.385610 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:50:57.385635 kubelet[2704]: W0904 17:50:57.385622 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:50:57.385635 kubelet[2704]: E0904 17:50:57.385632 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 17:50:57.385813 kubelet[2704]: E0904 17:50:57.385797 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:50:57.385813 kubelet[2704]: W0904 17:50:57.385809 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:50:57.385867 kubelet[2704]: E0904 17:50:57.385817 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:50:57.386000 kubelet[2704]: E0904 17:50:57.385985 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:50:57.386000 kubelet[2704]: W0904 17:50:57.385995 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:50:57.386143 kubelet[2704]: E0904 17:50:57.386003 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:50:57.386186 kubelet[2704]: E0904 17:50:57.386168 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:50:57.386186 kubelet[2704]: W0904 17:50:57.386181 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:50:57.386238 kubelet[2704]: E0904 17:50:57.386189 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:50:57.386360 kubelet[2704]: E0904 17:50:57.386344 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:50:57.386360 kubelet[2704]: W0904 17:50:57.386355 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:50:57.386407 kubelet[2704]: E0904 17:50:57.386362 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:50:57.386527 kubelet[2704]: E0904 17:50:57.386512 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:50:57.386527 kubelet[2704]: W0904 17:50:57.386523 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:50:57.386596 kubelet[2704]: E0904 17:50:57.386531 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 17:50:57.386711 kubelet[2704]: E0904 17:50:57.386695 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:50:57.386711 kubelet[2704]: W0904 17:50:57.386706 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:50:57.386759 kubelet[2704]: E0904 17:50:57.386714 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:50:57.386871 kubelet[2704]: E0904 17:50:57.386856 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:50:57.386871 kubelet[2704]: W0904 17:50:57.386867 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:50:57.386924 kubelet[2704]: E0904 17:50:57.386875 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:50:57.387034 kubelet[2704]: E0904 17:50:57.387019 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:50:57.387034 kubelet[2704]: W0904 17:50:57.387029 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:50:57.387078 kubelet[2704]: E0904 17:50:57.387037 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:50:57.387221 kubelet[2704]: E0904 17:50:57.387206 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:50:57.387221 kubelet[2704]: W0904 17:50:57.387217 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:50:57.387277 kubelet[2704]: E0904 17:50:57.387226 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:50:57.387415 kubelet[2704]: E0904 17:50:57.387399 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:50:57.387415 kubelet[2704]: W0904 17:50:57.387410 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:50:57.387469 kubelet[2704]: E0904 17:50:57.387418 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 17:50:57.387579 kubelet[2704]: E0904 17:50:57.387564 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:50:57.387579 kubelet[2704]: W0904 17:50:57.387576 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:50:57.387641 kubelet[2704]: E0904 17:50:57.387595 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:50:57.387760 kubelet[2704]: E0904 17:50:57.387744 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:50:57.387760 kubelet[2704]: W0904 17:50:57.387755 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:50:57.387807 kubelet[2704]: E0904 17:50:57.387764 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:50:57.390155 kubelet[2704]: E0904 17:50:57.390132 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:50:57.390155 kubelet[2704]: W0904 17:50:57.390145 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:50:57.390155 kubelet[2704]: E0904 17:50:57.390154 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:50:57.390415 kubelet[2704]: E0904 17:50:57.390388 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:50:57.390415 kubelet[2704]: W0904 17:50:57.390401 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:50:57.390474 kubelet[2704]: E0904 17:50:57.390422 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:50:57.390671 kubelet[2704]: E0904 17:50:57.390643 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:50:57.390671 kubelet[2704]: W0904 17:50:57.390655 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:50:57.390722 kubelet[2704]: E0904 17:50:57.390675 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 17:50:57.390929 kubelet[2704]: E0904 17:50:57.390901 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:50:57.390929 kubelet[2704]: W0904 17:50:57.390915 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:50:57.390981 kubelet[2704]: E0904 17:50:57.390937 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:50:57.391152 kubelet[2704]: E0904 17:50:57.391133 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:50:57.391152 kubelet[2704]: W0904 17:50:57.391144 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:50:57.391202 kubelet[2704]: E0904 17:50:57.391164 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:50:57.391391 kubelet[2704]: E0904 17:50:57.391371 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:50:57.391391 kubelet[2704]: W0904 17:50:57.391383 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:50:57.391446 kubelet[2704]: E0904 17:50:57.391403 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:50:57.391654 kubelet[2704]: E0904 17:50:57.391635 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:50:57.391654 kubelet[2704]: W0904 17:50:57.391647 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:50:57.391709 kubelet[2704]: E0904 17:50:57.391694 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:50:57.392194 kubelet[2704]: E0904 17:50:57.392165 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:50:57.392194 kubelet[2704]: W0904 17:50:57.392178 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:50:57.392250 kubelet[2704]: E0904 17:50:57.392231 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 17:50:57.392407 kubelet[2704]: E0904 17:50:57.392381 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:50:57.392407 kubelet[2704]: W0904 17:50:57.392394 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:50:57.392456 kubelet[2704]: E0904 17:50:57.392440 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:50:57.392630 kubelet[2704]: E0904 17:50:57.392609 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:50:57.392630 kubelet[2704]: W0904 17:50:57.392621 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:50:57.392677 kubelet[2704]: E0904 17:50:57.392641 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:50:57.392892 kubelet[2704]: E0904 17:50:57.392873 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:50:57.392892 kubelet[2704]: W0904 17:50:57.392885 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:50:57.392935 kubelet[2704]: E0904 17:50:57.392906 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:50:57.393113 kubelet[2704]: E0904 17:50:57.393091 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:50:57.393143 kubelet[2704]: W0904 17:50:57.393117 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:50:57.393143 kubelet[2704]: E0904 17:50:57.393129 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:50:57.393334 kubelet[2704]: E0904 17:50:57.393315 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:50:57.393334 kubelet[2704]: W0904 17:50:57.393327 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:50:57.393382 kubelet[2704]: E0904 17:50:57.393338 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 17:50:57.393732 kubelet[2704]: E0904 17:50:57.393703 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:50:57.393732 kubelet[2704]: W0904 17:50:57.393717 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:50:57.393782 kubelet[2704]: E0904 17:50:57.393765 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:50:57.393973 kubelet[2704]: E0904 17:50:57.393946 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:50:57.393973 kubelet[2704]: W0904 17:50:57.393959 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:50:57.393973 kubelet[2704]: E0904 17:50:57.393967 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:50:57.394162 kubelet[2704]: E0904 17:50:57.394142 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:50:57.394162 kubelet[2704]: W0904 17:50:57.394155 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:50:57.394162 kubelet[2704]: E0904 17:50:57.394163 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:50:57.394361 kubelet[2704]: E0904 17:50:57.394343 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:50:57.394361 kubelet[2704]: W0904 17:50:57.394355 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:50:57.394404 kubelet[2704]: E0904 17:50:57.394363 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:50:57.394843 kubelet[2704]: E0904 17:50:57.394814 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:50:57.394843 kubelet[2704]: W0904 17:50:57.394828 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:50:57.394843 kubelet[2704]: E0904 17:50:57.394836 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 17:50:58.241250 kubelet[2704]: E0904 17:50:58.241144 2704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8mncz" podUID="81fd4b06-1bc3-423c-9205-8768e1e4b44a" Sep 4 17:50:58.302876 kubelet[2704]: I0904 17:50:58.302532 2704 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 4 17:50:58.394597 kubelet[2704]: E0904 17:50:58.394544 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:50:58.394597 kubelet[2704]: W0904 17:50:58.394569 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:50:58.394597 kubelet[2704]: E0904 17:50:58.394602 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:50:58.395069 kubelet[2704]: E0904 17:50:58.394837 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:50:58.395069 kubelet[2704]: W0904 17:50:58.394846 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:50:58.395069 kubelet[2704]: E0904 17:50:58.394873 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:50:58.395069 kubelet[2704]: E0904 17:50:58.395059 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:50:58.395069 kubelet[2704]: W0904 17:50:58.395067 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:50:58.395210 kubelet[2704]: E0904 17:50:58.395077 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:50:58.395293 kubelet[2704]: E0904 17:50:58.395266 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:50:58.395293 kubelet[2704]: W0904 17:50:58.395280 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:50:58.395293 kubelet[2704]: E0904 17:50:58.395288 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 17:50:58.395475 kubelet[2704]: E0904 17:50:58.395453 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:50:58.395475 kubelet[2704]: W0904 17:50:58.395465 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:50:58.395475 kubelet[2704]: E0904 17:50:58.395472 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:50:58.395766 kubelet[2704]: E0904 17:50:58.395734 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:50:58.395766 kubelet[2704]: W0904 17:50:58.395754 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:50:58.395847 kubelet[2704]: E0904 17:50:58.395779 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:50:58.395988 kubelet[2704]: E0904 17:50:58.395973 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:50:58.395988 kubelet[2704]: W0904 17:50:58.395982 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:50:58.395988 kubelet[2704]: E0904 17:50:58.395990 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:50:58.396218 kubelet[2704]: E0904 17:50:58.396199 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:50:58.396218 kubelet[2704]: W0904 17:50:58.396212 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:50:58.396306 kubelet[2704]: E0904 17:50:58.396222 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:50:58.396463 kubelet[2704]: E0904 17:50:58.396444 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:50:58.396463 kubelet[2704]: W0904 17:50:58.396455 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:50:58.396463 kubelet[2704]: E0904 17:50:58.396464 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 17:50:58.396679 kubelet[2704]: E0904 17:50:58.396660 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:50:58.396679 kubelet[2704]: W0904 17:50:58.396671 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:50:58.396679 kubelet[2704]: E0904 17:50:58.396679 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:50:58.396871 kubelet[2704]: E0904 17:50:58.396852 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:50:58.396871 kubelet[2704]: W0904 17:50:58.396863 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:50:58.396871 kubelet[2704]: E0904 17:50:58.396870 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:50:58.397136 kubelet[2704]: E0904 17:50:58.397116 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:50:58.397136 kubelet[2704]: W0904 17:50:58.397129 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:50:58.397230 kubelet[2704]: E0904 17:50:58.397140 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:50:58.397338 kubelet[2704]: E0904 17:50:58.397317 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:50:58.397338 kubelet[2704]: W0904 17:50:58.397331 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:50:58.397338 kubelet[2704]: E0904 17:50:58.397338 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:50:58.397535 kubelet[2704]: E0904 17:50:58.397514 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:50:58.397535 kubelet[2704]: W0904 17:50:58.397526 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:50:58.397535 kubelet[2704]: E0904 17:50:58.397536 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 17:50:58.397757 kubelet[2704]: E0904 17:50:58.397738 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:50:58.397757 kubelet[2704]: W0904 17:50:58.397749 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:50:58.397757 kubelet[2704]: E0904 17:50:58.397758 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:50:58.398081 kubelet[2704]: E0904 17:50:58.398060 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:50:58.398081 kubelet[2704]: W0904 17:50:58.398072 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:50:58.398081 kubelet[2704]: E0904 17:50:58.398080 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:50:58.398391 kubelet[2704]: E0904 17:50:58.398337 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:50:58.398391 kubelet[2704]: W0904 17:50:58.398361 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:50:58.398491 kubelet[2704]: E0904 17:50:58.398402 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:50:58.398705 kubelet[2704]: E0904 17:50:58.398688 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:50:58.398705 kubelet[2704]: W0904 17:50:58.398702 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:50:58.398768 kubelet[2704]: E0904 17:50:58.398717 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:50:58.398965 kubelet[2704]: E0904 17:50:58.398928 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:50:58.398965 kubelet[2704]: W0904 17:50:58.398944 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:50:58.399149 kubelet[2704]: E0904 17:50:58.399010 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 17:50:58.399715 kubelet[2704]: E0904 17:50:58.399696 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:50:58.399715 kubelet[2704]: W0904 17:50:58.399710 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:50:58.399771 kubelet[2704]: E0904 17:50:58.399726 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:50:58.399987 kubelet[2704]: E0904 17:50:58.399962 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:50:58.399987 kubelet[2704]: W0904 17:50:58.399977 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:50:58.400129 kubelet[2704]: E0904 17:50:58.400017 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:50:58.400338 kubelet[2704]: E0904 17:50:58.400307 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:50:58.400338 kubelet[2704]: W0904 17:50:58.400322 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:50:58.400439 kubelet[2704]: E0904 17:50:58.400401 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:50:58.400598 kubelet[2704]: E0904 17:50:58.400568 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:50:58.400598 kubelet[2704]: W0904 17:50:58.400589 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:50:58.400665 kubelet[2704]: E0904 17:50:58.400601 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:50:58.400836 kubelet[2704]: E0904 17:50:58.400816 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:50:58.400836 kubelet[2704]: W0904 17:50:58.400827 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:50:58.400836 kubelet[2704]: E0904 17:50:58.400841 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 17:50:58.401027 kubelet[2704]: E0904 17:50:58.401008 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:50:58.401027 kubelet[2704]: W0904 17:50:58.401019 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:50:58.401096 kubelet[2704]: E0904 17:50:58.401035 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:50:58.401279 kubelet[2704]: E0904 17:50:58.401258 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:50:58.401279 kubelet[2704]: W0904 17:50:58.401273 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:50:58.401349 kubelet[2704]: E0904 17:50:58.401293 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:50:58.401495 kubelet[2704]: E0904 17:50:58.401477 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:50:58.401495 kubelet[2704]: W0904 17:50:58.401490 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:50:58.401582 kubelet[2704]: E0904 17:50:58.401506 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:50:58.401739 kubelet[2704]: E0904 17:50:58.401719 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:50:58.401739 kubelet[2704]: W0904 17:50:58.401731 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:50:58.401811 kubelet[2704]: E0904 17:50:58.401744 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:50:58.402023 kubelet[2704]: E0904 17:50:58.402002 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:50:58.402023 kubelet[2704]: W0904 17:50:58.402015 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:50:58.402096 kubelet[2704]: E0904 17:50:58.402033 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 17:50:58.402263 kubelet[2704]: E0904 17:50:58.402246 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:50:58.402263 kubelet[2704]: W0904 17:50:58.402258 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:50:58.402364 kubelet[2704]: E0904 17:50:58.402274 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:50:58.402491 kubelet[2704]: E0904 17:50:58.402473 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:50:58.402491 kubelet[2704]: W0904 17:50:58.402485 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:50:58.402570 kubelet[2704]: E0904 17:50:58.402502 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:50:58.402710 kubelet[2704]: E0904 17:50:58.402688 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:50:58.402710 kubelet[2704]: W0904 17:50:58.402700 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:50:58.402710 kubelet[2704]: E0904 17:50:58.402709 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 4 17:50:58.402959 kubelet[2704]: E0904 17:50:58.402941 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 4 17:50:58.402959 kubelet[2704]: W0904 17:50:58.402954 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 4 17:50:58.403032 kubelet[2704]: E0904 17:50:58.402965 2704 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 4 17:50:59.077470 containerd[1564]: time="2025-09-04T17:50:59.077414098Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 17:50:59.078277 containerd[1564]: time="2025-09-04T17:50:59.078241111Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3: active requests=0, bytes read=4446660" Sep 4 17:50:59.079595 containerd[1564]: time="2025-09-04T17:50:59.079371960Z" level=info msg="ImageCreate event name:\"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 17:50:59.081344 containerd[1564]: time="2025-09-04T17:50:59.081305145Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 17:50:59.081726 containerd[1564]: time="2025-09-04T17:50:59.081694892Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" with image id \"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\", size \"5939323\" in 2.059416286s" Sep 4 17:50:59.081755 containerd[1564]: time="2025-09-04T17:50:59.081724529Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" returns image reference \"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\"" Sep 4 17:50:59.083609 containerd[1564]: time="2025-09-04T17:50:59.083582722Z" level=info msg="CreateContainer within sandbox \"a4c3203b0bc346b65c5172fe08bc6c39128a323fcad9c4db3d14666873eb4779\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Sep 4 17:50:59.092813 containerd[1564]: time="2025-09-04T17:50:59.092765587Z" level=info msg="Container d593553dcf14739d99770b2ce37d550097dd340aeebd1693735aafae04745d6b: CDI devices from CRI Config.CDIDevices: []" Sep 4 17:50:59.101224 containerd[1564]: time="2025-09-04T17:50:59.101164079Z" level=info msg="CreateContainer within sandbox \"a4c3203b0bc346b65c5172fe08bc6c39128a323fcad9c4db3d14666873eb4779\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"d593553dcf14739d99770b2ce37d550097dd340aeebd1693735aafae04745d6b\"" Sep 4 17:50:59.101802 containerd[1564]: time="2025-09-04T17:50:59.101768722Z" level=info msg="StartContainer for \"d593553dcf14739d99770b2ce37d550097dd340aeebd1693735aafae04745d6b\"" Sep 4 17:50:59.103326 containerd[1564]: time="2025-09-04T17:50:59.103295359Z" level=info msg="connecting to shim d593553dcf14739d99770b2ce37d550097dd340aeebd1693735aafae04745d6b" address="unix:///run/containerd/s/18ee2edc6e34367e550ed84cf11996513bb697fe0502ab5e43b2f609e904d663" protocol=ttrpc version=3 Sep 4 17:50:59.127246 systemd[1]: Started cri-containerd-d593553dcf14739d99770b2ce37d550097dd340aeebd1693735aafae04745d6b.scope - libcontainer container d593553dcf14739d99770b2ce37d550097dd340aeebd1693735aafae04745d6b. 
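
The FlexVolume noise repeated above comes from the kubelet's plugin prober: it runs /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds with the argument init and tries to parse the command's stdout as JSON; because that binary does not exist yet, stdout is empty and unmarshalling fails with "unexpected end of JSON input". The flexvol-driver container created just above (Calico's pod2daemon-flexvol image) is what normally installs that binary. As an illustration only, not Calico's actual driver, a minimal Go program of the shape the init call expects could look roughly like this (the capability value is an assumption):

// flexvol_init_sketch.go - illustrative sketch of a FlexVolume-style driver
// that answers the kubelet's "init" probe with a JSON status object, so the
// prober does not fail with "unexpected end of JSON input". Other commands
// are reported as not supported; a real driver would implement them.
package main

import (
	"encoding/json"
	"fmt"
	"os"
)

type driverStatus struct {
	Status       string          `json:"status"`
	Message      string          `json:"message,omitempty"`
	Capabilities map[string]bool `json:"capabilities,omitempty"`
}

func emit(s driverStatus) {
	b, _ := json.Marshal(s)
	fmt.Println(string(b))
}

func main() {
	if len(os.Args) < 2 {
		emit(driverStatus{Status: "Failure", Message: "no command given"})
		os.Exit(1)
	}
	switch os.Args[1] {
	case "init":
		// Assumption: attach/detach is not implemented by this sketch,
		// so the kubelet is told to skip those calls.
		emit(driverStatus{Status: "Success", Capabilities: map[string]bool{"attach": false}})
	default:
		emit(driverStatus{Status: "Not supported", Message: os.Args[1]})
	}
}

Once a binary of roughly this shape exists at the probed path, the prober's JSON unmarshalling succeeds and the repeated driver-call errors above would be expected to stop.
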
Sep 4 17:50:59.169512 containerd[1564]: time="2025-09-04T17:50:59.169449698Z" level=info msg="StartContainer for \"d593553dcf14739d99770b2ce37d550097dd340aeebd1693735aafae04745d6b\" returns successfully" Sep 4 17:50:59.178044 systemd[1]: cri-containerd-d593553dcf14739d99770b2ce37d550097dd340aeebd1693735aafae04745d6b.scope: Deactivated successfully. Sep 4 17:50:59.180802 containerd[1564]: time="2025-09-04T17:50:59.180766348Z" level=info msg="received exit event container_id:\"d593553dcf14739d99770b2ce37d550097dd340aeebd1693735aafae04745d6b\" id:\"d593553dcf14739d99770b2ce37d550097dd340aeebd1693735aafae04745d6b\" pid:3397 exited_at:{seconds:1757008259 nanos:180360661}" Sep 4 17:50:59.180876 containerd[1564]: time="2025-09-04T17:50:59.180805983Z" level=info msg="TaskExit event in podsandbox handler container_id:\"d593553dcf14739d99770b2ce37d550097dd340aeebd1693735aafae04745d6b\" id:\"d593553dcf14739d99770b2ce37d550097dd340aeebd1693735aafae04745d6b\" pid:3397 exited_at:{seconds:1757008259 nanos:180360661}" Sep 4 17:50:59.205440 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-d593553dcf14739d99770b2ce37d550097dd340aeebd1693735aafae04745d6b-rootfs.mount: Deactivated successfully. Sep 4 17:51:00.241203 kubelet[2704]: E0904 17:51:00.241138 2704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8mncz" podUID="81fd4b06-1bc3-423c-9205-8768e1e4b44a" Sep 4 17:51:00.311732 containerd[1564]: time="2025-09-04T17:51:00.311638916Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\"" Sep 4 17:51:02.241259 kubelet[2704]: E0904 17:51:02.241193 2704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8mncz" podUID="81fd4b06-1bc3-423c-9205-8768e1e4b44a" Sep 4 17:51:04.240785 kubelet[2704]: E0904 17:51:04.240713 2704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8mncz" podUID="81fd4b06-1bc3-423c-9205-8768e1e4b44a" Sep 4 17:51:05.418934 containerd[1564]: time="2025-09-04T17:51:05.418879148Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 17:51:05.419637 containerd[1564]: time="2025-09-04T17:51:05.419599446Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.3: active requests=0, bytes read=70440613" Sep 4 17:51:05.420720 containerd[1564]: time="2025-09-04T17:51:05.420657542Z" level=info msg="ImageCreate event name:\"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 17:51:05.422739 containerd[1564]: time="2025-09-04T17:51:05.422692219Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 17:51:05.423246 containerd[1564]: time="2025-09-04T17:51:05.423212070Z" level=info msg="Pulled image 
\"ghcr.io/flatcar/calico/cni:v3.30.3\" with image id \"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\", size \"71933316\" in 5.111529893s" Sep 4 17:51:05.423246 containerd[1564]: time="2025-09-04T17:51:05.423242107Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\" returns image reference \"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\"" Sep 4 17:51:05.425174 containerd[1564]: time="2025-09-04T17:51:05.425144826Z" level=info msg="CreateContainer within sandbox \"a4c3203b0bc346b65c5172fe08bc6c39128a323fcad9c4db3d14666873eb4779\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Sep 4 17:51:05.434313 containerd[1564]: time="2025-09-04T17:51:05.434265488Z" level=info msg="Container 73ec62b9443be991b271f9708d6bfef24b023fa37dd5a7b3672b2ab405a96f97: CDI devices from CRI Config.CDIDevices: []" Sep 4 17:51:05.444213 containerd[1564]: time="2025-09-04T17:51:05.444176030Z" level=info msg="CreateContainer within sandbox \"a4c3203b0bc346b65c5172fe08bc6c39128a323fcad9c4db3d14666873eb4779\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"73ec62b9443be991b271f9708d6bfef24b023fa37dd5a7b3672b2ab405a96f97\"" Sep 4 17:51:05.445236 containerd[1564]: time="2025-09-04T17:51:05.444692925Z" level=info msg="StartContainer for \"73ec62b9443be991b271f9708d6bfef24b023fa37dd5a7b3672b2ab405a96f97\"" Sep 4 17:51:05.445983 containerd[1564]: time="2025-09-04T17:51:05.445955215Z" level=info msg="connecting to shim 73ec62b9443be991b271f9708d6bfef24b023fa37dd5a7b3672b2ab405a96f97" address="unix:///run/containerd/s/18ee2edc6e34367e550ed84cf11996513bb697fe0502ab5e43b2f609e904d663" protocol=ttrpc version=3 Sep 4 17:51:05.474236 systemd[1]: Started cri-containerd-73ec62b9443be991b271f9708d6bfef24b023fa37dd5a7b3672b2ab405a96f97.scope - libcontainer container 73ec62b9443be991b271f9708d6bfef24b023fa37dd5a7b3672b2ab405a96f97. Sep 4 17:51:05.515228 containerd[1564]: time="2025-09-04T17:51:05.515187495Z" level=info msg="StartContainer for \"73ec62b9443be991b271f9708d6bfef24b023fa37dd5a7b3672b2ab405a96f97\" returns successfully" Sep 4 17:51:06.240805 kubelet[2704]: E0904 17:51:06.240720 2704 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8mncz" podUID="81fd4b06-1bc3-423c-9205-8768e1e4b44a" Sep 4 17:51:07.011788 containerd[1564]: time="2025-09-04T17:51:07.011710817Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Sep 4 17:51:07.014876 systemd[1]: cri-containerd-73ec62b9443be991b271f9708d6bfef24b023fa37dd5a7b3672b2ab405a96f97.scope: Deactivated successfully. Sep 4 17:51:07.015250 systemd[1]: cri-containerd-73ec62b9443be991b271f9708d6bfef24b023fa37dd5a7b3672b2ab405a96f97.scope: Consumed 631ms CPU time, 179.6M memory peak, 2.7M read from disk, 171.3M written to disk. 
Sep 4 17:51:07.016729 containerd[1564]: time="2025-09-04T17:51:07.016665165Z" level=info msg="received exit event container_id:\"73ec62b9443be991b271f9708d6bfef24b023fa37dd5a7b3672b2ab405a96f97\" id:\"73ec62b9443be991b271f9708d6bfef24b023fa37dd5a7b3672b2ab405a96f97\" pid:3458 exited_at:{seconds:1757008267 nanos:16400877}" Sep 4 17:51:07.016729 containerd[1564]: time="2025-09-04T17:51:07.016711623Z" level=info msg="TaskExit event in podsandbox handler container_id:\"73ec62b9443be991b271f9708d6bfef24b023fa37dd5a7b3672b2ab405a96f97\" id:\"73ec62b9443be991b271f9708d6bfef24b023fa37dd5a7b3672b2ab405a96f97\" pid:3458 exited_at:{seconds:1757008267 nanos:16400877}" Sep 4 17:51:07.038898 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-73ec62b9443be991b271f9708d6bfef24b023fa37dd5a7b3672b2ab405a96f97-rootfs.mount: Deactivated successfully. Sep 4 17:51:07.084612 kubelet[2704]: I0904 17:51:07.084562 2704 kubelet_node_status.go:488] "Fast updating node status as it just became ready" Sep 4 17:51:07.365999 systemd[1]: Created slice kubepods-burstable-poda9d9ac52_b01b_4920_9bfe_f3cca3aab3b1.slice - libcontainer container kubepods-burstable-poda9d9ac52_b01b_4920_9bfe_f3cca3aab3b1.slice. Sep 4 17:51:07.368129 kubelet[2704]: I0904 17:51:07.367395 2704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwpkd\" (UniqueName: \"kubernetes.io/projected/1cd8d6fb-6cf4-4a7d-af5a-3965e83c2a34-kube-api-access-dwpkd\") pod \"coredns-7c65d6cfc9-7bbsj\" (UID: \"1cd8d6fb-6cf4-4a7d-af5a-3965e83c2a34\") " pod="kube-system/coredns-7c65d6cfc9-7bbsj" Sep 4 17:51:07.368129 kubelet[2704]: I0904 17:51:07.367503 2704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/79f9dffd-1bb2-43ab-ba96-6a7b6baab522-goldmane-key-pair\") pod \"goldmane-7988f88666-7f6s9\" (UID: \"79f9dffd-1bb2-43ab-ba96-6a7b6baab522\") " pod="calico-system/goldmane-7988f88666-7f6s9" Sep 4 17:51:07.368129 kubelet[2704]: I0904 17:51:07.367529 2704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqfnk\" (UniqueName: \"kubernetes.io/projected/500d5572-b816-409e-8956-3b59c87319da-kube-api-access-hqfnk\") pod \"calico-apiserver-5bf46546c9-6pt6w\" (UID: \"500d5572-b816-409e-8956-3b59c87319da\") " pod="calico-apiserver/calico-apiserver-5bf46546c9-6pt6w" Sep 4 17:51:07.368129 kubelet[2704]: I0904 17:51:07.367739 2704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1cd8d6fb-6cf4-4a7d-af5a-3965e83c2a34-config-volume\") pod \"coredns-7c65d6cfc9-7bbsj\" (UID: \"1cd8d6fb-6cf4-4a7d-af5a-3965e83c2a34\") " pod="kube-system/coredns-7c65d6cfc9-7bbsj" Sep 4 17:51:07.368129 kubelet[2704]: I0904 17:51:07.367814 2704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a9d9ac52-b01b-4920-9bfe-f3cca3aab3b1-config-volume\") pod \"coredns-7c65d6cfc9-lxxpj\" (UID: \"a9d9ac52-b01b-4920-9bfe-f3cca3aab3b1\") " pod="kube-system/coredns-7c65d6cfc9-lxxpj" Sep 4 17:51:07.368570 kubelet[2704]: I0904 17:51:07.367839 2704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79f9dffd-1bb2-43ab-ba96-6a7b6baab522-config\") pod \"goldmane-7988f88666-7f6s9\" (UID: 
\"79f9dffd-1bb2-43ab-ba96-6a7b6baab522\") " pod="calico-system/goldmane-7988f88666-7f6s9" Sep 4 17:51:07.368570 kubelet[2704]: I0904 17:51:07.367885 2704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/b01e23fd-381d-49b0-b985-5edc30fb4402-calico-apiserver-certs\") pod \"calico-apiserver-5bf46546c9-zlqdb\" (UID: \"b01e23fd-381d-49b0-b985-5edc30fb4402\") " pod="calico-apiserver/calico-apiserver-5bf46546c9-zlqdb" Sep 4 17:51:07.368570 kubelet[2704]: I0904 17:51:07.368253 2704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/79f9dffd-1bb2-43ab-ba96-6a7b6baab522-goldmane-ca-bundle\") pod \"goldmane-7988f88666-7f6s9\" (UID: \"79f9dffd-1bb2-43ab-ba96-6a7b6baab522\") " pod="calico-system/goldmane-7988f88666-7f6s9" Sep 4 17:51:07.368570 kubelet[2704]: I0904 17:51:07.368321 2704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/76446953-bc18-4027-94fd-31764ac45951-tigera-ca-bundle\") pod \"calico-kube-controllers-5cfb9fdb56-hqdqm\" (UID: \"76446953-bc18-4027-94fd-31764ac45951\") " pod="calico-system/calico-kube-controllers-5cfb9fdb56-hqdqm" Sep 4 17:51:07.368570 kubelet[2704]: I0904 17:51:07.368355 2704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2bvq\" (UniqueName: \"kubernetes.io/projected/b01e23fd-381d-49b0-b985-5edc30fb4402-kube-api-access-t2bvq\") pod \"calico-apiserver-5bf46546c9-zlqdb\" (UID: \"b01e23fd-381d-49b0-b985-5edc30fb4402\") " pod="calico-apiserver/calico-apiserver-5bf46546c9-zlqdb" Sep 4 17:51:07.369515 kubelet[2704]: I0904 17:51:07.368401 2704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltd22\" (UniqueName: \"kubernetes.io/projected/a9d9ac52-b01b-4920-9bfe-f3cca3aab3b1-kube-api-access-ltd22\") pod \"coredns-7c65d6cfc9-lxxpj\" (UID: \"a9d9ac52-b01b-4920-9bfe-f3cca3aab3b1\") " pod="kube-system/coredns-7c65d6cfc9-lxxpj" Sep 4 17:51:07.369515 kubelet[2704]: I0904 17:51:07.368425 2704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvbtb\" (UniqueName: \"kubernetes.io/projected/76446953-bc18-4027-94fd-31764ac45951-kube-api-access-fvbtb\") pod \"calico-kube-controllers-5cfb9fdb56-hqdqm\" (UID: \"76446953-bc18-4027-94fd-31764ac45951\") " pod="calico-system/calico-kube-controllers-5cfb9fdb56-hqdqm" Sep 4 17:51:07.369515 kubelet[2704]: I0904 17:51:07.368599 2704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/6de06474-82ff-4369-8fcf-667fedc2e13c-whisker-backend-key-pair\") pod \"whisker-6b7d7c8c45-6wvjh\" (UID: \"6de06474-82ff-4369-8fcf-667fedc2e13c\") " pod="calico-system/whisker-6b7d7c8c45-6wvjh" Sep 4 17:51:07.369515 kubelet[2704]: I0904 17:51:07.368631 2704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9p597\" (UniqueName: \"kubernetes.io/projected/6de06474-82ff-4369-8fcf-667fedc2e13c-kube-api-access-9p597\") pod \"whisker-6b7d7c8c45-6wvjh\" (UID: \"6de06474-82ff-4369-8fcf-667fedc2e13c\") " pod="calico-system/whisker-6b7d7c8c45-6wvjh" Sep 4 17:51:07.369515 kubelet[2704]: I0904 17:51:07.368674 
2704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xw2k\" (UniqueName: \"kubernetes.io/projected/79f9dffd-1bb2-43ab-ba96-6a7b6baab522-kube-api-access-9xw2k\") pod \"goldmane-7988f88666-7f6s9\" (UID: \"79f9dffd-1bb2-43ab-ba96-6a7b6baab522\") " pod="calico-system/goldmane-7988f88666-7f6s9" Sep 4 17:51:07.369634 kubelet[2704]: I0904 17:51:07.368697 2704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/500d5572-b816-409e-8956-3b59c87319da-calico-apiserver-certs\") pod \"calico-apiserver-5bf46546c9-6pt6w\" (UID: \"500d5572-b816-409e-8956-3b59c87319da\") " pod="calico-apiserver/calico-apiserver-5bf46546c9-6pt6w" Sep 4 17:51:07.369634 kubelet[2704]: I0904 17:51:07.368742 2704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6de06474-82ff-4369-8fcf-667fedc2e13c-whisker-ca-bundle\") pod \"whisker-6b7d7c8c45-6wvjh\" (UID: \"6de06474-82ff-4369-8fcf-667fedc2e13c\") " pod="calico-system/whisker-6b7d7c8c45-6wvjh" Sep 4 17:51:07.381658 systemd[1]: Created slice kubepods-besteffort-pod76446953_bc18_4027_94fd_31764ac45951.slice - libcontainer container kubepods-besteffort-pod76446953_bc18_4027_94fd_31764ac45951.slice. Sep 4 17:51:07.390175 systemd[1]: Created slice kubepods-besteffort-pod500d5572_b816_409e_8956_3b59c87319da.slice - libcontainer container kubepods-besteffort-pod500d5572_b816_409e_8956_3b59c87319da.slice. Sep 4 17:51:07.396869 systemd[1]: Created slice kubepods-besteffort-pod79f9dffd_1bb2_43ab_ba96_6a7b6baab522.slice - libcontainer container kubepods-besteffort-pod79f9dffd_1bb2_43ab_ba96_6a7b6baab522.slice. Sep 4 17:51:07.404353 systemd[1]: Created slice kubepods-besteffort-podb01e23fd_381d_49b0_b985_5edc30fb4402.slice - libcontainer container kubepods-besteffort-podb01e23fd_381d_49b0_b985_5edc30fb4402.slice. Sep 4 17:51:07.411624 systemd[1]: Created slice kubepods-besteffort-pod6de06474_82ff_4369_8fcf_667fedc2e13c.slice - libcontainer container kubepods-besteffort-pod6de06474_82ff_4369_8fcf_667fedc2e13c.slice. Sep 4 17:51:07.417572 systemd[1]: Created slice kubepods-burstable-pod1cd8d6fb_6cf4_4a7d_af5a_3965e83c2a34.slice - libcontainer container kubepods-burstable-pod1cd8d6fb_6cf4_4a7d_af5a_3965e83c2a34.slice. 
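
The systemd "Created slice" entries above show the naming scheme the kubelet's systemd cgroup driver uses in this log: kubepods-<qos>-pod<uid>.slice, with the pod UID's dashes turned into underscores. A tiny helper reproducing that naming, derived only from the lines above (not from kubelet source), for the pods listed here:

// pod_slice_name.go - illustrative: rebuilds the slice names seen in the log
// from a QoS class and a pod UID.
package main

import (
	"fmt"
	"strings"
)

// podSliceName builds names like
// kubepods-besteffort-pod76446953_bc18_4027_94fd_31764ac45951.slice
func podSliceName(qosClass, podUID string) string {
	return fmt.Sprintf("kubepods-%s-pod%s.slice", qosClass, strings.ReplaceAll(podUID, "-", "_"))
}

func main() {
	fmt.Println(podSliceName("besteffort", "76446953-bc18-4027-94fd-31764ac45951")) // calico-kube-controllers pod
	fmt.Println(podSliceName("burstable", "a9d9ac52-b01b-4920-9bfe-f3cca3aab3b1"))  // coredns-7c65d6cfc9-lxxpj pod
}
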
Sep 4 17:51:07.674225 containerd[1564]: time="2025-09-04T17:51:07.674088214Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-lxxpj,Uid:a9d9ac52-b01b-4920-9bfe-f3cca3aab3b1,Namespace:kube-system,Attempt:0,}" Sep 4 17:51:07.685930 containerd[1564]: time="2025-09-04T17:51:07.685896211Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5cfb9fdb56-hqdqm,Uid:76446953-bc18-4027-94fd-31764ac45951,Namespace:calico-system,Attempt:0,}" Sep 4 17:51:07.693906 containerd[1564]: time="2025-09-04T17:51:07.693855994Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5bf46546c9-6pt6w,Uid:500d5572-b816-409e-8956-3b59c87319da,Namespace:calico-apiserver,Attempt:0,}" Sep 4 17:51:07.701647 containerd[1564]: time="2025-09-04T17:51:07.701567399Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-7f6s9,Uid:79f9dffd-1bb2-43ab-ba96-6a7b6baab522,Namespace:calico-system,Attempt:0,}" Sep 4 17:51:07.709723 containerd[1564]: time="2025-09-04T17:51:07.709689608Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5bf46546c9-zlqdb,Uid:b01e23fd-381d-49b0-b985-5edc30fb4402,Namespace:calico-apiserver,Attempt:0,}" Sep 4 17:51:07.717955 containerd[1564]: time="2025-09-04T17:51:07.717920752Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-6b7d7c8c45-6wvjh,Uid:6de06474-82ff-4369-8fcf-667fedc2e13c,Namespace:calico-system,Attempt:0,}" Sep 4 17:51:07.722591 containerd[1564]: time="2025-09-04T17:51:07.722294305Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-7bbsj,Uid:1cd8d6fb-6cf4-4a7d-af5a-3965e83c2a34,Namespace:kube-system,Attempt:0,}" Sep 4 17:51:07.783384 containerd[1564]: time="2025-09-04T17:51:07.783326922Z" level=error msg="Failed to destroy network for sandbox \"041088241395c66260d0e1b33b60e7f05c07ab0c28abc8007ba31afa023a1649\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 17:51:07.789517 containerd[1564]: time="2025-09-04T17:51:07.789272890Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-lxxpj,Uid:a9d9ac52-b01b-4920-9bfe-f3cca3aab3b1,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"041088241395c66260d0e1b33b60e7f05c07ab0c28abc8007ba31afa023a1649\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 17:51:07.790217 kubelet[2704]: E0904 17:51:07.790152 2704 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"041088241395c66260d0e1b33b60e7f05c07ab0c28abc8007ba31afa023a1649\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 17:51:07.790347 kubelet[2704]: E0904 17:51:07.790322 2704 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"041088241395c66260d0e1b33b60e7f05c07ab0c28abc8007ba31afa023a1649\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-lxxpj" Sep 4 17:51:07.790422 kubelet[2704]: E0904 17:51:07.790369 2704 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"041088241395c66260d0e1b33b60e7f05c07ab0c28abc8007ba31afa023a1649\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-lxxpj" Sep 4 17:51:07.790475 kubelet[2704]: E0904 17:51:07.790419 2704 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7c65d6cfc9-lxxpj_kube-system(a9d9ac52-b01b-4920-9bfe-f3cca3aab3b1)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7c65d6cfc9-lxxpj_kube-system(a9d9ac52-b01b-4920-9bfe-f3cca3aab3b1)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"041088241395c66260d0e1b33b60e7f05c07ab0c28abc8007ba31afa023a1649\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7c65d6cfc9-lxxpj" podUID="a9d9ac52-b01b-4920-9bfe-f3cca3aab3b1" Sep 4 17:51:07.800373 containerd[1564]: time="2025-09-04T17:51:07.800244600Z" level=error msg="Failed to destroy network for sandbox \"20bdbfb77c47048767d2fe14a99475e935c598fd8d3343003cf05f3339930ef6\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 17:51:07.805943 containerd[1564]: time="2025-09-04T17:51:07.805805682Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5cfb9fdb56-hqdqm,Uid:76446953-bc18-4027-94fd-31764ac45951,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"20bdbfb77c47048767d2fe14a99475e935c598fd8d3343003cf05f3339930ef6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 17:51:07.806166 kubelet[2704]: E0904 17:51:07.806051 2704 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"20bdbfb77c47048767d2fe14a99475e935c598fd8d3343003cf05f3339930ef6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 17:51:07.806166 kubelet[2704]: E0904 17:51:07.806135 2704 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"20bdbfb77c47048767d2fe14a99475e935c598fd8d3343003cf05f3339930ef6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-5cfb9fdb56-hqdqm" Sep 4 17:51:07.806166 kubelet[2704]: E0904 17:51:07.806155 2704 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"20bdbfb77c47048767d2fe14a99475e935c598fd8d3343003cf05f3339930ef6\": 
plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-5cfb9fdb56-hqdqm" Sep 4 17:51:07.806266 kubelet[2704]: E0904 17:51:07.806213 2704 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-5cfb9fdb56-hqdqm_calico-system(76446953-bc18-4027-94fd-31764ac45951)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-5cfb9fdb56-hqdqm_calico-system(76446953-bc18-4027-94fd-31764ac45951)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"20bdbfb77c47048767d2fe14a99475e935c598fd8d3343003cf05f3339930ef6\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-5cfb9fdb56-hqdqm" podUID="76446953-bc18-4027-94fd-31764ac45951" Sep 4 17:51:07.808940 containerd[1564]: time="2025-09-04T17:51:07.808877401Z" level=error msg="Failed to destroy network for sandbox \"486576f15febf10ddcfa6efa08c5288f054aeb534f021af7f119a474b082383c\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 17:51:07.810517 containerd[1564]: time="2025-09-04T17:51:07.810481535Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5bf46546c9-6pt6w,Uid:500d5572-b816-409e-8956-3b59c87319da,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"486576f15febf10ddcfa6efa08c5288f054aeb534f021af7f119a474b082383c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 17:51:07.810758 kubelet[2704]: E0904 17:51:07.810720 2704 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"486576f15febf10ddcfa6efa08c5288f054aeb534f021af7f119a474b082383c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 17:51:07.810828 kubelet[2704]: E0904 17:51:07.810768 2704 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"486576f15febf10ddcfa6efa08c5288f054aeb534f021af7f119a474b082383c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5bf46546c9-6pt6w" Sep 4 17:51:07.810828 kubelet[2704]: E0904 17:51:07.810784 2704 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"486576f15febf10ddcfa6efa08c5288f054aeb534f021af7f119a474b082383c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5bf46546c9-6pt6w" Sep 4 17:51:07.810828 kubelet[2704]: 
E0904 17:51:07.810818 2704 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-5bf46546c9-6pt6w_calico-apiserver(500d5572-b816-409e-8956-3b59c87319da)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-5bf46546c9-6pt6w_calico-apiserver(500d5572-b816-409e-8956-3b59c87319da)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"486576f15febf10ddcfa6efa08c5288f054aeb534f021af7f119a474b082383c\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-5bf46546c9-6pt6w" podUID="500d5572-b816-409e-8956-3b59c87319da" Sep 4 17:51:07.815731 containerd[1564]: time="2025-09-04T17:51:07.815616964Z" level=error msg="Failed to destroy network for sandbox \"c5b22ce40a92df0ec7a2980d09f014d61d2fa1b05899d45ba110e2436fa7d0a9\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 17:51:07.816797 containerd[1564]: time="2025-09-04T17:51:07.816773263Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5bf46546c9-zlqdb,Uid:b01e23fd-381d-49b0-b985-5edc30fb4402,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"c5b22ce40a92df0ec7a2980d09f014d61d2fa1b05899d45ba110e2436fa7d0a9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 17:51:07.817140 kubelet[2704]: E0904 17:51:07.817093 2704 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c5b22ce40a92df0ec7a2980d09f014d61d2fa1b05899d45ba110e2436fa7d0a9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 17:51:07.817274 kubelet[2704]: E0904 17:51:07.817256 2704 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c5b22ce40a92df0ec7a2980d09f014d61d2fa1b05899d45ba110e2436fa7d0a9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5bf46546c9-zlqdb" Sep 4 17:51:07.817344 kubelet[2704]: E0904 17:51:07.817330 2704 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c5b22ce40a92df0ec7a2980d09f014d61d2fa1b05899d45ba110e2436fa7d0a9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5bf46546c9-zlqdb" Sep 4 17:51:07.817461 kubelet[2704]: E0904 17:51:07.817428 2704 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-5bf46546c9-zlqdb_calico-apiserver(b01e23fd-381d-49b0-b985-5edc30fb4402)\" with CreatePodSandboxError: \"Failed to create sandbox for pod 
\\\"calico-apiserver-5bf46546c9-zlqdb_calico-apiserver(b01e23fd-381d-49b0-b985-5edc30fb4402)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"c5b22ce40a92df0ec7a2980d09f014d61d2fa1b05899d45ba110e2436fa7d0a9\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-5bf46546c9-zlqdb" podUID="b01e23fd-381d-49b0-b985-5edc30fb4402" Sep 4 17:51:07.822832 containerd[1564]: time="2025-09-04T17:51:07.822779634Z" level=error msg="Failed to destroy network for sandbox \"399833fa2b7d1bbe27f35fac20bbd4f90c336a0ccc54d8b52ee6a4bfc32253a4\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 17:51:07.823219 containerd[1564]: time="2025-09-04T17:51:07.823178717Z" level=error msg="Failed to destroy network for sandbox \"55f15eb5be7432004611fe8cd204e77c3ce0b4476f314f9bb9815e8fa35f7355\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 17:51:07.824944 containerd[1564]: time="2025-09-04T17:51:07.824881566Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-7bbsj,Uid:1cd8d6fb-6cf4-4a7d-af5a-3965e83c2a34,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"399833fa2b7d1bbe27f35fac20bbd4f90c336a0ccc54d8b52ee6a4bfc32253a4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 17:51:07.825341 kubelet[2704]: E0904 17:51:07.825296 2704 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"399833fa2b7d1bbe27f35fac20bbd4f90c336a0ccc54d8b52ee6a4bfc32253a4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 17:51:07.825449 kubelet[2704]: E0904 17:51:07.825359 2704 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"399833fa2b7d1bbe27f35fac20bbd4f90c336a0ccc54d8b52ee6a4bfc32253a4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-7bbsj" Sep 4 17:51:07.825449 kubelet[2704]: E0904 17:51:07.825380 2704 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"399833fa2b7d1bbe27f35fac20bbd4f90c336a0ccc54d8b52ee6a4bfc32253a4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-7bbsj" Sep 4 17:51:07.825449 kubelet[2704]: E0904 17:51:07.825420 2704 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7c65d6cfc9-7bbsj_kube-system(1cd8d6fb-6cf4-4a7d-af5a-3965e83c2a34)\" with 
CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7c65d6cfc9-7bbsj_kube-system(1cd8d6fb-6cf4-4a7d-af5a-3965e83c2a34)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"399833fa2b7d1bbe27f35fac20bbd4f90c336a0ccc54d8b52ee6a4bfc32253a4\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7c65d6cfc9-7bbsj" podUID="1cd8d6fb-6cf4-4a7d-af5a-3965e83c2a34" Sep 4 17:51:07.825979 containerd[1564]: time="2025-09-04T17:51:07.825924132Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-7f6s9,Uid:79f9dffd-1bb2-43ab-ba96-6a7b6baab522,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"55f15eb5be7432004611fe8cd204e77c3ce0b4476f314f9bb9815e8fa35f7355\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 17:51:07.826837 kubelet[2704]: E0904 17:51:07.826086 2704 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"55f15eb5be7432004611fe8cd204e77c3ce0b4476f314f9bb9815e8fa35f7355\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 17:51:07.826837 kubelet[2704]: E0904 17:51:07.826174 2704 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"55f15eb5be7432004611fe8cd204e77c3ce0b4476f314f9bb9815e8fa35f7355\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-7988f88666-7f6s9" Sep 4 17:51:07.826837 kubelet[2704]: E0904 17:51:07.826192 2704 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"55f15eb5be7432004611fe8cd204e77c3ce0b4476f314f9bb9815e8fa35f7355\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-7988f88666-7f6s9" Sep 4 17:51:07.826967 kubelet[2704]: E0904 17:51:07.826227 2704 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-7988f88666-7f6s9_calico-system(79f9dffd-1bb2-43ab-ba96-6a7b6baab522)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-7988f88666-7f6s9_calico-system(79f9dffd-1bb2-43ab-ba96-6a7b6baab522)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"55f15eb5be7432004611fe8cd204e77c3ce0b4476f314f9bb9815e8fa35f7355\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-7988f88666-7f6s9" podUID="79f9dffd-1bb2-43ab-ba96-6a7b6baab522" Sep 4 17:51:07.830992 containerd[1564]: time="2025-09-04T17:51:07.830950065Z" level=error msg="Failed to destroy network for sandbox 
\"e973b56c245891cbe4de02ac4e39c0ac4d49d1fc8dcf0724c3ee9a3ef13c67ca\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 17:51:07.832023 containerd[1564]: time="2025-09-04T17:51:07.831982250Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-6b7d7c8c45-6wvjh,Uid:6de06474-82ff-4369-8fcf-667fedc2e13c,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"e973b56c245891cbe4de02ac4e39c0ac4d49d1fc8dcf0724c3ee9a3ef13c67ca\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 17:51:07.832198 kubelet[2704]: E0904 17:51:07.832163 2704 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e973b56c245891cbe4de02ac4e39c0ac4d49d1fc8dcf0724c3ee9a3ef13c67ca\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 17:51:07.832250 kubelet[2704]: E0904 17:51:07.832218 2704 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e973b56c245891cbe4de02ac4e39c0ac4d49d1fc8dcf0724c3ee9a3ef13c67ca\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-6b7d7c8c45-6wvjh" Sep 4 17:51:07.832250 kubelet[2704]: E0904 17:51:07.832237 2704 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e973b56c245891cbe4de02ac4e39c0ac4d49d1fc8dcf0724c3ee9a3ef13c67ca\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-6b7d7c8c45-6wvjh" Sep 4 17:51:07.832301 kubelet[2704]: E0904 17:51:07.832285 2704 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-6b7d7c8c45-6wvjh_calico-system(6de06474-82ff-4369-8fcf-667fedc2e13c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-6b7d7c8c45-6wvjh_calico-system(6de06474-82ff-4369-8fcf-667fedc2e13c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"e973b56c245891cbe4de02ac4e39c0ac4d49d1fc8dcf0724c3ee9a3ef13c67ca\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-6b7d7c8c45-6wvjh" podUID="6de06474-82ff-4369-8fcf-667fedc2e13c" Sep 4 17:51:08.245964 systemd[1]: Created slice kubepods-besteffort-pod81fd4b06_1bc3_423c_9205_8768e1e4b44a.slice - libcontainer container kubepods-besteffort-pod81fd4b06_1bc3_423c_9205_8768e1e4b44a.slice. 
Sep 4 17:51:08.248395 containerd[1564]: time="2025-09-04T17:51:08.248352031Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-8mncz,Uid:81fd4b06-1bc3-423c-9205-8768e1e4b44a,Namespace:calico-system,Attempt:0,}" Sep 4 17:51:08.296675 containerd[1564]: time="2025-09-04T17:51:08.296606315Z" level=error msg="Failed to destroy network for sandbox \"30bac41fc89c0ffdc4abb84bc57ae1fb087d7ae197cbe2dd5594bf3eca663780\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 17:51:08.298055 containerd[1564]: time="2025-09-04T17:51:08.298024247Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-8mncz,Uid:81fd4b06-1bc3-423c-9205-8768e1e4b44a,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"30bac41fc89c0ffdc4abb84bc57ae1fb087d7ae197cbe2dd5594bf3eca663780\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 17:51:08.298270 kubelet[2704]: E0904 17:51:08.298236 2704 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"30bac41fc89c0ffdc4abb84bc57ae1fb087d7ae197cbe2dd5594bf3eca663780\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 4 17:51:08.298331 kubelet[2704]: E0904 17:51:08.298296 2704 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"30bac41fc89c0ffdc4abb84bc57ae1fb087d7ae197cbe2dd5594bf3eca663780\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-8mncz" Sep 4 17:51:08.298331 kubelet[2704]: E0904 17:51:08.298316 2704 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"30bac41fc89c0ffdc4abb84bc57ae1fb087d7ae197cbe2dd5594bf3eca663780\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-8mncz" Sep 4 17:51:08.298478 kubelet[2704]: E0904 17:51:08.298365 2704 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-8mncz_calico-system(81fd4b06-1bc3-423c-9205-8768e1e4b44a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-8mncz_calico-system(81fd4b06-1bc3-423c-9205-8768e1e4b44a)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"30bac41fc89c0ffdc4abb84bc57ae1fb087d7ae197cbe2dd5594bf3eca663780\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8mncz" podUID="81fd4b06-1bc3-423c-9205-8768e1e4b44a" Sep 4 17:51:08.298874 systemd[1]: run-netns-cni\x2d1c070949\x2d8d95\x2d8dcf\x2d88d9\x2d1004bea02b8f.mount: Deactivated successfully. 
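
Every sandbox failure above reports the same root cause: the Calico CNI plugin stats /var/lib/calico/nodename and the file does not exist yet, because the calico/node container has not started and mounted /var/lib/calico/ (exactly what the error text suggests checking). A minimal sketch of that readiness check, assuming only the Go standard library and the path quoted in the errors:

    package main

    import (
        "fmt"
        "os"
    )

    // The CNI plugin refuses to add or delete a pod network until calico-node
    // has written its node name to this host-mounted file.
    const nodenameFile = "/var/lib/calico/nodename"

    func main() {
        if _, err := os.Stat(nodenameFile); err != nil {
            // This is the condition behind every "failed (add)" / "failed (delete)"
            // message above: stat returns ENOENT until calico-node is running.
            fmt.Printf("calico not ready: %v\n", err)
            return
        }
        name, err := os.ReadFile(nodenameFile)
        if err != nil {
            fmt.Printf("could not read nodename: %v\n", err)
            return
        }
        fmt.Printf("calico node name: %s\n", name)
    }

Once calico-node starts (see the image pull and StartContainer entries that follow), the file appears and subsequent sandbox creations succeed.
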
Sep 4 17:51:08.333613 containerd[1564]: time="2025-09-04T17:51:08.333517186Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\"" Sep 4 17:51:09.808780 kubelet[2704]: I0904 17:51:09.808715 2704 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 4 17:51:15.190910 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount79254564.mount: Deactivated successfully. Sep 4 17:51:16.420133 containerd[1564]: time="2025-09-04T17:51:16.420052013Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 17:51:16.420936 containerd[1564]: time="2025-09-04T17:51:16.420899908Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.3: active requests=0, bytes read=157078339" Sep 4 17:51:16.422361 containerd[1564]: time="2025-09-04T17:51:16.422315640Z" level=info msg="ImageCreate event name:\"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 17:51:16.439685 containerd[1564]: time="2025-09-04T17:51:16.439622161Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 17:51:16.440344 containerd[1564]: time="2025-09-04T17:51:16.440296559Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.3\" with image id \"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\", size \"157078201\" in 8.106735861s" Sep 4 17:51:16.440344 containerd[1564]: time="2025-09-04T17:51:16.440329782Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\" returns image reference \"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\"" Sep 4 17:51:16.449878 containerd[1564]: time="2025-09-04T17:51:16.449821073Z" level=info msg="CreateContainer within sandbox \"a4c3203b0bc346b65c5172fe08bc6c39128a323fcad9c4db3d14666873eb4779\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Sep 4 17:51:16.461492 containerd[1564]: time="2025-09-04T17:51:16.461433024Z" level=info msg="Container 185689874e428c9987e558bee86d514a724b8f5f5f4674f5cc23d74e0fa1b3bf: CDI devices from CRI Config.CDIDevices: []" Sep 4 17:51:16.472176 containerd[1564]: time="2025-09-04T17:51:16.472143097Z" level=info msg="CreateContainer within sandbox \"a4c3203b0bc346b65c5172fe08bc6c39128a323fcad9c4db3d14666873eb4779\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"185689874e428c9987e558bee86d514a724b8f5f5f4674f5cc23d74e0fa1b3bf\"" Sep 4 17:51:16.472876 containerd[1564]: time="2025-09-04T17:51:16.472718850Z" level=info msg="StartContainer for \"185689874e428c9987e558bee86d514a724b8f5f5f4674f5cc23d74e0fa1b3bf\"" Sep 4 17:51:16.474148 containerd[1564]: time="2025-09-04T17:51:16.474064741Z" level=info msg="connecting to shim 185689874e428c9987e558bee86d514a724b8f5f5f4674f5cc23d74e0fa1b3bf" address="unix:///run/containerd/s/18ee2edc6e34367e550ed84cf11996513bb697fe0502ab5e43b2f609e904d663" protocol=ttrpc version=3 Sep 4 17:51:16.493269 systemd[1]: Started cri-containerd-185689874e428c9987e558bee86d514a724b8f5f5f4674f5cc23d74e0fa1b3bf.scope - libcontainer container 185689874e428c9987e558bee86d514a724b8f5f5f4674f5cc23d74e0fa1b3bf. 
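
The PullImage / CreateContainer / StartContainer entries above are the kubelet driving containerd's CRI service over the socket shown in the "connecting to shim" line. The same pull can be reproduced directly against containerd; this is only an illustrative sketch, assuming the containerd Go client and the "k8s.io" namespace that the CRI plugin uses:

    package main

    import (
        "context"
        "fmt"
        "log"

        "github.com/containerd/containerd"
        "github.com/containerd/containerd/namespaces"
    )

    func main() {
        // Connect to the same containerd instance the kubelet is using.
        client, err := containerd.New("/run/containerd/containerd.sock")
        if err != nil {
            log.Fatal(err)
        }
        defer client.Close()

        // CRI-managed images live in the "k8s.io" namespace.
        ctx := namespaces.WithNamespace(context.Background(), "k8s.io")

        // Pull the calico-node image referenced in the log above.
        image, err := client.Pull(ctx, "ghcr.io/flatcar/calico/node:v3.30.3",
            containerd.WithPullUnpack)
        if err != nil {
            log.Fatal(err)
        }
        fmt.Println("pulled:", image.Name())
    }
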
Sep 4 17:51:16.735134 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Sep 4 17:51:16.735321 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. Sep 4 17:51:16.811000 containerd[1564]: time="2025-09-04T17:51:16.810939903Z" level=info msg="StartContainer for \"185689874e428c9987e558bee86d514a724b8f5f5f4674f5cc23d74e0fa1b3bf\" returns successfully" Sep 4 17:51:17.129591 kubelet[2704]: I0904 17:51:17.129532 2704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6de06474-82ff-4369-8fcf-667fedc2e13c-whisker-ca-bundle\") pod \"6de06474-82ff-4369-8fcf-667fedc2e13c\" (UID: \"6de06474-82ff-4369-8fcf-667fedc2e13c\") " Sep 4 17:51:17.131255 kubelet[2704]: I0904 17:51:17.130054 2704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9p597\" (UniqueName: \"kubernetes.io/projected/6de06474-82ff-4369-8fcf-667fedc2e13c-kube-api-access-9p597\") pod \"6de06474-82ff-4369-8fcf-667fedc2e13c\" (UID: \"6de06474-82ff-4369-8fcf-667fedc2e13c\") " Sep 4 17:51:17.131255 kubelet[2704]: I0904 17:51:17.130081 2704 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/6de06474-82ff-4369-8fcf-667fedc2e13c-whisker-backend-key-pair\") pod \"6de06474-82ff-4369-8fcf-667fedc2e13c\" (UID: \"6de06474-82ff-4369-8fcf-667fedc2e13c\") " Sep 4 17:51:17.131255 kubelet[2704]: I0904 17:51:17.130011 2704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6de06474-82ff-4369-8fcf-667fedc2e13c-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "6de06474-82ff-4369-8fcf-667fedc2e13c" (UID: "6de06474-82ff-4369-8fcf-667fedc2e13c"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 4 17:51:17.135500 kubelet[2704]: I0904 17:51:17.135438 2704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6de06474-82ff-4369-8fcf-667fedc2e13c-kube-api-access-9p597" (OuterVolumeSpecName: "kube-api-access-9p597") pod "6de06474-82ff-4369-8fcf-667fedc2e13c" (UID: "6de06474-82ff-4369-8fcf-667fedc2e13c"). InnerVolumeSpecName "kube-api-access-9p597". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 4 17:51:17.138211 kubelet[2704]: I0904 17:51:17.138168 2704 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6de06474-82ff-4369-8fcf-667fedc2e13c-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "6de06474-82ff-4369-8fcf-667fedc2e13c" (UID: "6de06474-82ff-4369-8fcf-667fedc2e13c"). InnerVolumeSpecName "whisker-backend-key-pair". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 4 17:51:17.230649 kubelet[2704]: I0904 17:51:17.230592 2704 reconciler_common.go:293] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/6de06474-82ff-4369-8fcf-667fedc2e13c-whisker-backend-key-pair\") on node \"localhost\" DevicePath \"\"" Sep 4 17:51:17.230649 kubelet[2704]: I0904 17:51:17.230627 2704 reconciler_common.go:293] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6de06474-82ff-4369-8fcf-667fedc2e13c-whisker-ca-bundle\") on node \"localhost\" DevicePath \"\"" Sep 4 17:51:17.230649 kubelet[2704]: I0904 17:51:17.230636 2704 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9p597\" (UniqueName: \"kubernetes.io/projected/6de06474-82ff-4369-8fcf-667fedc2e13c-kube-api-access-9p597\") on node \"localhost\" DevicePath \"\"" Sep 4 17:51:17.255978 systemd[1]: Removed slice kubepods-besteffort-pod6de06474_82ff_4369_8fcf_667fedc2e13c.slice - libcontainer container kubepods-besteffort-pod6de06474_82ff_4369_8fcf_667fedc2e13c.slice. Sep 4 17:51:17.380850 kubelet[2704]: I0904 17:51:17.380655 2704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-cxlkt" podStartSLOduration=1.7496230179999999 podStartE2EDuration="23.380628716s" podCreationTimestamp="2025-09-04 17:50:54 +0000 UTC" firstStartedPulling="2025-09-04 17:50:54.809969889 +0000 UTC m=+17.667133746" lastFinishedPulling="2025-09-04 17:51:16.440975587 +0000 UTC m=+39.298139444" observedRunningTime="2025-09-04 17:51:17.380531542 +0000 UTC m=+40.237695399" watchObservedRunningTime="2025-09-04 17:51:17.380628716 +0000 UTC m=+40.237792573" Sep 4 17:51:17.432695 kubelet[2704]: I0904 17:51:17.432551 2704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/b05f926c-4f67-49c6-b5a1-94e1657f90fa-whisker-backend-key-pair\") pod \"whisker-54f456cb4-6p4xx\" (UID: \"b05f926c-4f67-49c6-b5a1-94e1657f90fa\") " pod="calico-system/whisker-54f456cb4-6p4xx" Sep 4 17:51:17.434127 kubelet[2704]: I0904 17:51:17.434025 2704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2vxt\" (UniqueName: \"kubernetes.io/projected/b05f926c-4f67-49c6-b5a1-94e1657f90fa-kube-api-access-n2vxt\") pod \"whisker-54f456cb4-6p4xx\" (UID: \"b05f926c-4f67-49c6-b5a1-94e1657f90fa\") " pod="calico-system/whisker-54f456cb4-6p4xx" Sep 4 17:51:17.434127 kubelet[2704]: I0904 17:51:17.434076 2704 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b05f926c-4f67-49c6-b5a1-94e1657f90fa-whisker-ca-bundle\") pod \"whisker-54f456cb4-6p4xx\" (UID: \"b05f926c-4f67-49c6-b5a1-94e1657f90fa\") " pod="calico-system/whisker-54f456cb4-6p4xx" Sep 4 17:51:17.440423 systemd[1]: Created slice kubepods-besteffort-podb05f926c_4f67_49c6_b5a1_94e1657f90fa.slice - libcontainer container kubepods-besteffort-podb05f926c_4f67_49c6_b5a1_94e1657f90fa.slice. Sep 4 17:51:17.446707 systemd[1]: var-lib-kubelet-pods-6de06474\x2d82ff\x2d4369\x2d8fcf\x2d667fedc2e13c-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2d9p597.mount: Deactivated successfully. 
Sep 4 17:51:17.447234 systemd[1]: var-lib-kubelet-pods-6de06474\x2d82ff\x2d4369\x2d8fcf\x2d667fedc2e13c-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Sep 4 17:51:17.530649 containerd[1564]: time="2025-09-04T17:51:17.530597041Z" level=info msg="TaskExit event in podsandbox handler container_id:\"185689874e428c9987e558bee86d514a724b8f5f5f4674f5cc23d74e0fa1b3bf\" id:\"ee49e8472f7a317d92c645518716e7b16fd5868a04620b2ed063fe812eb38b28\" pid:3850 exit_status:1 exited_at:{seconds:1757008277 nanos:530253636}" Sep 4 17:51:17.747478 containerd[1564]: time="2025-09-04T17:51:17.747326361Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-54f456cb4-6p4xx,Uid:b05f926c-4f67-49c6-b5a1-94e1657f90fa,Namespace:calico-system,Attempt:0,}" Sep 4 17:51:17.904547 systemd-networkd[1492]: calid46347b0270: Link UP Sep 4 17:51:17.904904 systemd-networkd[1492]: calid46347b0270: Gained carrier Sep 4 17:51:17.918208 containerd[1564]: 2025-09-04 17:51:17.775 [INFO][3866] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 4 17:51:17.918208 containerd[1564]: 2025-09-04 17:51:17.795 [INFO][3866] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-whisker--54f456cb4--6p4xx-eth0 whisker-54f456cb4- calico-system b05f926c-4f67-49c6-b5a1-94e1657f90fa 872 0 2025-09-04 17:51:17 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:54f456cb4 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s localhost whisker-54f456cb4-6p4xx eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] calid46347b0270 [] [] }} ContainerID="5eb3b6f274cebac775de07fb077367816449cfaf3d9f3e9eacac37fd720d36b6" Namespace="calico-system" Pod="whisker-54f456cb4-6p4xx" WorkloadEndpoint="localhost-k8s-whisker--54f456cb4--6p4xx-" Sep 4 17:51:17.918208 containerd[1564]: 2025-09-04 17:51:17.795 [INFO][3866] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="5eb3b6f274cebac775de07fb077367816449cfaf3d9f3e9eacac37fd720d36b6" Namespace="calico-system" Pod="whisker-54f456cb4-6p4xx" WorkloadEndpoint="localhost-k8s-whisker--54f456cb4--6p4xx-eth0" Sep 4 17:51:17.918208 containerd[1564]: 2025-09-04 17:51:17.857 [INFO][3879] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="5eb3b6f274cebac775de07fb077367816449cfaf3d9f3e9eacac37fd720d36b6" HandleID="k8s-pod-network.5eb3b6f274cebac775de07fb077367816449cfaf3d9f3e9eacac37fd720d36b6" Workload="localhost-k8s-whisker--54f456cb4--6p4xx-eth0" Sep 4 17:51:17.918470 containerd[1564]: 2025-09-04 17:51:17.857 [INFO][3879] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="5eb3b6f274cebac775de07fb077367816449cfaf3d9f3e9eacac37fd720d36b6" HandleID="k8s-pod-network.5eb3b6f274cebac775de07fb077367816449cfaf3d9f3e9eacac37fd720d36b6" Workload="localhost-k8s-whisker--54f456cb4--6p4xx-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000407170), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"whisker-54f456cb4-6p4xx", "timestamp":"2025-09-04 17:51:17.857051209 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 4 17:51:17.918470 containerd[1564]: 2025-09-04 17:51:17.857 
[INFO][3879] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 4 17:51:17.918470 containerd[1564]: 2025-09-04 17:51:17.857 [INFO][3879] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 4 17:51:17.918470 containerd[1564]: 2025-09-04 17:51:17.858 [INFO][3879] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 4 17:51:17.918470 containerd[1564]: 2025-09-04 17:51:17.865 [INFO][3879] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.5eb3b6f274cebac775de07fb077367816449cfaf3d9f3e9eacac37fd720d36b6" host="localhost" Sep 4 17:51:17.918470 containerd[1564]: 2025-09-04 17:51:17.871 [INFO][3879] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 4 17:51:17.918470 containerd[1564]: 2025-09-04 17:51:17.875 [INFO][3879] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 4 17:51:17.918470 containerd[1564]: 2025-09-04 17:51:17.877 [INFO][3879] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 4 17:51:17.918470 containerd[1564]: 2025-09-04 17:51:17.879 [INFO][3879] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 4 17:51:17.918470 containerd[1564]: 2025-09-04 17:51:17.879 [INFO][3879] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.5eb3b6f274cebac775de07fb077367816449cfaf3d9f3e9eacac37fd720d36b6" host="localhost" Sep 4 17:51:17.918684 containerd[1564]: 2025-09-04 17:51:17.880 [INFO][3879] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.5eb3b6f274cebac775de07fb077367816449cfaf3d9f3e9eacac37fd720d36b6 Sep 4 17:51:17.918684 containerd[1564]: 2025-09-04 17:51:17.886 [INFO][3879] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.5eb3b6f274cebac775de07fb077367816449cfaf3d9f3e9eacac37fd720d36b6" host="localhost" Sep 4 17:51:17.918684 containerd[1564]: 2025-09-04 17:51:17.891 [INFO][3879] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.129/26] block=192.168.88.128/26 handle="k8s-pod-network.5eb3b6f274cebac775de07fb077367816449cfaf3d9f3e9eacac37fd720d36b6" host="localhost" Sep 4 17:51:17.918684 containerd[1564]: 2025-09-04 17:51:17.891 [INFO][3879] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.129/26] handle="k8s-pod-network.5eb3b6f274cebac775de07fb077367816449cfaf3d9f3e9eacac37fd720d36b6" host="localhost" Sep 4 17:51:17.918684 containerd[1564]: 2025-09-04 17:51:17.891 [INFO][3879] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 4 17:51:17.918684 containerd[1564]: 2025-09-04 17:51:17.891 [INFO][3879] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.129/26] IPv6=[] ContainerID="5eb3b6f274cebac775de07fb077367816449cfaf3d9f3e9eacac37fd720d36b6" HandleID="k8s-pod-network.5eb3b6f274cebac775de07fb077367816449cfaf3d9f3e9eacac37fd720d36b6" Workload="localhost-k8s-whisker--54f456cb4--6p4xx-eth0" Sep 4 17:51:17.918812 containerd[1564]: 2025-09-04 17:51:17.895 [INFO][3866] cni-plugin/k8s.go 418: Populated endpoint ContainerID="5eb3b6f274cebac775de07fb077367816449cfaf3d9f3e9eacac37fd720d36b6" Namespace="calico-system" Pod="whisker-54f456cb4-6p4xx" WorkloadEndpoint="localhost-k8s-whisker--54f456cb4--6p4xx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-whisker--54f456cb4--6p4xx-eth0", GenerateName:"whisker-54f456cb4-", Namespace:"calico-system", SelfLink:"", UID:"b05f926c-4f67-49c6-b5a1-94e1657f90fa", ResourceVersion:"872", Generation:0, CreationTimestamp:time.Date(2025, time.September, 4, 17, 51, 17, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"54f456cb4", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"whisker-54f456cb4-6p4xx", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calid46347b0270", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 4 17:51:17.918812 containerd[1564]: 2025-09-04 17:51:17.895 [INFO][3866] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.129/32] ContainerID="5eb3b6f274cebac775de07fb077367816449cfaf3d9f3e9eacac37fd720d36b6" Namespace="calico-system" Pod="whisker-54f456cb4-6p4xx" WorkloadEndpoint="localhost-k8s-whisker--54f456cb4--6p4xx-eth0" Sep 4 17:51:17.918884 containerd[1564]: 2025-09-04 17:51:17.895 [INFO][3866] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calid46347b0270 ContainerID="5eb3b6f274cebac775de07fb077367816449cfaf3d9f3e9eacac37fd720d36b6" Namespace="calico-system" Pod="whisker-54f456cb4-6p4xx" WorkloadEndpoint="localhost-k8s-whisker--54f456cb4--6p4xx-eth0" Sep 4 17:51:17.918884 containerd[1564]: 2025-09-04 17:51:17.904 [INFO][3866] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="5eb3b6f274cebac775de07fb077367816449cfaf3d9f3e9eacac37fd720d36b6" Namespace="calico-system" Pod="whisker-54f456cb4-6p4xx" WorkloadEndpoint="localhost-k8s-whisker--54f456cb4--6p4xx-eth0" Sep 4 17:51:17.918925 containerd[1564]: 2025-09-04 17:51:17.905 [INFO][3866] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="5eb3b6f274cebac775de07fb077367816449cfaf3d9f3e9eacac37fd720d36b6" Namespace="calico-system" Pod="whisker-54f456cb4-6p4xx" WorkloadEndpoint="localhost-k8s-whisker--54f456cb4--6p4xx-eth0" 
endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-whisker--54f456cb4--6p4xx-eth0", GenerateName:"whisker-54f456cb4-", Namespace:"calico-system", SelfLink:"", UID:"b05f926c-4f67-49c6-b5a1-94e1657f90fa", ResourceVersion:"872", Generation:0, CreationTimestamp:time.Date(2025, time.September, 4, 17, 51, 17, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"54f456cb4", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"5eb3b6f274cebac775de07fb077367816449cfaf3d9f3e9eacac37fd720d36b6", Pod:"whisker-54f456cb4-6p4xx", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calid46347b0270", MAC:"d2:16:aa:d2:bd:21", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 4 17:51:17.918983 containerd[1564]: 2025-09-04 17:51:17.913 [INFO][3866] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="5eb3b6f274cebac775de07fb077367816449cfaf3d9f3e9eacac37fd720d36b6" Namespace="calico-system" Pod="whisker-54f456cb4-6p4xx" WorkloadEndpoint="localhost-k8s-whisker--54f456cb4--6p4xx-eth0" Sep 4 17:51:18.501386 containerd[1564]: time="2025-09-04T17:51:18.501330747Z" level=info msg="TaskExit event in podsandbox handler container_id:\"185689874e428c9987e558bee86d514a724b8f5f5f4674f5cc23d74e0fa1b3bf\" id:\"da171cab38df2bee334a468975e3aad1006ea7498fc22668deb11848c19acd82\" pid:4000 exit_status:1 exited_at:{seconds:1757008278 nanos:500658874}" Sep 4 17:51:18.800875 systemd-networkd[1492]: vxlan.calico: Link UP Sep 4 17:51:18.800888 systemd-networkd[1492]: vxlan.calico: Gained carrier Sep 4 17:51:18.840943 containerd[1564]: time="2025-09-04T17:51:18.840889747Z" level=info msg="connecting to shim 5eb3b6f274cebac775de07fb077367816449cfaf3d9f3e9eacac37fd720d36b6" address="unix:///run/containerd/s/81a80aea8cc437347d2ae9a0c20e2731bfafcb35c29e9ecadab326960d7f8bc5" namespace=k8s.io protocol=ttrpc version=3 Sep 4 17:51:18.872243 systemd[1]: Started cri-containerd-5eb3b6f274cebac775de07fb077367816449cfaf3d9f3e9eacac37fd720d36b6.scope - libcontainer container 5eb3b6f274cebac775de07fb077367816449cfaf3d9f3e9eacac37fd720d36b6. 
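
The IPAM trace above shows the host "localhost" confirming its affinity for the block 192.168.88.128/26 and assigning the whisker endpoint the first workload address in it, 192.168.88.129/32 (the calico-apiserver endpoint created below receives .130 from the same block). A small sketch of the same containment arithmetic, assuming only Go's net/netip; the block and address are taken from the log lines above:

    package main

    import (
        "fmt"
        "net/netip"
    )

    func main() {
        // Block and address taken from the IPAM lines above.
        block := netip.MustParsePrefix("192.168.88.128/26")
        addr := netip.MustParseAddr("192.168.88.129")

        // A /26 block holds 64 addresses (192.168.88.128 - 192.168.88.191),
        // so successive endpoints on this host get .129, .130, and so on.
        fmt.Println("block contains addr:", block.Contains(addr)) // true
        fmt.Println("addresses in block:", 1<<(32-block.Bits()))  // 64
    }
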
Sep 4 17:51:18.886437 systemd-resolved[1411]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 4 17:51:18.935948 containerd[1564]: time="2025-09-04T17:51:18.935893635Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-54f456cb4-6p4xx,Uid:b05f926c-4f67-49c6-b5a1-94e1657f90fa,Namespace:calico-system,Attempt:0,} returns sandbox id \"5eb3b6f274cebac775de07fb077367816449cfaf3d9f3e9eacac37fd720d36b6\"" Sep 4 17:51:18.938174 containerd[1564]: time="2025-09-04T17:51:18.937921026Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\"" Sep 4 17:51:19.024480 systemd[1]: Started sshd@7-10.0.0.60:22-10.0.0.1:52320.service - OpenSSH per-connection server daemon (10.0.0.1:52320). Sep 4 17:51:19.106055 sshd[4134]: Accepted publickey for core from 10.0.0.1 port 52320 ssh2: RSA SHA256:OKBSeTvmyBtsjC7tfD9ZMfOXcYBSWTNVjZSy1tvLNgs Sep 4 17:51:19.107023 sshd-session[4134]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 17:51:19.115380 systemd-logind[1545]: New session 8 of user core. Sep 4 17:51:19.121267 systemd[1]: Started session-8.scope - Session 8 of User core. Sep 4 17:51:19.241886 containerd[1564]: time="2025-09-04T17:51:19.241778733Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5bf46546c9-zlqdb,Uid:b01e23fd-381d-49b0-b985-5edc30fb4402,Namespace:calico-apiserver,Attempt:0,}" Sep 4 17:51:19.247199 kubelet[2704]: I0904 17:51:19.247133 2704 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6de06474-82ff-4369-8fcf-667fedc2e13c" path="/var/lib/kubelet/pods/6de06474-82ff-4369-8fcf-667fedc2e13c/volumes" Sep 4 17:51:19.280998 sshd[4165]: Connection closed by 10.0.0.1 port 52320 Sep 4 17:51:19.282278 sshd-session[4134]: pam_unix(sshd:session): session closed for user core Sep 4 17:51:19.288626 systemd[1]: sshd@7-10.0.0.60:22-10.0.0.1:52320.service: Deactivated successfully. Sep 4 17:51:19.291574 systemd[1]: session-8.scope: Deactivated successfully. Sep 4 17:51:19.292926 systemd-logind[1545]: Session 8 logged out. Waiting for processes to exit. Sep 4 17:51:19.294336 systemd-logind[1545]: Removed session 8. 
Sep 4 17:51:19.346512 systemd-networkd[1492]: calia37329672df: Link UP Sep 4 17:51:19.347449 systemd-networkd[1492]: calia37329672df: Gained carrier Sep 4 17:51:19.385211 containerd[1564]: 2025-09-04 17:51:19.286 [INFO][4179] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--5bf46546c9--zlqdb-eth0 calico-apiserver-5bf46546c9- calico-apiserver b01e23fd-381d-49b0-b985-5edc30fb4402 794 0 2025-09-04 17:50:51 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:5bf46546c9 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-5bf46546c9-zlqdb eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calia37329672df [] [] }} ContainerID="c5dd2fd3f4bd39331fd0226460d4be8f9bf5d892d4696c6c8e02433079911848" Namespace="calico-apiserver" Pod="calico-apiserver-5bf46546c9-zlqdb" WorkloadEndpoint="localhost-k8s-calico--apiserver--5bf46546c9--zlqdb-" Sep 4 17:51:19.385211 containerd[1564]: 2025-09-04 17:51:19.286 [INFO][4179] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="c5dd2fd3f4bd39331fd0226460d4be8f9bf5d892d4696c6c8e02433079911848" Namespace="calico-apiserver" Pod="calico-apiserver-5bf46546c9-zlqdb" WorkloadEndpoint="localhost-k8s-calico--apiserver--5bf46546c9--zlqdb-eth0" Sep 4 17:51:19.385211 containerd[1564]: 2025-09-04 17:51:19.313 [INFO][4197] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="c5dd2fd3f4bd39331fd0226460d4be8f9bf5d892d4696c6c8e02433079911848" HandleID="k8s-pod-network.c5dd2fd3f4bd39331fd0226460d4be8f9bf5d892d4696c6c8e02433079911848" Workload="localhost-k8s-calico--apiserver--5bf46546c9--zlqdb-eth0" Sep 4 17:51:19.385450 containerd[1564]: 2025-09-04 17:51:19.314 [INFO][4197] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="c5dd2fd3f4bd39331fd0226460d4be8f9bf5d892d4696c6c8e02433079911848" HandleID="k8s-pod-network.c5dd2fd3f4bd39331fd0226460d4be8f9bf5d892d4696c6c8e02433079911848" Workload="localhost-k8s-calico--apiserver--5bf46546c9--zlqdb-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002c7010), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-5bf46546c9-zlqdb", "timestamp":"2025-09-04 17:51:19.313825515 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 4 17:51:19.385450 containerd[1564]: 2025-09-04 17:51:19.314 [INFO][4197] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 4 17:51:19.385450 containerd[1564]: 2025-09-04 17:51:19.314 [INFO][4197] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 4 17:51:19.385450 containerd[1564]: 2025-09-04 17:51:19.314 [INFO][4197] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 4 17:51:19.385450 containerd[1564]: 2025-09-04 17:51:19.319 [INFO][4197] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.c5dd2fd3f4bd39331fd0226460d4be8f9bf5d892d4696c6c8e02433079911848" host="localhost" Sep 4 17:51:19.385450 containerd[1564]: 2025-09-04 17:51:19.324 [INFO][4197] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 4 17:51:19.385450 containerd[1564]: 2025-09-04 17:51:19.328 [INFO][4197] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 4 17:51:19.385450 containerd[1564]: 2025-09-04 17:51:19.329 [INFO][4197] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 4 17:51:19.385450 containerd[1564]: 2025-09-04 17:51:19.330 [INFO][4197] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 4 17:51:19.385450 containerd[1564]: 2025-09-04 17:51:19.330 [INFO][4197] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.c5dd2fd3f4bd39331fd0226460d4be8f9bf5d892d4696c6c8e02433079911848" host="localhost" Sep 4 17:51:19.385739 containerd[1564]: 2025-09-04 17:51:19.332 [INFO][4197] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.c5dd2fd3f4bd39331fd0226460d4be8f9bf5d892d4696c6c8e02433079911848 Sep 4 17:51:19.385739 containerd[1564]: 2025-09-04 17:51:19.336 [INFO][4197] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.c5dd2fd3f4bd39331fd0226460d4be8f9bf5d892d4696c6c8e02433079911848" host="localhost" Sep 4 17:51:19.385739 containerd[1564]: 2025-09-04 17:51:19.340 [INFO][4197] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.130/26] block=192.168.88.128/26 handle="k8s-pod-network.c5dd2fd3f4bd39331fd0226460d4be8f9bf5d892d4696c6c8e02433079911848" host="localhost" Sep 4 17:51:19.385739 containerd[1564]: 2025-09-04 17:51:19.340 [INFO][4197] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.130/26] handle="k8s-pod-network.c5dd2fd3f4bd39331fd0226460d4be8f9bf5d892d4696c6c8e02433079911848" host="localhost" Sep 4 17:51:19.385739 containerd[1564]: 2025-09-04 17:51:19.340 [INFO][4197] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 4 17:51:19.385739 containerd[1564]: 2025-09-04 17:51:19.340 [INFO][4197] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.130/26] IPv6=[] ContainerID="c5dd2fd3f4bd39331fd0226460d4be8f9bf5d892d4696c6c8e02433079911848" HandleID="k8s-pod-network.c5dd2fd3f4bd39331fd0226460d4be8f9bf5d892d4696c6c8e02433079911848" Workload="localhost-k8s-calico--apiserver--5bf46546c9--zlqdb-eth0" Sep 4 17:51:19.385865 containerd[1564]: 2025-09-04 17:51:19.344 [INFO][4179] cni-plugin/k8s.go 418: Populated endpoint ContainerID="c5dd2fd3f4bd39331fd0226460d4be8f9bf5d892d4696c6c8e02433079911848" Namespace="calico-apiserver" Pod="calico-apiserver-5bf46546c9-zlqdb" WorkloadEndpoint="localhost-k8s-calico--apiserver--5bf46546c9--zlqdb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--5bf46546c9--zlqdb-eth0", GenerateName:"calico-apiserver-5bf46546c9-", Namespace:"calico-apiserver", SelfLink:"", UID:"b01e23fd-381d-49b0-b985-5edc30fb4402", ResourceVersion:"794", Generation:0, CreationTimestamp:time.Date(2025, time.September, 4, 17, 50, 51, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5bf46546c9", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-5bf46546c9-zlqdb", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calia37329672df", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 4 17:51:19.385931 containerd[1564]: 2025-09-04 17:51:19.344 [INFO][4179] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.130/32] ContainerID="c5dd2fd3f4bd39331fd0226460d4be8f9bf5d892d4696c6c8e02433079911848" Namespace="calico-apiserver" Pod="calico-apiserver-5bf46546c9-zlqdb" WorkloadEndpoint="localhost-k8s-calico--apiserver--5bf46546c9--zlqdb-eth0" Sep 4 17:51:19.385931 containerd[1564]: 2025-09-04 17:51:19.344 [INFO][4179] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calia37329672df ContainerID="c5dd2fd3f4bd39331fd0226460d4be8f9bf5d892d4696c6c8e02433079911848" Namespace="calico-apiserver" Pod="calico-apiserver-5bf46546c9-zlqdb" WorkloadEndpoint="localhost-k8s-calico--apiserver--5bf46546c9--zlqdb-eth0" Sep 4 17:51:19.385931 containerd[1564]: 2025-09-04 17:51:19.347 [INFO][4179] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="c5dd2fd3f4bd39331fd0226460d4be8f9bf5d892d4696c6c8e02433079911848" Namespace="calico-apiserver" Pod="calico-apiserver-5bf46546c9-zlqdb" WorkloadEndpoint="localhost-k8s-calico--apiserver--5bf46546c9--zlqdb-eth0" Sep 4 17:51:19.385994 containerd[1564]: 2025-09-04 17:51:19.347 [INFO][4179] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint 
ContainerID="c5dd2fd3f4bd39331fd0226460d4be8f9bf5d892d4696c6c8e02433079911848" Namespace="calico-apiserver" Pod="calico-apiserver-5bf46546c9-zlqdb" WorkloadEndpoint="localhost-k8s-calico--apiserver--5bf46546c9--zlqdb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--5bf46546c9--zlqdb-eth0", GenerateName:"calico-apiserver-5bf46546c9-", Namespace:"calico-apiserver", SelfLink:"", UID:"b01e23fd-381d-49b0-b985-5edc30fb4402", ResourceVersion:"794", Generation:0, CreationTimestamp:time.Date(2025, time.September, 4, 17, 50, 51, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5bf46546c9", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"c5dd2fd3f4bd39331fd0226460d4be8f9bf5d892d4696c6c8e02433079911848", Pod:"calico-apiserver-5bf46546c9-zlqdb", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calia37329672df", MAC:"9a:e4:b7:b8:9d:a5", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 4 17:51:19.386044 containerd[1564]: 2025-09-04 17:51:19.381 [INFO][4179] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="c5dd2fd3f4bd39331fd0226460d4be8f9bf5d892d4696c6c8e02433079911848" Namespace="calico-apiserver" Pod="calico-apiserver-5bf46546c9-zlqdb" WorkloadEndpoint="localhost-k8s-calico--apiserver--5bf46546c9--zlqdb-eth0" Sep 4 17:51:19.411007 containerd[1564]: time="2025-09-04T17:51:19.409680034Z" level=info msg="connecting to shim c5dd2fd3f4bd39331fd0226460d4be8f9bf5d892d4696c6c8e02433079911848" address="unix:///run/containerd/s/b3488823e21b132a4a23b7bf371876aff1b40f9e952809952426af2341ef93da" namespace=k8s.io protocol=ttrpc version=3 Sep 4 17:51:19.436280 systemd[1]: Started cri-containerd-c5dd2fd3f4bd39331fd0226460d4be8f9bf5d892d4696c6c8e02433079911848.scope - libcontainer container c5dd2fd3f4bd39331fd0226460d4be8f9bf5d892d4696c6c8e02433079911848. 
Sep 4 17:51:19.450246 systemd-resolved[1411]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 4 17:51:19.483100 containerd[1564]: time="2025-09-04T17:51:19.483059151Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5bf46546c9-zlqdb,Uid:b01e23fd-381d-49b0-b985-5edc30fb4402,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"c5dd2fd3f4bd39331fd0226460d4be8f9bf5d892d4696c6c8e02433079911848\"" Sep 4 17:51:19.587288 systemd-networkd[1492]: calid46347b0270: Gained IPv6LL Sep 4 17:51:19.907262 systemd-networkd[1492]: vxlan.calico: Gained IPv6LL Sep 4 17:51:20.242061 containerd[1564]: time="2025-09-04T17:51:20.241914829Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-7f6s9,Uid:79f9dffd-1bb2-43ab-ba96-6a7b6baab522,Namespace:calico-system,Attempt:0,}" Sep 4 17:51:20.242549 containerd[1564]: time="2025-09-04T17:51:20.242086090Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-7bbsj,Uid:1cd8d6fb-6cf4-4a7d-af5a-3965e83c2a34,Namespace:kube-system,Attempt:0,}" Sep 4 17:51:20.242549 containerd[1564]: time="2025-09-04T17:51:20.241915520Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-8mncz,Uid:81fd4b06-1bc3-423c-9205-8768e1e4b44a,Namespace:calico-system,Attempt:0,}" Sep 4 17:51:20.378049 systemd-networkd[1492]: cali59d2159daf1: Link UP Sep 4 17:51:20.378688 systemd-networkd[1492]: cali59d2159daf1: Gained carrier Sep 4 17:51:20.393700 containerd[1564]: 2025-09-04 17:51:20.294 [INFO][4280] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-csi--node--driver--8mncz-eth0 csi-node-driver- calico-system 81fd4b06-1bc3-423c-9205-8768e1e4b44a 684 0 2025-09-04 17:50:54 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:856c6b598f k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s localhost csi-node-driver-8mncz eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali59d2159daf1 [] [] }} ContainerID="92d24b539a79a20782822bb2cea16c9c76bd45d1accd2bfe6b50b1c563d967f9" Namespace="calico-system" Pod="csi-node-driver-8mncz" WorkloadEndpoint="localhost-k8s-csi--node--driver--8mncz-" Sep 4 17:51:20.393700 containerd[1564]: 2025-09-04 17:51:20.294 [INFO][4280] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="92d24b539a79a20782822bb2cea16c9c76bd45d1accd2bfe6b50b1c563d967f9" Namespace="calico-system" Pod="csi-node-driver-8mncz" WorkloadEndpoint="localhost-k8s-csi--node--driver--8mncz-eth0" Sep 4 17:51:20.393700 containerd[1564]: 2025-09-04 17:51:20.335 [INFO][4313] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="92d24b539a79a20782822bb2cea16c9c76bd45d1accd2bfe6b50b1c563d967f9" HandleID="k8s-pod-network.92d24b539a79a20782822bb2cea16c9c76bd45d1accd2bfe6b50b1c563d967f9" Workload="localhost-k8s-csi--node--driver--8mncz-eth0" Sep 4 17:51:20.394231 containerd[1564]: 2025-09-04 17:51:20.335 [INFO][4313] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="92d24b539a79a20782822bb2cea16c9c76bd45d1accd2bfe6b50b1c563d967f9" HandleID="k8s-pod-network.92d24b539a79a20782822bb2cea16c9c76bd45d1accd2bfe6b50b1c563d967f9" Workload="localhost-k8s-csi--node--driver--8mncz-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, 
HandleID:(*string)(0xc0002e7020), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"csi-node-driver-8mncz", "timestamp":"2025-09-04 17:51:20.335042917 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 4 17:51:20.394231 containerd[1564]: 2025-09-04 17:51:20.335 [INFO][4313] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 4 17:51:20.394231 containerd[1564]: 2025-09-04 17:51:20.335 [INFO][4313] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 4 17:51:20.394231 containerd[1564]: 2025-09-04 17:51:20.335 [INFO][4313] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 4 17:51:20.394231 containerd[1564]: 2025-09-04 17:51:20.344 [INFO][4313] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.92d24b539a79a20782822bb2cea16c9c76bd45d1accd2bfe6b50b1c563d967f9" host="localhost" Sep 4 17:51:20.394231 containerd[1564]: 2025-09-04 17:51:20.351 [INFO][4313] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 4 17:51:20.394231 containerd[1564]: 2025-09-04 17:51:20.356 [INFO][4313] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 4 17:51:20.394231 containerd[1564]: 2025-09-04 17:51:20.358 [INFO][4313] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 4 17:51:20.394231 containerd[1564]: 2025-09-04 17:51:20.360 [INFO][4313] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 4 17:51:20.394231 containerd[1564]: 2025-09-04 17:51:20.360 [INFO][4313] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.92d24b539a79a20782822bb2cea16c9c76bd45d1accd2bfe6b50b1c563d967f9" host="localhost" Sep 4 17:51:20.394678 containerd[1564]: 2025-09-04 17:51:20.361 [INFO][4313] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.92d24b539a79a20782822bb2cea16c9c76bd45d1accd2bfe6b50b1c563d967f9 Sep 4 17:51:20.394678 containerd[1564]: 2025-09-04 17:51:20.365 [INFO][4313] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.92d24b539a79a20782822bb2cea16c9c76bd45d1accd2bfe6b50b1c563d967f9" host="localhost" Sep 4 17:51:20.394678 containerd[1564]: 2025-09-04 17:51:20.371 [INFO][4313] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.131/26] block=192.168.88.128/26 handle="k8s-pod-network.92d24b539a79a20782822bb2cea16c9c76bd45d1accd2bfe6b50b1c563d967f9" host="localhost" Sep 4 17:51:20.394678 containerd[1564]: 2025-09-04 17:51:20.371 [INFO][4313] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.131/26] handle="k8s-pod-network.92d24b539a79a20782822bb2cea16c9c76bd45d1accd2bfe6b50b1c563d967f9" host="localhost" Sep 4 17:51:20.394678 containerd[1564]: 2025-09-04 17:51:20.371 [INFO][4313] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
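The "RunPodSandbox for &PodSandboxMetadata{...}" requests a few entries above are kubelet's CRI calls into containerd; each one is what triggers the Calico CNI ADD and the IPAM assignment just logged. A hedged sketch of issuing such a call directly with the k8s.io/cri-api client types, using the csi-node-driver metadata from the log (a real kubelet sends a much richer PodSandboxConfig, and the exact module versions are assumed):

package main

import (
	"context"
	"fmt"
	"log"

	"google.golang.org/grpc"
	"google.golang.org/grpc/credentials/insecure"
	runtimeapi "k8s.io/cri-api/pkg/apis/runtime/v1"
)

func main() {
	// containerd serves the CRI on its main socket, not the per-shim one.
	conn, err := grpc.Dial("unix:///run/containerd/containerd.sock",
		grpc.WithTransportCredentials(insecure.NewCredentials()))
	if err != nil {
		log.Fatal(err)
	}
	defer conn.Close()

	rt := runtimeapi.NewRuntimeServiceClient(conn)

	// Metadata mirrors the PodSandboxMetadata printed in the log above.
	resp, err := rt.RunPodSandbox(context.Background(), &runtimeapi.RunPodSandboxRequest{
		Config: &runtimeapi.PodSandboxConfig{
			Metadata: &runtimeapi.PodSandboxMetadata{
				Name:      "csi-node-driver-8mncz",
				Uid:       "81fd4b06-1bc3-423c-9205-8768e1e4b44a",
				Namespace: "calico-system",
				Attempt:   0,
			},
		},
	})
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println("sandbox id:", resp.PodSandboxId) // the id echoed in "returns sandbox id"
}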
Sep 4 17:51:20.394678 containerd[1564]: 2025-09-04 17:51:20.371 [INFO][4313] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.131/26] IPv6=[] ContainerID="92d24b539a79a20782822bb2cea16c9c76bd45d1accd2bfe6b50b1c563d967f9" HandleID="k8s-pod-network.92d24b539a79a20782822bb2cea16c9c76bd45d1accd2bfe6b50b1c563d967f9" Workload="localhost-k8s-csi--node--driver--8mncz-eth0" Sep 4 17:51:20.394946 containerd[1564]: 2025-09-04 17:51:20.375 [INFO][4280] cni-plugin/k8s.go 418: Populated endpoint ContainerID="92d24b539a79a20782822bb2cea16c9c76bd45d1accd2bfe6b50b1c563d967f9" Namespace="calico-system" Pod="csi-node-driver-8mncz" WorkloadEndpoint="localhost-k8s-csi--node--driver--8mncz-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--8mncz-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"81fd4b06-1bc3-423c-9205-8768e1e4b44a", ResourceVersion:"684", Generation:0, CreationTimestamp:time.Date(2025, time.September, 4, 17, 50, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"856c6b598f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"csi-node-driver-8mncz", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali59d2159daf1", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 4 17:51:20.395003 containerd[1564]: 2025-09-04 17:51:20.375 [INFO][4280] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.131/32] ContainerID="92d24b539a79a20782822bb2cea16c9c76bd45d1accd2bfe6b50b1c563d967f9" Namespace="calico-system" Pod="csi-node-driver-8mncz" WorkloadEndpoint="localhost-k8s-csi--node--driver--8mncz-eth0" Sep 4 17:51:20.395003 containerd[1564]: 2025-09-04 17:51:20.375 [INFO][4280] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali59d2159daf1 ContainerID="92d24b539a79a20782822bb2cea16c9c76bd45d1accd2bfe6b50b1c563d967f9" Namespace="calico-system" Pod="csi-node-driver-8mncz" WorkloadEndpoint="localhost-k8s-csi--node--driver--8mncz-eth0" Sep 4 17:51:20.395003 containerd[1564]: 2025-09-04 17:51:20.379 [INFO][4280] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="92d24b539a79a20782822bb2cea16c9c76bd45d1accd2bfe6b50b1c563d967f9" Namespace="calico-system" Pod="csi-node-driver-8mncz" WorkloadEndpoint="localhost-k8s-csi--node--driver--8mncz-eth0" Sep 4 17:51:20.395073 containerd[1564]: 2025-09-04 17:51:20.379 [INFO][4280] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="92d24b539a79a20782822bb2cea16c9c76bd45d1accd2bfe6b50b1c563d967f9" Namespace="calico-system" Pod="csi-node-driver-8mncz" 
WorkloadEndpoint="localhost-k8s-csi--node--driver--8mncz-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--8mncz-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"81fd4b06-1bc3-423c-9205-8768e1e4b44a", ResourceVersion:"684", Generation:0, CreationTimestamp:time.Date(2025, time.September, 4, 17, 50, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"856c6b598f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"92d24b539a79a20782822bb2cea16c9c76bd45d1accd2bfe6b50b1c563d967f9", Pod:"csi-node-driver-8mncz", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali59d2159daf1", MAC:"0a:4b:e0:0d:f4:b4", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 4 17:51:20.395179 containerd[1564]: 2025-09-04 17:51:20.387 [INFO][4280] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="92d24b539a79a20782822bb2cea16c9c76bd45d1accd2bfe6b50b1c563d967f9" Namespace="calico-system" Pod="csi-node-driver-8mncz" WorkloadEndpoint="localhost-k8s-csi--node--driver--8mncz-eth0" Sep 4 17:51:20.414804 containerd[1564]: time="2025-09-04T17:51:20.414746323Z" level=info msg="connecting to shim 92d24b539a79a20782822bb2cea16c9c76bd45d1accd2bfe6b50b1c563d967f9" address="unix:///run/containerd/s/0f8a0e83d16eb286758ead8e9d75040a6e80cc25c36cfdefd795a1ba1a9b02b1" namespace=k8s.io protocol=ttrpc version=3 Sep 4 17:51:20.443304 systemd[1]: Started cri-containerd-92d24b539a79a20782822bb2cea16c9c76bd45d1accd2bfe6b50b1c563d967f9.scope - libcontainer container 92d24b539a79a20782822bb2cea16c9c76bd45d1accd2bfe6b50b1c563d967f9. 
Sep 4 17:51:20.461281 systemd-resolved[1411]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 4 17:51:20.481875 containerd[1564]: time="2025-09-04T17:51:20.481807004Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-8mncz,Uid:81fd4b06-1bc3-423c-9205-8768e1e4b44a,Namespace:calico-system,Attempt:0,} returns sandbox id \"92d24b539a79a20782822bb2cea16c9c76bd45d1accd2bfe6b50b1c563d967f9\"" Sep 4 17:51:20.485761 systemd-networkd[1492]: cali071b7c2c6f9: Link UP Sep 4 17:51:20.486783 systemd-networkd[1492]: cali071b7c2c6f9: Gained carrier Sep 4 17:51:20.501018 containerd[1564]: 2025-09-04 17:51:20.292 [INFO][4270] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--7c65d6cfc9--7bbsj-eth0 coredns-7c65d6cfc9- kube-system 1cd8d6fb-6cf4-4a7d-af5a-3965e83c2a34 798 0 2025-09-04 17:50:43 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7c65d6cfc9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-7c65d6cfc9-7bbsj eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali071b7c2c6f9 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="04baa8e919dbe50bc75f094bb84ab0b65a03d81cfe2f225756ecaec4e3a176bc" Namespace="kube-system" Pod="coredns-7c65d6cfc9-7bbsj" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--7bbsj-" Sep 4 17:51:20.501018 containerd[1564]: 2025-09-04 17:51:20.292 [INFO][4270] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="04baa8e919dbe50bc75f094bb84ab0b65a03d81cfe2f225756ecaec4e3a176bc" Namespace="kube-system" Pod="coredns-7c65d6cfc9-7bbsj" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--7bbsj-eth0" Sep 4 17:51:20.501018 containerd[1564]: 2025-09-04 17:51:20.344 [INFO][4306] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="04baa8e919dbe50bc75f094bb84ab0b65a03d81cfe2f225756ecaec4e3a176bc" HandleID="k8s-pod-network.04baa8e919dbe50bc75f094bb84ab0b65a03d81cfe2f225756ecaec4e3a176bc" Workload="localhost-k8s-coredns--7c65d6cfc9--7bbsj-eth0" Sep 4 17:51:20.501255 containerd[1564]: 2025-09-04 17:51:20.344 [INFO][4306] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="04baa8e919dbe50bc75f094bb84ab0b65a03d81cfe2f225756ecaec4e3a176bc" HandleID="k8s-pod-network.04baa8e919dbe50bc75f094bb84ab0b65a03d81cfe2f225756ecaec4e3a176bc" Workload="localhost-k8s-coredns--7c65d6cfc9--7bbsj-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002df0f0), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-7c65d6cfc9-7bbsj", "timestamp":"2025-09-04 17:51:20.342804456 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 4 17:51:20.501255 containerd[1564]: 2025-09-04 17:51:20.344 [INFO][4306] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 4 17:51:20.501255 containerd[1564]: 2025-09-04 17:51:20.371 [INFO][4306] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 4 17:51:20.501255 containerd[1564]: 2025-09-04 17:51:20.371 [INFO][4306] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 4 17:51:20.501255 containerd[1564]: 2025-09-04 17:51:20.446 [INFO][4306] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.04baa8e919dbe50bc75f094bb84ab0b65a03d81cfe2f225756ecaec4e3a176bc" host="localhost" Sep 4 17:51:20.501255 containerd[1564]: 2025-09-04 17:51:20.452 [INFO][4306] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 4 17:51:20.501255 containerd[1564]: 2025-09-04 17:51:20.456 [INFO][4306] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 4 17:51:20.501255 containerd[1564]: 2025-09-04 17:51:20.458 [INFO][4306] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 4 17:51:20.501255 containerd[1564]: 2025-09-04 17:51:20.461 [INFO][4306] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 4 17:51:20.501255 containerd[1564]: 2025-09-04 17:51:20.462 [INFO][4306] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.04baa8e919dbe50bc75f094bb84ab0b65a03d81cfe2f225756ecaec4e3a176bc" host="localhost" Sep 4 17:51:20.501514 containerd[1564]: 2025-09-04 17:51:20.463 [INFO][4306] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.04baa8e919dbe50bc75f094bb84ab0b65a03d81cfe2f225756ecaec4e3a176bc Sep 4 17:51:20.501514 containerd[1564]: 2025-09-04 17:51:20.468 [INFO][4306] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.04baa8e919dbe50bc75f094bb84ab0b65a03d81cfe2f225756ecaec4e3a176bc" host="localhost" Sep 4 17:51:20.501514 containerd[1564]: 2025-09-04 17:51:20.474 [INFO][4306] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.132/26] block=192.168.88.128/26 handle="k8s-pod-network.04baa8e919dbe50bc75f094bb84ab0b65a03d81cfe2f225756ecaec4e3a176bc" host="localhost" Sep 4 17:51:20.501514 containerd[1564]: 2025-09-04 17:51:20.474 [INFO][4306] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.132/26] handle="k8s-pod-network.04baa8e919dbe50bc75f094bb84ab0b65a03d81cfe2f225756ecaec4e3a176bc" host="localhost" Sep 4 17:51:20.501514 containerd[1564]: 2025-09-04 17:51:20.475 [INFO][4306] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 4 17:51:20.501514 containerd[1564]: 2025-09-04 17:51:20.475 [INFO][4306] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.132/26] IPv6=[] ContainerID="04baa8e919dbe50bc75f094bb84ab0b65a03d81cfe2f225756ecaec4e3a176bc" HandleID="k8s-pod-network.04baa8e919dbe50bc75f094bb84ab0b65a03d81cfe2f225756ecaec4e3a176bc" Workload="localhost-k8s-coredns--7c65d6cfc9--7bbsj-eth0" Sep 4 17:51:20.501719 containerd[1564]: 2025-09-04 17:51:20.480 [INFO][4270] cni-plugin/k8s.go 418: Populated endpoint ContainerID="04baa8e919dbe50bc75f094bb84ab0b65a03d81cfe2f225756ecaec4e3a176bc" Namespace="kube-system" Pod="coredns-7c65d6cfc9-7bbsj" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--7bbsj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--7c65d6cfc9--7bbsj-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"1cd8d6fb-6cf4-4a7d-af5a-3965e83c2a34", ResourceVersion:"798", Generation:0, CreationTimestamp:time.Date(2025, time.September, 4, 17, 50, 43, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-7c65d6cfc9-7bbsj", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali071b7c2c6f9", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 4 17:51:20.501795 containerd[1564]: 2025-09-04 17:51:20.480 [INFO][4270] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.132/32] ContainerID="04baa8e919dbe50bc75f094bb84ab0b65a03d81cfe2f225756ecaec4e3a176bc" Namespace="kube-system" Pod="coredns-7c65d6cfc9-7bbsj" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--7bbsj-eth0" Sep 4 17:51:20.501795 containerd[1564]: 2025-09-04 17:51:20.480 [INFO][4270] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali071b7c2c6f9 ContainerID="04baa8e919dbe50bc75f094bb84ab0b65a03d81cfe2f225756ecaec4e3a176bc" Namespace="kube-system" Pod="coredns-7c65d6cfc9-7bbsj" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--7bbsj-eth0" Sep 4 17:51:20.501795 containerd[1564]: 2025-09-04 17:51:20.487 [INFO][4270] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="04baa8e919dbe50bc75f094bb84ab0b65a03d81cfe2f225756ecaec4e3a176bc" Namespace="kube-system" Pod="coredns-7c65d6cfc9-7bbsj" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--7bbsj-eth0" Sep 4 17:51:20.501928 
containerd[1564]: 2025-09-04 17:51:20.487 [INFO][4270] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="04baa8e919dbe50bc75f094bb84ab0b65a03d81cfe2f225756ecaec4e3a176bc" Namespace="kube-system" Pod="coredns-7c65d6cfc9-7bbsj" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--7bbsj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--7c65d6cfc9--7bbsj-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"1cd8d6fb-6cf4-4a7d-af5a-3965e83c2a34", ResourceVersion:"798", Generation:0, CreationTimestamp:time.Date(2025, time.September, 4, 17, 50, 43, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"04baa8e919dbe50bc75f094bb84ab0b65a03d81cfe2f225756ecaec4e3a176bc", Pod:"coredns-7c65d6cfc9-7bbsj", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali071b7c2c6f9", MAC:"c2:ed:06:77:53:68", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 4 17:51:20.501928 containerd[1564]: 2025-09-04 17:51:20.497 [INFO][4270] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="04baa8e919dbe50bc75f094bb84ab0b65a03d81cfe2f225756ecaec4e3a176bc" Namespace="kube-system" Pod="coredns-7c65d6cfc9-7bbsj" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--7bbsj-eth0" Sep 4 17:51:20.526961 containerd[1564]: time="2025-09-04T17:51:20.526914591Z" level=info msg="connecting to shim 04baa8e919dbe50bc75f094bb84ab0b65a03d81cfe2f225756ecaec4e3a176bc" address="unix:///run/containerd/s/f0d1e12f1890f9ce59264b3995652d2ab4e369e60d4eef30222039ce218a5ef9" namespace=k8s.io protocol=ttrpc version=3 Sep 4 17:51:20.550302 systemd[1]: Started cri-containerd-04baa8e919dbe50bc75f094bb84ab0b65a03d81cfe2f225756ecaec4e3a176bc.scope - libcontainer container 04baa8e919dbe50bc75f094bb84ab0b65a03d81cfe2f225756ecaec4e3a176bc. 
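Unlike the earlier endpoints, the coredns endpoint above also carries a port list, printed in Calico's numeric form: Port:0x35 is 53 (the dns and dns-tcp ports) and Port:0x23c1 is 9153 (coredns's metrics port). A one-line check of the conversion:

package main

import "fmt"

func main() {
	// Hex port values as printed in the WorkloadEndpointPort dump above.
	fmt.Println(0x35, 0x23c1) // 53 9153 -> DNS and the coredns metrics port
}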
Sep 4 17:51:20.567009 systemd-resolved[1411]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 4 17:51:20.581339 systemd-networkd[1492]: cali91efb56b8f1: Link UP Sep 4 17:51:20.582576 systemd-networkd[1492]: cali91efb56b8f1: Gained carrier Sep 4 17:51:20.601510 containerd[1564]: 2025-09-04 17:51:20.302 [INFO][4262] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-goldmane--7988f88666--7f6s9-eth0 goldmane-7988f88666- calico-system 79f9dffd-1bb2-43ab-ba96-6a7b6baab522 799 0 2025-09-04 17:50:53 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:7988f88666 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s localhost goldmane-7988f88666-7f6s9 eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali91efb56b8f1 [] [] }} ContainerID="8d8861bc1c2c3cbf14b8652e154a9e62c2942393d9238286d58b904092a89e3c" Namespace="calico-system" Pod="goldmane-7988f88666-7f6s9" WorkloadEndpoint="localhost-k8s-goldmane--7988f88666--7f6s9-" Sep 4 17:51:20.601510 containerd[1564]: 2025-09-04 17:51:20.302 [INFO][4262] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="8d8861bc1c2c3cbf14b8652e154a9e62c2942393d9238286d58b904092a89e3c" Namespace="calico-system" Pod="goldmane-7988f88666-7f6s9" WorkloadEndpoint="localhost-k8s-goldmane--7988f88666--7f6s9-eth0" Sep 4 17:51:20.601510 containerd[1564]: 2025-09-04 17:51:20.351 [INFO][4321] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="8d8861bc1c2c3cbf14b8652e154a9e62c2942393d9238286d58b904092a89e3c" HandleID="k8s-pod-network.8d8861bc1c2c3cbf14b8652e154a9e62c2942393d9238286d58b904092a89e3c" Workload="localhost-k8s-goldmane--7988f88666--7f6s9-eth0" Sep 4 17:51:20.601510 containerd[1564]: 2025-09-04 17:51:20.351 [INFO][4321] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="8d8861bc1c2c3cbf14b8652e154a9e62c2942393d9238286d58b904092a89e3c" HandleID="k8s-pod-network.8d8861bc1c2c3cbf14b8652e154a9e62c2942393d9238286d58b904092a89e3c" Workload="localhost-k8s-goldmane--7988f88666--7f6s9-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002ff700), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"goldmane-7988f88666-7f6s9", "timestamp":"2025-09-04 17:51:20.35154711 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 4 17:51:20.601510 containerd[1564]: 2025-09-04 17:51:20.351 [INFO][4321] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 4 17:51:20.601510 containerd[1564]: 2025-09-04 17:51:20.474 [INFO][4321] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 4 17:51:20.601510 containerd[1564]: 2025-09-04 17:51:20.474 [INFO][4321] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 4 17:51:20.601510 containerd[1564]: 2025-09-04 17:51:20.547 [INFO][4321] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.8d8861bc1c2c3cbf14b8652e154a9e62c2942393d9238286d58b904092a89e3c" host="localhost" Sep 4 17:51:20.601510 containerd[1564]: 2025-09-04 17:51:20.552 [INFO][4321] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 4 17:51:20.601510 containerd[1564]: 2025-09-04 17:51:20.557 [INFO][4321] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 4 17:51:20.601510 containerd[1564]: 2025-09-04 17:51:20.558 [INFO][4321] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 4 17:51:20.601510 containerd[1564]: 2025-09-04 17:51:20.560 [INFO][4321] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 4 17:51:20.601510 containerd[1564]: 2025-09-04 17:51:20.560 [INFO][4321] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.8d8861bc1c2c3cbf14b8652e154a9e62c2942393d9238286d58b904092a89e3c" host="localhost" Sep 4 17:51:20.601510 containerd[1564]: 2025-09-04 17:51:20.562 [INFO][4321] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.8d8861bc1c2c3cbf14b8652e154a9e62c2942393d9238286d58b904092a89e3c Sep 4 17:51:20.601510 containerd[1564]: 2025-09-04 17:51:20.566 [INFO][4321] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.8d8861bc1c2c3cbf14b8652e154a9e62c2942393d9238286d58b904092a89e3c" host="localhost" Sep 4 17:51:20.601510 containerd[1564]: 2025-09-04 17:51:20.573 [INFO][4321] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.133/26] block=192.168.88.128/26 handle="k8s-pod-network.8d8861bc1c2c3cbf14b8652e154a9e62c2942393d9238286d58b904092a89e3c" host="localhost" Sep 4 17:51:20.601510 containerd[1564]: 2025-09-04 17:51:20.573 [INFO][4321] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.133/26] handle="k8s-pod-network.8d8861bc1c2c3cbf14b8652e154a9e62c2942393d9238286d58b904092a89e3c" host="localhost" Sep 4 17:51:20.601510 containerd[1564]: 2025-09-04 17:51:20.573 [INFO][4321] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
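Each cali* veth later shows up in systemd-networkd as "Gained IPv6LL" once it has a link-local address. If that address is derived from the interface MAC via EUI-64 (networkd may instead use stable-privacy addresses, so treat this purely as an illustration), the computation for the coredns veth's MAC c2:ed:06:77:53:68 seen above would be:

package main

import (
	"fmt"
	"net"
)

// eui64LinkLocal builds fe80::/64 plus the EUI-64 of a 48-bit MAC:
// flip the universal/local bit of the first byte and splice ff:fe
// into the middle.
func eui64LinkLocal(mac net.HardwareAddr) net.IP {
	ip := make(net.IP, net.IPv6len) // fe80:: prefix, rest zero
	ip[0], ip[1] = 0xfe, 0x80
	ip[8] = mac[0] ^ 0x02
	ip[9], ip[10] = mac[1], mac[2]
	ip[11], ip[12] = 0xff, 0xfe
	ip[13], ip[14], ip[15] = mac[3], mac[4], mac[5]
	return ip
}

func main() {
	mac, _ := net.ParseMAC("c2:ed:06:77:53:68") // cali071b7c2c6f9's MAC from the log
	fmt.Println(eui64LinkLocal(mac))            // fe80::c0ed:6ff:fe77:5368
}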
Sep 4 17:51:20.601510 containerd[1564]: 2025-09-04 17:51:20.573 [INFO][4321] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.133/26] IPv6=[] ContainerID="8d8861bc1c2c3cbf14b8652e154a9e62c2942393d9238286d58b904092a89e3c" HandleID="k8s-pod-network.8d8861bc1c2c3cbf14b8652e154a9e62c2942393d9238286d58b904092a89e3c" Workload="localhost-k8s-goldmane--7988f88666--7f6s9-eth0" Sep 4 17:51:20.602075 containerd[1564]: 2025-09-04 17:51:20.578 [INFO][4262] cni-plugin/k8s.go 418: Populated endpoint ContainerID="8d8861bc1c2c3cbf14b8652e154a9e62c2942393d9238286d58b904092a89e3c" Namespace="calico-system" Pod="goldmane-7988f88666-7f6s9" WorkloadEndpoint="localhost-k8s-goldmane--7988f88666--7f6s9-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-goldmane--7988f88666--7f6s9-eth0", GenerateName:"goldmane-7988f88666-", Namespace:"calico-system", SelfLink:"", UID:"79f9dffd-1bb2-43ab-ba96-6a7b6baab522", ResourceVersion:"799", Generation:0, CreationTimestamp:time.Date(2025, time.September, 4, 17, 50, 53, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"7988f88666", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"goldmane-7988f88666-7f6s9", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali91efb56b8f1", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 4 17:51:20.602075 containerd[1564]: 2025-09-04 17:51:20.578 [INFO][4262] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.133/32] ContainerID="8d8861bc1c2c3cbf14b8652e154a9e62c2942393d9238286d58b904092a89e3c" Namespace="calico-system" Pod="goldmane-7988f88666-7f6s9" WorkloadEndpoint="localhost-k8s-goldmane--7988f88666--7f6s9-eth0" Sep 4 17:51:20.602075 containerd[1564]: 2025-09-04 17:51:20.578 [INFO][4262] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali91efb56b8f1 ContainerID="8d8861bc1c2c3cbf14b8652e154a9e62c2942393d9238286d58b904092a89e3c" Namespace="calico-system" Pod="goldmane-7988f88666-7f6s9" WorkloadEndpoint="localhost-k8s-goldmane--7988f88666--7f6s9-eth0" Sep 4 17:51:20.602075 containerd[1564]: 2025-09-04 17:51:20.582 [INFO][4262] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="8d8861bc1c2c3cbf14b8652e154a9e62c2942393d9238286d58b904092a89e3c" Namespace="calico-system" Pod="goldmane-7988f88666-7f6s9" WorkloadEndpoint="localhost-k8s-goldmane--7988f88666--7f6s9-eth0" Sep 4 17:51:20.602075 containerd[1564]: 2025-09-04 17:51:20.583 [INFO][4262] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="8d8861bc1c2c3cbf14b8652e154a9e62c2942393d9238286d58b904092a89e3c" Namespace="calico-system" Pod="goldmane-7988f88666-7f6s9" WorkloadEndpoint="localhost-k8s-goldmane--7988f88666--7f6s9-eth0" 
endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-goldmane--7988f88666--7f6s9-eth0", GenerateName:"goldmane-7988f88666-", Namespace:"calico-system", SelfLink:"", UID:"79f9dffd-1bb2-43ab-ba96-6a7b6baab522", ResourceVersion:"799", Generation:0, CreationTimestamp:time.Date(2025, time.September, 4, 17, 50, 53, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"7988f88666", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"8d8861bc1c2c3cbf14b8652e154a9e62c2942393d9238286d58b904092a89e3c", Pod:"goldmane-7988f88666-7f6s9", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali91efb56b8f1", MAC:"2e:0b:4c:34:91:e1", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 4 17:51:20.602075 containerd[1564]: 2025-09-04 17:51:20.594 [INFO][4262] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="8d8861bc1c2c3cbf14b8652e154a9e62c2942393d9238286d58b904092a89e3c" Namespace="calico-system" Pod="goldmane-7988f88666-7f6s9" WorkloadEndpoint="localhost-k8s-goldmane--7988f88666--7f6s9-eth0" Sep 4 17:51:20.611050 containerd[1564]: time="2025-09-04T17:51:20.610996486Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-7bbsj,Uid:1cd8d6fb-6cf4-4a7d-af5a-3965e83c2a34,Namespace:kube-system,Attempt:0,} returns sandbox id \"04baa8e919dbe50bc75f094bb84ab0b65a03d81cfe2f225756ecaec4e3a176bc\"" Sep 4 17:51:20.614844 containerd[1564]: time="2025-09-04T17:51:20.614799995Z" level=info msg="CreateContainer within sandbox \"04baa8e919dbe50bc75f094bb84ab0b65a03d81cfe2f225756ecaec4e3a176bc\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 4 17:51:20.626134 containerd[1564]: time="2025-09-04T17:51:20.624647224Z" level=info msg="Container d3482b3986475617542775221395b6bba81db38f1f437f5574b6368b47772ba8: CDI devices from CRI Config.CDIDevices: []" Sep 4 17:51:20.629606 containerd[1564]: time="2025-09-04T17:51:20.629551433Z" level=info msg="connecting to shim 8d8861bc1c2c3cbf14b8652e154a9e62c2942393d9238286d58b904092a89e3c" address="unix:///run/containerd/s/f144fbd95990670237aa1682cfc0e118408a066b64c0500c31d08b37ce2f585e" namespace=k8s.io protocol=ttrpc version=3 Sep 4 17:51:20.631730 containerd[1564]: time="2025-09-04T17:51:20.631690263Z" level=info msg="CreateContainer within sandbox \"04baa8e919dbe50bc75f094bb84ab0b65a03d81cfe2f225756ecaec4e3a176bc\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"d3482b3986475617542775221395b6bba81db38f1f437f5574b6368b47772ba8\"" Sep 4 17:51:20.632401 containerd[1564]: time="2025-09-04T17:51:20.632356125Z" level=info msg="StartContainer for \"d3482b3986475617542775221395b6bba81db38f1f437f5574b6368b47772ba8\"" Sep 4 17:51:20.633957 containerd[1564]: time="2025-09-04T17:51:20.633736700Z" level=info msg="connecting 
to shim d3482b3986475617542775221395b6bba81db38f1f437f5574b6368b47772ba8" address="unix:///run/containerd/s/f0d1e12f1890f9ce59264b3995652d2ab4e369e60d4eef30222039ce218a5ef9" protocol=ttrpc version=3 Sep 4 17:51:20.658277 systemd[1]: Started cri-containerd-8d8861bc1c2c3cbf14b8652e154a9e62c2942393d9238286d58b904092a89e3c.scope - libcontainer container 8d8861bc1c2c3cbf14b8652e154a9e62c2942393d9238286d58b904092a89e3c. Sep 4 17:51:20.659709 systemd[1]: Started cri-containerd-d3482b3986475617542775221395b6bba81db38f1f437f5574b6368b47772ba8.scope - libcontainer container d3482b3986475617542775221395b6bba81db38f1f437f5574b6368b47772ba8. Sep 4 17:51:20.685691 systemd-resolved[1411]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 4 17:51:20.703048 containerd[1564]: time="2025-09-04T17:51:20.703007415Z" level=info msg="StartContainer for \"d3482b3986475617542775221395b6bba81db38f1f437f5574b6368b47772ba8\" returns successfully" Sep 4 17:51:20.730093 containerd[1564]: time="2025-09-04T17:51:20.730051028Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-7f6s9,Uid:79f9dffd-1bb2-43ab-ba96-6a7b6baab522,Namespace:calico-system,Attempt:0,} returns sandbox id \"8d8861bc1c2c3cbf14b8652e154a9e62c2942393d9238286d58b904092a89e3c\"" Sep 4 17:51:20.816343 containerd[1564]: time="2025-09-04T17:51:20.816215808Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 17:51:20.817264 containerd[1564]: time="2025-09-04T17:51:20.817235254Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.3: active requests=0, bytes read=4661291" Sep 4 17:51:20.818596 containerd[1564]: time="2025-09-04T17:51:20.818565295Z" level=info msg="ImageCreate event name:\"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 17:51:20.820525 containerd[1564]: time="2025-09-04T17:51:20.820475295Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 17:51:20.821024 containerd[1564]: time="2025-09-04T17:51:20.820996845Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.30.3\" with image id \"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\", size \"6153986\" in 1.88304982s" Sep 4 17:51:20.821056 containerd[1564]: time="2025-09-04T17:51:20.821025238Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\" returns image reference \"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\"" Sep 4 17:51:20.822023 containerd[1564]: time="2025-09-04T17:51:20.821940349Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 4 17:51:20.822967 containerd[1564]: time="2025-09-04T17:51:20.822941951Z" level=info msg="CreateContainer within sandbox \"5eb3b6f274cebac775de07fb077367816449cfaf3d9f3e9eacac37fd720d36b6\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Sep 4 17:51:20.830366 containerd[1564]: time="2025-09-04T17:51:20.830336911Z" level=info msg="Container 867737e660dfcd3e6e166889658b2873e93602c8446ce01f7d9eb009bf5ccd46: CDI devices from CRI 
Config.CDIDevices: []" Sep 4 17:51:20.837158 containerd[1564]: time="2025-09-04T17:51:20.837135991Z" level=info msg="CreateContainer within sandbox \"5eb3b6f274cebac775de07fb077367816449cfaf3d9f3e9eacac37fd720d36b6\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"867737e660dfcd3e6e166889658b2873e93602c8446ce01f7d9eb009bf5ccd46\"" Sep 4 17:51:20.837519 containerd[1564]: time="2025-09-04T17:51:20.837489064Z" level=info msg="StartContainer for \"867737e660dfcd3e6e166889658b2873e93602c8446ce01f7d9eb009bf5ccd46\"" Sep 4 17:51:20.838423 containerd[1564]: time="2025-09-04T17:51:20.838403103Z" level=info msg="connecting to shim 867737e660dfcd3e6e166889658b2873e93602c8446ce01f7d9eb009bf5ccd46" address="unix:///run/containerd/s/81a80aea8cc437347d2ae9a0c20e2731bfafcb35c29e9ecadab326960d7f8bc5" protocol=ttrpc version=3 Sep 4 17:51:20.859261 systemd[1]: Started cri-containerd-867737e660dfcd3e6e166889658b2873e93602c8446ce01f7d9eb009bf5ccd46.scope - libcontainer container 867737e660dfcd3e6e166889658b2873e93602c8446ce01f7d9eb009bf5ccd46. Sep 4 17:51:20.903967 containerd[1564]: time="2025-09-04T17:51:20.903929991Z" level=info msg="StartContainer for \"867737e660dfcd3e6e166889658b2873e93602c8446ce01f7d9eb009bf5ccd46\" returns successfully" Sep 4 17:51:21.315807 systemd-networkd[1492]: calia37329672df: Gained IPv6LL Sep 4 17:51:21.388073 kubelet[2704]: I0904 17:51:21.388001 2704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7c65d6cfc9-7bbsj" podStartSLOduration=38.386815937 podStartE2EDuration="38.386815937s" podCreationTimestamp="2025-09-04 17:50:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-04 17:51:21.386440131 +0000 UTC m=+44.243603988" watchObservedRunningTime="2025-09-04 17:51:21.386815937 +0000 UTC m=+44.243979794" Sep 4 17:51:21.635321 systemd-networkd[1492]: cali071b7c2c6f9: Gained IPv6LL Sep 4 17:51:21.700292 systemd-networkd[1492]: cali91efb56b8f1: Gained IPv6LL Sep 4 17:51:21.891289 systemd-networkd[1492]: cali59d2159daf1: Gained IPv6LL Sep 4 17:51:22.241555 containerd[1564]: time="2025-09-04T17:51:22.241381272Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-lxxpj,Uid:a9d9ac52-b01b-4920-9bfe-f3cca3aab3b1,Namespace:kube-system,Attempt:0,}" Sep 4 17:51:22.241555 containerd[1564]: time="2025-09-04T17:51:22.241483965Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5cfb9fdb56-hqdqm,Uid:76446953-bc18-4027-94fd-31764ac45951,Namespace:calico-system,Attempt:0,}" Sep 4 17:51:22.242084 containerd[1564]: time="2025-09-04T17:51:22.241391541Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5bf46546c9-6pt6w,Uid:500d5572-b816-409e-8956-3b59c87319da,Namespace:calico-apiserver,Attempt:0,}" Sep 4 17:51:22.446607 systemd-networkd[1492]: calic7f4fd7e9c2: Link UP Sep 4 17:51:22.449335 systemd-networkd[1492]: calic7f4fd7e9c2: Gained carrier Sep 4 17:51:22.479168 containerd[1564]: 2025-09-04 17:51:22.298 [INFO][4575] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--kube--controllers--5cfb9fdb56--hqdqm-eth0 calico-kube-controllers-5cfb9fdb56- calico-system 76446953-bc18-4027-94fd-31764ac45951 796 0 2025-09-04 17:50:54 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:5cfb9fdb56 projectcalico.org/namespace:calico-system 
projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s localhost calico-kube-controllers-5cfb9fdb56-hqdqm eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] calic7f4fd7e9c2 [] [] }} ContainerID="cc2db3119a36a79b6f838ff137aef9cc52de942210c2d1fafb0b89d42e28a6ce" Namespace="calico-system" Pod="calico-kube-controllers-5cfb9fdb56-hqdqm" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--5cfb9fdb56--hqdqm-" Sep 4 17:51:22.479168 containerd[1564]: 2025-09-04 17:51:22.298 [INFO][4575] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="cc2db3119a36a79b6f838ff137aef9cc52de942210c2d1fafb0b89d42e28a6ce" Namespace="calico-system" Pod="calico-kube-controllers-5cfb9fdb56-hqdqm" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--5cfb9fdb56--hqdqm-eth0" Sep 4 17:51:22.479168 containerd[1564]: 2025-09-04 17:51:22.340 [INFO][4615] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="cc2db3119a36a79b6f838ff137aef9cc52de942210c2d1fafb0b89d42e28a6ce" HandleID="k8s-pod-network.cc2db3119a36a79b6f838ff137aef9cc52de942210c2d1fafb0b89d42e28a6ce" Workload="localhost-k8s-calico--kube--controllers--5cfb9fdb56--hqdqm-eth0" Sep 4 17:51:22.479168 containerd[1564]: 2025-09-04 17:51:22.340 [INFO][4615] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="cc2db3119a36a79b6f838ff137aef9cc52de942210c2d1fafb0b89d42e28a6ce" HandleID="k8s-pod-network.cc2db3119a36a79b6f838ff137aef9cc52de942210c2d1fafb0b89d42e28a6ce" Workload="localhost-k8s-calico--kube--controllers--5cfb9fdb56--hqdqm-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00018e9e0), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"calico-kube-controllers-5cfb9fdb56-hqdqm", "timestamp":"2025-09-04 17:51:22.340740837 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 4 17:51:22.479168 containerd[1564]: 2025-09-04 17:51:22.340 [INFO][4615] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 4 17:51:22.479168 containerd[1564]: 2025-09-04 17:51:22.340 [INFO][4615] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 4 17:51:22.479168 containerd[1564]: 2025-09-04 17:51:22.341 [INFO][4615] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 4 17:51:22.479168 containerd[1564]: 2025-09-04 17:51:22.349 [INFO][4615] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.cc2db3119a36a79b6f838ff137aef9cc52de942210c2d1fafb0b89d42e28a6ce" host="localhost" Sep 4 17:51:22.479168 containerd[1564]: 2025-09-04 17:51:22.357 [INFO][4615] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 4 17:51:22.479168 containerd[1564]: 2025-09-04 17:51:22.363 [INFO][4615] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 4 17:51:22.479168 containerd[1564]: 2025-09-04 17:51:22.366 [INFO][4615] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 4 17:51:22.479168 containerd[1564]: 2025-09-04 17:51:22.372 [INFO][4615] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 4 17:51:22.479168 containerd[1564]: 2025-09-04 17:51:22.372 [INFO][4615] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.cc2db3119a36a79b6f838ff137aef9cc52de942210c2d1fafb0b89d42e28a6ce" host="localhost" Sep 4 17:51:22.479168 containerd[1564]: 2025-09-04 17:51:22.373 [INFO][4615] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.cc2db3119a36a79b6f838ff137aef9cc52de942210c2d1fafb0b89d42e28a6ce Sep 4 17:51:22.479168 containerd[1564]: 2025-09-04 17:51:22.379 [INFO][4615] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.cc2db3119a36a79b6f838ff137aef9cc52de942210c2d1fafb0b89d42e28a6ce" host="localhost" Sep 4 17:51:22.479168 containerd[1564]: 2025-09-04 17:51:22.405 [INFO][4615] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.134/26] block=192.168.88.128/26 handle="k8s-pod-network.cc2db3119a36a79b6f838ff137aef9cc52de942210c2d1fafb0b89d42e28a6ce" host="localhost" Sep 4 17:51:22.479168 containerd[1564]: 2025-09-04 17:51:22.408 [INFO][4615] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.134/26] handle="k8s-pod-network.cc2db3119a36a79b6f838ff137aef9cc52de942210c2d1fafb0b89d42e28a6ce" host="localhost" Sep 4 17:51:22.479168 containerd[1564]: 2025-09-04 17:51:22.408 [INFO][4615] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
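Interleaved with the sandbox setup, the earlier entries also show an image pull completing: ghcr.io/flatcar/calico/whisker:v3.30.3 arrived in about 1.88 s and resolved to the image id and repo digest printed there. A sketch of the equivalent pull with containerd's Go client, in the k8s.io namespace these log lines use (kubelet actually drives pulls through the CRI ImageService, and the client module path depends on the containerd version, so this is illustrative only):

package main

import (
	"context"
	"fmt"
	"log"

	"github.com/containerd/containerd"
	"github.com/containerd/containerd/namespaces"
)

func main() {
	client, err := containerd.New("/run/containerd/containerd.sock")
	if err != nil {
		log.Fatal(err)
	}
	defer client.Close()

	// All CRI-managed resources in this log live in the "k8s.io" namespace.
	ctx := namespaces.WithNamespace(context.Background(), "k8s.io")

	img, err := client.Pull(ctx, "ghcr.io/flatcar/calico/whisker:v3.30.3",
		containerd.WithPullUnpack)
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println("pulled", img.Name(), img.Target().Digest)
}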
Sep 4 17:51:22.479168 containerd[1564]: 2025-09-04 17:51:22.408 [INFO][4615] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.134/26] IPv6=[] ContainerID="cc2db3119a36a79b6f838ff137aef9cc52de942210c2d1fafb0b89d42e28a6ce" HandleID="k8s-pod-network.cc2db3119a36a79b6f838ff137aef9cc52de942210c2d1fafb0b89d42e28a6ce" Workload="localhost-k8s-calico--kube--controllers--5cfb9fdb56--hqdqm-eth0" Sep 4 17:51:22.479763 containerd[1564]: 2025-09-04 17:51:22.429 [INFO][4575] cni-plugin/k8s.go 418: Populated endpoint ContainerID="cc2db3119a36a79b6f838ff137aef9cc52de942210c2d1fafb0b89d42e28a6ce" Namespace="calico-system" Pod="calico-kube-controllers-5cfb9fdb56-hqdqm" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--5cfb9fdb56--hqdqm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--5cfb9fdb56--hqdqm-eth0", GenerateName:"calico-kube-controllers-5cfb9fdb56-", Namespace:"calico-system", SelfLink:"", UID:"76446953-bc18-4027-94fd-31764ac45951", ResourceVersion:"796", Generation:0, CreationTimestamp:time.Date(2025, time.September, 4, 17, 50, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"5cfb9fdb56", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-kube-controllers-5cfb9fdb56-hqdqm", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calic7f4fd7e9c2", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 4 17:51:22.479763 containerd[1564]: 2025-09-04 17:51:22.429 [INFO][4575] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.134/32] ContainerID="cc2db3119a36a79b6f838ff137aef9cc52de942210c2d1fafb0b89d42e28a6ce" Namespace="calico-system" Pod="calico-kube-controllers-5cfb9fdb56-hqdqm" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--5cfb9fdb56--hqdqm-eth0" Sep 4 17:51:22.479763 containerd[1564]: 2025-09-04 17:51:22.429 [INFO][4575] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calic7f4fd7e9c2 ContainerID="cc2db3119a36a79b6f838ff137aef9cc52de942210c2d1fafb0b89d42e28a6ce" Namespace="calico-system" Pod="calico-kube-controllers-5cfb9fdb56-hqdqm" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--5cfb9fdb56--hqdqm-eth0" Sep 4 17:51:22.479763 containerd[1564]: 2025-09-04 17:51:22.450 [INFO][4575] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="cc2db3119a36a79b6f838ff137aef9cc52de942210c2d1fafb0b89d42e28a6ce" Namespace="calico-system" Pod="calico-kube-controllers-5cfb9fdb56-hqdqm" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--5cfb9fdb56--hqdqm-eth0" Sep 4 17:51:22.479763 containerd[1564]: 2025-09-04 17:51:22.452 [INFO][4575] cni-plugin/k8s.go 446: Added Mac, 
interface name, and active container ID to endpoint ContainerID="cc2db3119a36a79b6f838ff137aef9cc52de942210c2d1fafb0b89d42e28a6ce" Namespace="calico-system" Pod="calico-kube-controllers-5cfb9fdb56-hqdqm" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--5cfb9fdb56--hqdqm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--5cfb9fdb56--hqdqm-eth0", GenerateName:"calico-kube-controllers-5cfb9fdb56-", Namespace:"calico-system", SelfLink:"", UID:"76446953-bc18-4027-94fd-31764ac45951", ResourceVersion:"796", Generation:0, CreationTimestamp:time.Date(2025, time.September, 4, 17, 50, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"5cfb9fdb56", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"cc2db3119a36a79b6f838ff137aef9cc52de942210c2d1fafb0b89d42e28a6ce", Pod:"calico-kube-controllers-5cfb9fdb56-hqdqm", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calic7f4fd7e9c2", MAC:"ca:46:0e:64:85:18", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 4 17:51:22.479763 containerd[1564]: 2025-09-04 17:51:22.469 [INFO][4575] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="cc2db3119a36a79b6f838ff137aef9cc52de942210c2d1fafb0b89d42e28a6ce" Namespace="calico-system" Pod="calico-kube-controllers-5cfb9fdb56-hqdqm" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--5cfb9fdb56--hqdqm-eth0" Sep 4 17:51:22.518374 containerd[1564]: time="2025-09-04T17:51:22.518171762Z" level=info msg="connecting to shim cc2db3119a36a79b6f838ff137aef9cc52de942210c2d1fafb0b89d42e28a6ce" address="unix:///run/containerd/s/73eda9a19a9e56dfca64f9a77e4f920d0bbea593b8bb60ded6add4186547109d" namespace=k8s.io protocol=ttrpc version=3 Sep 4 17:51:22.526170 systemd-networkd[1492]: caliddd0c66d4d1: Link UP Sep 4 17:51:22.527513 systemd-networkd[1492]: caliddd0c66d4d1: Gained carrier Sep 4 17:51:22.551218 containerd[1564]: 2025-09-04 17:51:22.357 [INFO][4574] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--7c65d6cfc9--lxxpj-eth0 coredns-7c65d6cfc9- kube-system a9d9ac52-b01b-4920-9bfe-f3cca3aab3b1 788 0 2025-09-04 17:50:43 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7c65d6cfc9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-7c65d6cfc9-lxxpj eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] caliddd0c66d4d1 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="839b64bb2ecc51634cfe63b447c98d538b931b14abe3288aecc5e04fa132a00b" Namespace="kube-system" 
Pod="coredns-7c65d6cfc9-lxxpj" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--lxxpj-" Sep 4 17:51:22.551218 containerd[1564]: 2025-09-04 17:51:22.358 [INFO][4574] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="839b64bb2ecc51634cfe63b447c98d538b931b14abe3288aecc5e04fa132a00b" Namespace="kube-system" Pod="coredns-7c65d6cfc9-lxxpj" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--lxxpj-eth0" Sep 4 17:51:22.551218 containerd[1564]: 2025-09-04 17:51:22.410 [INFO][4629] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="839b64bb2ecc51634cfe63b447c98d538b931b14abe3288aecc5e04fa132a00b" HandleID="k8s-pod-network.839b64bb2ecc51634cfe63b447c98d538b931b14abe3288aecc5e04fa132a00b" Workload="localhost-k8s-coredns--7c65d6cfc9--lxxpj-eth0" Sep 4 17:51:22.551218 containerd[1564]: 2025-09-04 17:51:22.410 [INFO][4629] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="839b64bb2ecc51634cfe63b447c98d538b931b14abe3288aecc5e04fa132a00b" HandleID="k8s-pod-network.839b64bb2ecc51634cfe63b447c98d538b931b14abe3288aecc5e04fa132a00b" Workload="localhost-k8s-coredns--7c65d6cfc9--lxxpj-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0001396e0), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-7c65d6cfc9-lxxpj", "timestamp":"2025-09-04 17:51:22.410710442 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 4 17:51:22.551218 containerd[1564]: 2025-09-04 17:51:22.410 [INFO][4629] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 4 17:51:22.551218 containerd[1564]: 2025-09-04 17:51:22.411 [INFO][4629] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 4 17:51:22.551218 containerd[1564]: 2025-09-04 17:51:22.411 [INFO][4629] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 4 17:51:22.551218 containerd[1564]: 2025-09-04 17:51:22.453 [INFO][4629] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.839b64bb2ecc51634cfe63b447c98d538b931b14abe3288aecc5e04fa132a00b" host="localhost" Sep 4 17:51:22.551218 containerd[1564]: 2025-09-04 17:51:22.474 [INFO][4629] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 4 17:51:22.551218 containerd[1564]: 2025-09-04 17:51:22.486 [INFO][4629] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 4 17:51:22.551218 containerd[1564]: 2025-09-04 17:51:22.488 [INFO][4629] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 4 17:51:22.551218 containerd[1564]: 2025-09-04 17:51:22.492 [INFO][4629] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 4 17:51:22.551218 containerd[1564]: 2025-09-04 17:51:22.493 [INFO][4629] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.839b64bb2ecc51634cfe63b447c98d538b931b14abe3288aecc5e04fa132a00b" host="localhost" Sep 4 17:51:22.551218 containerd[1564]: 2025-09-04 17:51:22.497 [INFO][4629] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.839b64bb2ecc51634cfe63b447c98d538b931b14abe3288aecc5e04fa132a00b Sep 4 17:51:22.551218 containerd[1564]: 2025-09-04 17:51:22.502 [INFO][4629] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.839b64bb2ecc51634cfe63b447c98d538b931b14abe3288aecc5e04fa132a00b" host="localhost" Sep 4 17:51:22.551218 containerd[1564]: 2025-09-04 17:51:22.511 [INFO][4629] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.135/26] block=192.168.88.128/26 handle="k8s-pod-network.839b64bb2ecc51634cfe63b447c98d538b931b14abe3288aecc5e04fa132a00b" host="localhost" Sep 4 17:51:22.551218 containerd[1564]: 2025-09-04 17:51:22.511 [INFO][4629] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.135/26] handle="k8s-pod-network.839b64bb2ecc51634cfe63b447c98d538b931b14abe3288aecc5e04fa132a00b" host="localhost" Sep 4 17:51:22.551218 containerd[1564]: 2025-09-04 17:51:22.511 [INFO][4629] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 4 17:51:22.551218 containerd[1564]: 2025-09-04 17:51:22.511 [INFO][4629] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.135/26] IPv6=[] ContainerID="839b64bb2ecc51634cfe63b447c98d538b931b14abe3288aecc5e04fa132a00b" HandleID="k8s-pod-network.839b64bb2ecc51634cfe63b447c98d538b931b14abe3288aecc5e04fa132a00b" Workload="localhost-k8s-coredns--7c65d6cfc9--lxxpj-eth0" Sep 4 17:51:22.551942 containerd[1564]: 2025-09-04 17:51:22.516 [INFO][4574] cni-plugin/k8s.go 418: Populated endpoint ContainerID="839b64bb2ecc51634cfe63b447c98d538b931b14abe3288aecc5e04fa132a00b" Namespace="kube-system" Pod="coredns-7c65d6cfc9-lxxpj" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--lxxpj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--7c65d6cfc9--lxxpj-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"a9d9ac52-b01b-4920-9bfe-f3cca3aab3b1", ResourceVersion:"788", Generation:0, CreationTimestamp:time.Date(2025, time.September, 4, 17, 50, 43, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-7c65d6cfc9-lxxpj", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"caliddd0c66d4d1", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 4 17:51:22.551942 containerd[1564]: 2025-09-04 17:51:22.516 [INFO][4574] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.135/32] ContainerID="839b64bb2ecc51634cfe63b447c98d538b931b14abe3288aecc5e04fa132a00b" Namespace="kube-system" Pod="coredns-7c65d6cfc9-lxxpj" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--lxxpj-eth0" Sep 4 17:51:22.551942 containerd[1564]: 2025-09-04 17:51:22.516 [INFO][4574] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to caliddd0c66d4d1 ContainerID="839b64bb2ecc51634cfe63b447c98d538b931b14abe3288aecc5e04fa132a00b" Namespace="kube-system" Pod="coredns-7c65d6cfc9-lxxpj" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--lxxpj-eth0" Sep 4 17:51:22.551942 containerd[1564]: 2025-09-04 17:51:22.530 [INFO][4574] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="839b64bb2ecc51634cfe63b447c98d538b931b14abe3288aecc5e04fa132a00b" Namespace="kube-system" Pod="coredns-7c65d6cfc9-lxxpj" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--lxxpj-eth0" Sep 4 17:51:22.551942 
containerd[1564]: 2025-09-04 17:51:22.531 [INFO][4574] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="839b64bb2ecc51634cfe63b447c98d538b931b14abe3288aecc5e04fa132a00b" Namespace="kube-system" Pod="coredns-7c65d6cfc9-lxxpj" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--lxxpj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--7c65d6cfc9--lxxpj-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"a9d9ac52-b01b-4920-9bfe-f3cca3aab3b1", ResourceVersion:"788", Generation:0, CreationTimestamp:time.Date(2025, time.September, 4, 17, 50, 43, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"839b64bb2ecc51634cfe63b447c98d538b931b14abe3288aecc5e04fa132a00b", Pod:"coredns-7c65d6cfc9-lxxpj", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"caliddd0c66d4d1", MAC:"ee:7b:a1:cb:62:3e", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 4 17:51:22.551942 containerd[1564]: 2025-09-04 17:51:22.545 [INFO][4574] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="839b64bb2ecc51634cfe63b447c98d538b931b14abe3288aecc5e04fa132a00b" Namespace="kube-system" Pod="coredns-7c65d6cfc9-lxxpj" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--lxxpj-eth0" Sep 4 17:51:22.566684 systemd[1]: Started cri-containerd-cc2db3119a36a79b6f838ff137aef9cc52de942210c2d1fafb0b89d42e28a6ce.scope - libcontainer container cc2db3119a36a79b6f838ff137aef9cc52de942210c2d1fafb0b89d42e28a6ce. 
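Annotation (not part of the original log): the WorkloadEndpoint dumps above record the CoreDNS container ports in hexadecimal (Port:0x35 for dns and dns-tcp, Port:0x23c1 for metrics). A trivial Go check confirms these are the familiar CoreDNS ports 53 and 9153:

```go
package main

import "fmt"

func main() {
	// Values copied from the WorkloadEndpointPort entries in the log above.
	fmt.Println(0x35)   // 53   -> dns (UDP) and dns-tcp (TCP)
	fmt.Println(0x23c1) // 9153 -> metrics (TCP)
}
```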
Sep 4 17:51:22.589038 containerd[1564]: time="2025-09-04T17:51:22.588982278Z" level=info msg="connecting to shim 839b64bb2ecc51634cfe63b447c98d538b931b14abe3288aecc5e04fa132a00b" address="unix:///run/containerd/s/18f8c6c570f2d7eda61bdc243f16863d1dd33c0624b6f1d20ce9ff1f6de48b32" namespace=k8s.io protocol=ttrpc version=3 Sep 4 17:51:22.601399 systemd-resolved[1411]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 4 17:51:22.619409 systemd-networkd[1492]: calia69e2bf0427: Link UP Sep 4 17:51:22.620274 systemd-networkd[1492]: calia69e2bf0427: Gained carrier Sep 4 17:51:22.625360 systemd[1]: Started cri-containerd-839b64bb2ecc51634cfe63b447c98d538b931b14abe3288aecc5e04fa132a00b.scope - libcontainer container 839b64bb2ecc51634cfe63b447c98d538b931b14abe3288aecc5e04fa132a00b. Sep 4 17:51:22.639120 containerd[1564]: 2025-09-04 17:51:22.351 [INFO][4594] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--5bf46546c9--6pt6w-eth0 calico-apiserver-5bf46546c9- calico-apiserver 500d5572-b816-409e-8956-3b59c87319da 797 0 2025-09-04 17:50:51 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:5bf46546c9 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-5bf46546c9-6pt6w eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calia69e2bf0427 [] [] }} ContainerID="57ddf233580b1046cdba0348fdabaf591ccad941b47632d70b4b571fda435938" Namespace="calico-apiserver" Pod="calico-apiserver-5bf46546c9-6pt6w" WorkloadEndpoint="localhost-k8s-calico--apiserver--5bf46546c9--6pt6w-" Sep 4 17:51:22.639120 containerd[1564]: 2025-09-04 17:51:22.352 [INFO][4594] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="57ddf233580b1046cdba0348fdabaf591ccad941b47632d70b4b571fda435938" Namespace="calico-apiserver" Pod="calico-apiserver-5bf46546c9-6pt6w" WorkloadEndpoint="localhost-k8s-calico--apiserver--5bf46546c9--6pt6w-eth0" Sep 4 17:51:22.639120 containerd[1564]: 2025-09-04 17:51:22.424 [INFO][4630] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="57ddf233580b1046cdba0348fdabaf591ccad941b47632d70b4b571fda435938" HandleID="k8s-pod-network.57ddf233580b1046cdba0348fdabaf591ccad941b47632d70b4b571fda435938" Workload="localhost-k8s-calico--apiserver--5bf46546c9--6pt6w-eth0" Sep 4 17:51:22.639120 containerd[1564]: 2025-09-04 17:51:22.424 [INFO][4630] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="57ddf233580b1046cdba0348fdabaf591ccad941b47632d70b4b571fda435938" HandleID="k8s-pod-network.57ddf233580b1046cdba0348fdabaf591ccad941b47632d70b4b571fda435938" Workload="localhost-k8s-calico--apiserver--5bf46546c9--6pt6w-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002c66e0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-5bf46546c9-6pt6w", "timestamp":"2025-09-04 17:51:22.424756668 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 4 17:51:22.639120 containerd[1564]: 2025-09-04 17:51:22.424 [INFO][4630] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. 
Sep 4 17:51:22.639120 containerd[1564]: 2025-09-04 17:51:22.511 [INFO][4630] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 4 17:51:22.639120 containerd[1564]: 2025-09-04 17:51:22.512 [INFO][4630] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Sep 4 17:51:22.639120 containerd[1564]: 2025-09-04 17:51:22.553 [INFO][4630] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.57ddf233580b1046cdba0348fdabaf591ccad941b47632d70b4b571fda435938" host="localhost" Sep 4 17:51:22.639120 containerd[1564]: 2025-09-04 17:51:22.576 [INFO][4630] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Sep 4 17:51:22.639120 containerd[1564]: 2025-09-04 17:51:22.585 [INFO][4630] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Sep 4 17:51:22.639120 containerd[1564]: 2025-09-04 17:51:22.589 [INFO][4630] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Sep 4 17:51:22.639120 containerd[1564]: 2025-09-04 17:51:22.593 [INFO][4630] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Sep 4 17:51:22.639120 containerd[1564]: 2025-09-04 17:51:22.593 [INFO][4630] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.57ddf233580b1046cdba0348fdabaf591ccad941b47632d70b4b571fda435938" host="localhost" Sep 4 17:51:22.639120 containerd[1564]: 2025-09-04 17:51:22.597 [INFO][4630] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.57ddf233580b1046cdba0348fdabaf591ccad941b47632d70b4b571fda435938 Sep 4 17:51:22.639120 containerd[1564]: 2025-09-04 17:51:22.602 [INFO][4630] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.57ddf233580b1046cdba0348fdabaf591ccad941b47632d70b4b571fda435938" host="localhost" Sep 4 17:51:22.639120 containerd[1564]: 2025-09-04 17:51:22.611 [INFO][4630] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.136/26] block=192.168.88.128/26 handle="k8s-pod-network.57ddf233580b1046cdba0348fdabaf591ccad941b47632d70b4b571fda435938" host="localhost" Sep 4 17:51:22.639120 containerd[1564]: 2025-09-04 17:51:22.611 [INFO][4630] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.136/26] handle="k8s-pod-network.57ddf233580b1046cdba0348fdabaf591ccad941b47632d70b4b571fda435938" host="localhost" Sep 4 17:51:22.639120 containerd[1564]: 2025-09-04 17:51:22.611 [INFO][4630] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 4 17:51:22.639120 containerd[1564]: 2025-09-04 17:51:22.611 [INFO][4630] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.136/26] IPv6=[] ContainerID="57ddf233580b1046cdba0348fdabaf591ccad941b47632d70b4b571fda435938" HandleID="k8s-pod-network.57ddf233580b1046cdba0348fdabaf591ccad941b47632d70b4b571fda435938" Workload="localhost-k8s-calico--apiserver--5bf46546c9--6pt6w-eth0" Sep 4 17:51:22.639887 containerd[1564]: 2025-09-04 17:51:22.616 [INFO][4594] cni-plugin/k8s.go 418: Populated endpoint ContainerID="57ddf233580b1046cdba0348fdabaf591ccad941b47632d70b4b571fda435938" Namespace="calico-apiserver" Pod="calico-apiserver-5bf46546c9-6pt6w" WorkloadEndpoint="localhost-k8s-calico--apiserver--5bf46546c9--6pt6w-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--5bf46546c9--6pt6w-eth0", GenerateName:"calico-apiserver-5bf46546c9-", Namespace:"calico-apiserver", SelfLink:"", UID:"500d5572-b816-409e-8956-3b59c87319da", ResourceVersion:"797", Generation:0, CreationTimestamp:time.Date(2025, time.September, 4, 17, 50, 51, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5bf46546c9", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-5bf46546c9-6pt6w", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calia69e2bf0427", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 4 17:51:22.639887 containerd[1564]: 2025-09-04 17:51:22.616 [INFO][4594] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.136/32] ContainerID="57ddf233580b1046cdba0348fdabaf591ccad941b47632d70b4b571fda435938" Namespace="calico-apiserver" Pod="calico-apiserver-5bf46546c9-6pt6w" WorkloadEndpoint="localhost-k8s-calico--apiserver--5bf46546c9--6pt6w-eth0" Sep 4 17:51:22.639887 containerd[1564]: 2025-09-04 17:51:22.616 [INFO][4594] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calia69e2bf0427 ContainerID="57ddf233580b1046cdba0348fdabaf591ccad941b47632d70b4b571fda435938" Namespace="calico-apiserver" Pod="calico-apiserver-5bf46546c9-6pt6w" WorkloadEndpoint="localhost-k8s-calico--apiserver--5bf46546c9--6pt6w-eth0" Sep 4 17:51:22.639887 containerd[1564]: 2025-09-04 17:51:22.621 [INFO][4594] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="57ddf233580b1046cdba0348fdabaf591ccad941b47632d70b4b571fda435938" Namespace="calico-apiserver" Pod="calico-apiserver-5bf46546c9-6pt6w" WorkloadEndpoint="localhost-k8s-calico--apiserver--5bf46546c9--6pt6w-eth0" Sep 4 17:51:22.639887 containerd[1564]: 2025-09-04 17:51:22.622 [INFO][4594] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint 
ContainerID="57ddf233580b1046cdba0348fdabaf591ccad941b47632d70b4b571fda435938" Namespace="calico-apiserver" Pod="calico-apiserver-5bf46546c9-6pt6w" WorkloadEndpoint="localhost-k8s-calico--apiserver--5bf46546c9--6pt6w-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--5bf46546c9--6pt6w-eth0", GenerateName:"calico-apiserver-5bf46546c9-", Namespace:"calico-apiserver", SelfLink:"", UID:"500d5572-b816-409e-8956-3b59c87319da", ResourceVersion:"797", Generation:0, CreationTimestamp:time.Date(2025, time.September, 4, 17, 50, 51, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5bf46546c9", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"57ddf233580b1046cdba0348fdabaf591ccad941b47632d70b4b571fda435938", Pod:"calico-apiserver-5bf46546c9-6pt6w", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calia69e2bf0427", MAC:"5a:31:dc:b8:7f:ce", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 4 17:51:22.639887 containerd[1564]: 2025-09-04 17:51:22.632 [INFO][4594] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="57ddf233580b1046cdba0348fdabaf591ccad941b47632d70b4b571fda435938" Namespace="calico-apiserver" Pod="calico-apiserver-5bf46546c9-6pt6w" WorkloadEndpoint="localhost-k8s-calico--apiserver--5bf46546c9--6pt6w-eth0" Sep 4 17:51:22.649845 systemd-resolved[1411]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 4 17:51:22.691541 containerd[1564]: time="2025-09-04T17:51:22.691475491Z" level=info msg="connecting to shim 57ddf233580b1046cdba0348fdabaf591ccad941b47632d70b4b571fda435938" address="unix:///run/containerd/s/57fb507c7484a575fe4579e3a262ceb6bec49da138aac0aa8c0fcbaa22813fc6" namespace=k8s.io protocol=ttrpc version=3 Sep 4 17:51:22.699027 containerd[1564]: time="2025-09-04T17:51:22.698964955Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5cfb9fdb56-hqdqm,Uid:76446953-bc18-4027-94fd-31764ac45951,Namespace:calico-system,Attempt:0,} returns sandbox id \"cc2db3119a36a79b6f838ff137aef9cc52de942210c2d1fafb0b89d42e28a6ce\"" Sep 4 17:51:22.700547 containerd[1564]: time="2025-09-04T17:51:22.700499378Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-lxxpj,Uid:a9d9ac52-b01b-4920-9bfe-f3cca3aab3b1,Namespace:kube-system,Attempt:0,} returns sandbox id \"839b64bb2ecc51634cfe63b447c98d538b931b14abe3288aecc5e04fa132a00b\"" Sep 4 17:51:22.705487 containerd[1564]: time="2025-09-04T17:51:22.705431497Z" level=info msg="CreateContainer within sandbox \"839b64bb2ecc51634cfe63b447c98d538b931b14abe3288aecc5e04fa132a00b\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 4 
17:51:22.733635 systemd[1]: Started cri-containerd-57ddf233580b1046cdba0348fdabaf591ccad941b47632d70b4b571fda435938.scope - libcontainer container 57ddf233580b1046cdba0348fdabaf591ccad941b47632d70b4b571fda435938. Sep 4 17:51:22.749567 systemd-resolved[1411]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Sep 4 17:51:22.844691 containerd[1564]: time="2025-09-04T17:51:22.844635787Z" level=info msg="Container d1d7316525afae8ffcaf81e128412875fff2ff4ed4866bfb9e5013b1da164ec6: CDI devices from CRI Config.CDIDevices: []" Sep 4 17:51:23.003221 containerd[1564]: time="2025-09-04T17:51:23.003167677Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5bf46546c9-6pt6w,Uid:500d5572-b816-409e-8956-3b59c87319da,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"57ddf233580b1046cdba0348fdabaf591ccad941b47632d70b4b571fda435938\"" Sep 4 17:51:23.016663 containerd[1564]: time="2025-09-04T17:51:23.016583866Z" level=info msg="CreateContainer within sandbox \"839b64bb2ecc51634cfe63b447c98d538b931b14abe3288aecc5e04fa132a00b\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"d1d7316525afae8ffcaf81e128412875fff2ff4ed4866bfb9e5013b1da164ec6\"" Sep 4 17:51:23.017561 containerd[1564]: time="2025-09-04T17:51:23.017519063Z" level=info msg="StartContainer for \"d1d7316525afae8ffcaf81e128412875fff2ff4ed4866bfb9e5013b1da164ec6\"" Sep 4 17:51:23.018885 containerd[1564]: time="2025-09-04T17:51:23.018785642Z" level=info msg="connecting to shim d1d7316525afae8ffcaf81e128412875fff2ff4ed4866bfb9e5013b1da164ec6" address="unix:///run/containerd/s/18f8c6c570f2d7eda61bdc243f16863d1dd33c0624b6f1d20ce9ff1f6de48b32" protocol=ttrpc version=3 Sep 4 17:51:23.049363 systemd[1]: Started cri-containerd-d1d7316525afae8ffcaf81e128412875fff2ff4ed4866bfb9e5013b1da164ec6.scope - libcontainer container d1d7316525afae8ffcaf81e128412875fff2ff4ed4866bfb9e5013b1da164ec6. 
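Annotation (not part of the original log): the ipam/ipam.go entries above trace the same block-affinity cycle three times, once per pod (calico-kube-controllers, coredns, calico-apiserver): take the host-wide IPAM lock, confirm this host's affinity for block 192.168.88.128/26, claim the next free address (.134, .135, .136 in turn), write the block back, release the lock. The Go sketch below is a toy illustration of the next-free-ordinal step only; it is not Calico's ipam.go, and the helper name is invented for this note.

```go
package main

import (
	"fmt"
	"net"
)

// nextFree is a hypothetical stand-in for the "Attempting to assign 1
// addresses from block" step seen in the log: walk the /26 block owned by
// this host and claim the first ordinal not already in use.
func nextFree(block *net.IPNet, used map[int]bool) (net.IP, bool) {
	base := block.IP.To4()
	ones, bits := block.Mask.Size() // 26, 32 -> 64 ordinals in the block
	for ord := 0; ord < 1<<(bits-ones); ord++ {
		if used[ord] {
			continue
		}
		used[ord] = true // analogous to "Writing block in order to claim IPs"
		return net.IPv4(base[0], base[1], base[2], base[3]+byte(ord)), true
	}
	return nil, false
}

func main() {
	_, block, _ := net.ParseCIDR("192.168.88.128/26")
	used := map[int]bool{} // suppose .128-.134 (ordinals 0-6) are already assigned
	for i := 0; i <= 6; i++ {
		used[i] = true
	}
	ip, _ := nextFree(block, used)
	fmt.Println(ip) // 192.168.88.135, the address handed to coredns-7c65d6cfc9-lxxpj above
}
```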
Sep 4 17:51:23.106319 containerd[1564]: time="2025-09-04T17:51:23.105970299Z" level=info msg="StartContainer for \"d1d7316525afae8ffcaf81e128412875fff2ff4ed4866bfb9e5013b1da164ec6\" returns successfully" Sep 4 17:51:23.433864 kubelet[2704]: I0904 17:51:23.433411 2704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7c65d6cfc9-lxxpj" podStartSLOduration=40.433382816 podStartE2EDuration="40.433382816s" podCreationTimestamp="2025-09-04 17:50:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-04 17:51:23.419800626 +0000 UTC m=+46.276964493" watchObservedRunningTime="2025-09-04 17:51:23.433382816 +0000 UTC m=+46.290546683" Sep 4 17:51:23.555373 systemd-networkd[1492]: calic7f4fd7e9c2: Gained IPv6LL Sep 4 17:51:23.576803 containerd[1564]: time="2025-09-04T17:51:23.576744067Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 17:51:23.577659 containerd[1564]: time="2025-09-04T17:51:23.577614723Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=47333864" Sep 4 17:51:23.579328 containerd[1564]: time="2025-09-04T17:51:23.579260986Z" level=info msg="ImageCreate event name:\"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 17:51:23.581656 containerd[1564]: time="2025-09-04T17:51:23.581581677Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 17:51:23.582264 containerd[1564]: time="2025-09-04T17:51:23.582211690Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"48826583\" in 2.760238621s" Sep 4 17:51:23.582264 containerd[1564]: time="2025-09-04T17:51:23.582252006Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\"" Sep 4 17:51:23.583304 containerd[1564]: time="2025-09-04T17:51:23.583269848Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\"" Sep 4 17:51:23.585644 containerd[1564]: time="2025-09-04T17:51:23.585610807Z" level=info msg="CreateContainer within sandbox \"c5dd2fd3f4bd39331fd0226460d4be8f9bf5d892d4696c6c8e02433079911848\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 4 17:51:23.595294 containerd[1564]: time="2025-09-04T17:51:23.595210624Z" level=info msg="Container b445265864fa9cb009d118a490f8e9d75570c7e6f422f65acd72e02dcb69b79d: CDI devices from CRI Config.CDIDevices: []" Sep 4 17:51:23.603475 containerd[1564]: time="2025-09-04T17:51:23.603433074Z" level=info msg="CreateContainer within sandbox \"c5dd2fd3f4bd39331fd0226460d4be8f9bf5d892d4696c6c8e02433079911848\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"b445265864fa9cb009d118a490f8e9d75570c7e6f422f65acd72e02dcb69b79d\"" Sep 4 17:51:23.604044 containerd[1564]: time="2025-09-04T17:51:23.604019526Z" level=info 
msg="StartContainer for \"b445265864fa9cb009d118a490f8e9d75570c7e6f422f65acd72e02dcb69b79d\"" Sep 4 17:51:23.605034 containerd[1564]: time="2025-09-04T17:51:23.604994628Z" level=info msg="connecting to shim b445265864fa9cb009d118a490f8e9d75570c7e6f422f65acd72e02dcb69b79d" address="unix:///run/containerd/s/b3488823e21b132a4a23b7bf371876aff1b40f9e952809952426af2341ef93da" protocol=ttrpc version=3 Sep 4 17:51:23.644330 systemd[1]: Started cri-containerd-b445265864fa9cb009d118a490f8e9d75570c7e6f422f65acd72e02dcb69b79d.scope - libcontainer container b445265864fa9cb009d118a490f8e9d75570c7e6f422f65acd72e02dcb69b79d. Sep 4 17:51:23.806311 containerd[1564]: time="2025-09-04T17:51:23.806185068Z" level=info msg="StartContainer for \"b445265864fa9cb009d118a490f8e9d75570c7e6f422f65acd72e02dcb69b79d\" returns successfully" Sep 4 17:51:24.067367 systemd-networkd[1492]: calia69e2bf0427: Gained IPv6LL Sep 4 17:51:24.305648 systemd[1]: Started sshd@8-10.0.0.60:22-10.0.0.1:43732.service - OpenSSH per-connection server daemon (10.0.0.1:43732). Sep 4 17:51:24.388325 sshd[4904]: Accepted publickey for core from 10.0.0.1 port 43732 ssh2: RSA SHA256:OKBSeTvmyBtsjC7tfD9ZMfOXcYBSWTNVjZSy1tvLNgs Sep 4 17:51:24.390822 sshd-session[4904]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 17:51:24.395935 systemd-logind[1545]: New session 9 of user core. Sep 4 17:51:24.402590 systemd[1]: Started session-9.scope - Session 9 of User core. Sep 4 17:51:24.452330 systemd-networkd[1492]: caliddd0c66d4d1: Gained IPv6LL Sep 4 17:51:24.710088 sshd[4907]: Connection closed by 10.0.0.1 port 43732 Sep 4 17:51:24.710424 sshd-session[4904]: pam_unix(sshd:session): session closed for user core Sep 4 17:51:24.715316 systemd[1]: sshd@8-10.0.0.60:22-10.0.0.1:43732.service: Deactivated successfully. Sep 4 17:51:24.717553 systemd[1]: session-9.scope: Deactivated successfully. Sep 4 17:51:24.718493 systemd-logind[1545]: Session 9 logged out. Waiting for processes to exit. Sep 4 17:51:24.719850 systemd-logind[1545]: Removed session 9. 
Sep 4 17:51:24.785138 kubelet[2704]: I0904 17:51:24.785023 2704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-5bf46546c9-zlqdb" podStartSLOduration=29.686585536 podStartE2EDuration="33.785004756s" podCreationTimestamp="2025-09-04 17:50:51 +0000 UTC" firstStartedPulling="2025-09-04 17:51:19.484629182 +0000 UTC m=+42.341793039" lastFinishedPulling="2025-09-04 17:51:23.583048402 +0000 UTC m=+46.440212259" observedRunningTime="2025-09-04 17:51:24.422881003 +0000 UTC m=+47.280044860" watchObservedRunningTime="2025-09-04 17:51:24.785004756 +0000 UTC m=+47.642168613" Sep 4 17:51:25.569707 containerd[1564]: time="2025-09-04T17:51:25.569639460Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 17:51:25.570432 containerd[1564]: time="2025-09-04T17:51:25.570388157Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.3: active requests=0, bytes read=8760527" Sep 4 17:51:25.571576 containerd[1564]: time="2025-09-04T17:51:25.571543677Z" level=info msg="ImageCreate event name:\"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 17:51:25.573479 containerd[1564]: time="2025-09-04T17:51:25.573447574Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 17:51:25.573989 containerd[1564]: time="2025-09-04T17:51:25.573949587Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.3\" with image id \"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\", size \"10253230\" in 1.990647939s" Sep 4 17:51:25.574022 containerd[1564]: time="2025-09-04T17:51:25.573992327Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\" returns image reference \"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\"" Sep 4 17:51:25.574946 containerd[1564]: time="2025-09-04T17:51:25.574921663Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\"" Sep 4 17:51:25.576379 containerd[1564]: time="2025-09-04T17:51:25.576330079Z" level=info msg="CreateContainer within sandbox \"92d24b539a79a20782822bb2cea16c9c76bd45d1accd2bfe6b50b1c563d967f9\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Sep 4 17:51:25.593731 containerd[1564]: time="2025-09-04T17:51:25.593671016Z" level=info msg="Container 03f38c2e1a862c6202ff654227aac6d150f32be724c757a3a3ff971b5b49b55d: CDI devices from CRI Config.CDIDevices: []" Sep 4 17:51:25.611988 containerd[1564]: time="2025-09-04T17:51:25.611934134Z" level=info msg="CreateContainer within sandbox \"92d24b539a79a20782822bb2cea16c9c76bd45d1accd2bfe6b50b1c563d967f9\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"03f38c2e1a862c6202ff654227aac6d150f32be724c757a3a3ff971b5b49b55d\"" Sep 4 17:51:25.612635 containerd[1564]: time="2025-09-04T17:51:25.612533350Z" level=info msg="StartContainer for \"03f38c2e1a862c6202ff654227aac6d150f32be724c757a3a3ff971b5b49b55d\"" Sep 4 17:51:25.614008 containerd[1564]: time="2025-09-04T17:51:25.613979045Z" level=info msg="connecting to shim 
03f38c2e1a862c6202ff654227aac6d150f32be724c757a3a3ff971b5b49b55d" address="unix:///run/containerd/s/0f8a0e83d16eb286758ead8e9d75040a6e80cc25c36cfdefd795a1ba1a9b02b1" protocol=ttrpc version=3 Sep 4 17:51:25.638368 systemd[1]: Started cri-containerd-03f38c2e1a862c6202ff654227aac6d150f32be724c757a3a3ff971b5b49b55d.scope - libcontainer container 03f38c2e1a862c6202ff654227aac6d150f32be724c757a3a3ff971b5b49b55d. Sep 4 17:51:25.682168 containerd[1564]: time="2025-09-04T17:51:25.682123299Z" level=info msg="StartContainer for \"03f38c2e1a862c6202ff654227aac6d150f32be724c757a3a3ff971b5b49b55d\" returns successfully" Sep 4 17:51:28.318277 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1350084835.mount: Deactivated successfully. Sep 4 17:51:29.090374 containerd[1564]: time="2025-09-04T17:51:29.090295167Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 17:51:29.132912 containerd[1564]: time="2025-09-04T17:51:29.132829153Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.3: active requests=0, bytes read=66357526" Sep 4 17:51:29.140743 containerd[1564]: time="2025-09-04T17:51:29.139731536Z" level=info msg="ImageCreate event name:\"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 17:51:29.142826 containerd[1564]: time="2025-09-04T17:51:29.142792513Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 17:51:29.143439 containerd[1564]: time="2025-09-04T17:51:29.143406537Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" with image id \"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\", size \"66357372\" in 3.568461539s" Sep 4 17:51:29.143489 containerd[1564]: time="2025-09-04T17:51:29.143454437Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" returns image reference \"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\"" Sep 4 17:51:29.144789 containerd[1564]: time="2025-09-04T17:51:29.144744459Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\"" Sep 4 17:51:29.146893 containerd[1564]: time="2025-09-04T17:51:29.146856735Z" level=info msg="CreateContainer within sandbox \"8d8861bc1c2c3cbf14b8652e154a9e62c2942393d9238286d58b904092a89e3c\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Sep 4 17:51:29.156158 containerd[1564]: time="2025-09-04T17:51:29.155358451Z" level=info msg="Container 3860f64f4727a8203f018f667825a59270cd70b51eaf3bb87679e77646fefb86: CDI devices from CRI Config.CDIDevices: []" Sep 4 17:51:29.164299 containerd[1564]: time="2025-09-04T17:51:29.164257213Z" level=info msg="CreateContainer within sandbox \"8d8861bc1c2c3cbf14b8652e154a9e62c2942393d9238286d58b904092a89e3c\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"3860f64f4727a8203f018f667825a59270cd70b51eaf3bb87679e77646fefb86\"" Sep 4 17:51:29.164775 containerd[1564]: time="2025-09-04T17:51:29.164710394Z" level=info msg="StartContainer for \"3860f64f4727a8203f018f667825a59270cd70b51eaf3bb87679e77646fefb86\"" Sep 4 
17:51:29.165952 containerd[1564]: time="2025-09-04T17:51:29.165914324Z" level=info msg="connecting to shim 3860f64f4727a8203f018f667825a59270cd70b51eaf3bb87679e77646fefb86" address="unix:///run/containerd/s/f144fbd95990670237aa1682cfc0e118408a066b64c0500c31d08b37ce2f585e" protocol=ttrpc version=3 Sep 4 17:51:29.224284 systemd[1]: Started cri-containerd-3860f64f4727a8203f018f667825a59270cd70b51eaf3bb87679e77646fefb86.scope - libcontainer container 3860f64f4727a8203f018f667825a59270cd70b51eaf3bb87679e77646fefb86. Sep 4 17:51:29.275788 containerd[1564]: time="2025-09-04T17:51:29.275736762Z" level=info msg="StartContainer for \"3860f64f4727a8203f018f667825a59270cd70b51eaf3bb87679e77646fefb86\" returns successfully" Sep 4 17:51:29.435443 kubelet[2704]: I0904 17:51:29.435370 2704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-7988f88666-7f6s9" podStartSLOduration=28.022347862 podStartE2EDuration="36.43532683s" podCreationTimestamp="2025-09-04 17:50:53 +0000 UTC" firstStartedPulling="2025-09-04 17:51:20.731556428 +0000 UTC m=+43.588720285" lastFinishedPulling="2025-09-04 17:51:29.144535406 +0000 UTC m=+52.001699253" observedRunningTime="2025-09-04 17:51:29.434762981 +0000 UTC m=+52.291926838" watchObservedRunningTime="2025-09-04 17:51:29.43532683 +0000 UTC m=+52.292490687" Sep 4 17:51:29.526755 containerd[1564]: time="2025-09-04T17:51:29.526690163Z" level=info msg="TaskExit event in podsandbox handler container_id:\"3860f64f4727a8203f018f667825a59270cd70b51eaf3bb87679e77646fefb86\" id:\"72e98fb14fd095cdb2215823181c663ab9b8e760cef6c6b7c83456974338caa3\" pid:5030 exit_status:1 exited_at:{seconds:1757008289 nanos:526050201}" Sep 4 17:51:29.728023 systemd[1]: Started sshd@9-10.0.0.60:22-10.0.0.1:43734.service - OpenSSH per-connection server daemon (10.0.0.1:43734). Sep 4 17:51:29.800227 sshd[5044]: Accepted publickey for core from 10.0.0.1 port 43734 ssh2: RSA SHA256:OKBSeTvmyBtsjC7tfD9ZMfOXcYBSWTNVjZSy1tvLNgs Sep 4 17:51:29.802290 sshd-session[5044]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 17:51:29.806841 systemd-logind[1545]: New session 10 of user core. Sep 4 17:51:29.821240 systemd[1]: Started session-10.scope - Session 10 of User core. Sep 4 17:51:30.277960 sshd[5048]: Connection closed by 10.0.0.1 port 43734 Sep 4 17:51:30.279353 sshd-session[5044]: pam_unix(sshd:session): session closed for user core Sep 4 17:51:30.289855 systemd[1]: sshd@9-10.0.0.60:22-10.0.0.1:43734.service: Deactivated successfully. Sep 4 17:51:30.292143 systemd[1]: session-10.scope: Deactivated successfully. Sep 4 17:51:30.292897 systemd-logind[1545]: Session 10 logged out. Waiting for processes to exit. Sep 4 17:51:30.295825 systemd[1]: Started sshd@10-10.0.0.60:22-10.0.0.1:34098.service - OpenSSH per-connection server daemon (10.0.0.1:34098). Sep 4 17:51:30.297346 systemd-logind[1545]: Removed session 10. Sep 4 17:51:30.350576 sshd[5080]: Accepted publickey for core from 10.0.0.1 port 34098 ssh2: RSA SHA256:OKBSeTvmyBtsjC7tfD9ZMfOXcYBSWTNVjZSy1tvLNgs Sep 4 17:51:30.352406 sshd-session[5080]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 17:51:30.356844 systemd-logind[1545]: New session 11 of user core. Sep 4 17:51:30.366250 systemd[1]: Started session-11.scope - Session 11 of User core. 
Sep 4 17:51:30.521601 containerd[1564]: time="2025-09-04T17:51:30.521443119Z" level=info msg="TaskExit event in podsandbox handler container_id:\"3860f64f4727a8203f018f667825a59270cd70b51eaf3bb87679e77646fefb86\" id:\"cced3db2be28e022f552d8069e785bd4a6e52f90bb3c6f80436d95f8228f1dc8\" pid:5103 exited_at:{seconds:1757008290 nanos:521057785}" Sep 4 17:51:30.527140 sshd[5083]: Connection closed by 10.0.0.1 port 34098 Sep 4 17:51:30.529045 sshd-session[5080]: pam_unix(sshd:session): session closed for user core Sep 4 17:51:30.538936 systemd[1]: sshd@10-10.0.0.60:22-10.0.0.1:34098.service: Deactivated successfully. Sep 4 17:51:30.543765 systemd[1]: session-11.scope: Deactivated successfully. Sep 4 17:51:30.546163 systemd-logind[1545]: Session 11 logged out. Waiting for processes to exit. Sep 4 17:51:30.551778 systemd-logind[1545]: Removed session 11. Sep 4 17:51:30.555176 systemd[1]: Started sshd@11-10.0.0.60:22-10.0.0.1:34110.service - OpenSSH per-connection server daemon (10.0.0.1:34110). Sep 4 17:51:30.608187 sshd[5119]: Accepted publickey for core from 10.0.0.1 port 34110 ssh2: RSA SHA256:OKBSeTvmyBtsjC7tfD9ZMfOXcYBSWTNVjZSy1tvLNgs Sep 4 17:51:30.609843 sshd-session[5119]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 17:51:30.615185 systemd-logind[1545]: New session 12 of user core. Sep 4 17:51:30.626265 systemd[1]: Started session-12.scope - Session 12 of User core. Sep 4 17:51:30.739940 sshd[5123]: Connection closed by 10.0.0.1 port 34110 Sep 4 17:51:30.740434 sshd-session[5119]: pam_unix(sshd:session): session closed for user core Sep 4 17:51:30.745423 systemd[1]: sshd@11-10.0.0.60:22-10.0.0.1:34110.service: Deactivated successfully. Sep 4 17:51:30.748020 systemd[1]: session-12.scope: Deactivated successfully. Sep 4 17:51:30.749132 systemd-logind[1545]: Session 12 logged out. Waiting for processes to exit. Sep 4 17:51:30.750861 systemd-logind[1545]: Removed session 12. Sep 4 17:51:33.187319 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount462369755.mount: Deactivated successfully. 
Sep 4 17:51:33.863987 containerd[1564]: time="2025-09-04T17:51:33.863912119Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 17:51:33.864834 containerd[1564]: time="2025-09-04T17:51:33.864782383Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.3: active requests=0, bytes read=33085545" Sep 4 17:51:33.866273 containerd[1564]: time="2025-09-04T17:51:33.866214481Z" level=info msg="ImageCreate event name:\"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 17:51:33.868575 containerd[1564]: time="2025-09-04T17:51:33.868537232Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 17:51:33.875571 containerd[1564]: time="2025-09-04T17:51:33.875531894Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" with image id \"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\", size \"33085375\" in 4.730719417s" Sep 4 17:51:33.875571 containerd[1564]: time="2025-09-04T17:51:33.875566529Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" returns image reference \"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\"" Sep 4 17:51:33.876634 containerd[1564]: time="2025-09-04T17:51:33.876552721Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\"" Sep 4 17:51:33.885643 containerd[1564]: time="2025-09-04T17:51:33.885571593Z" level=info msg="CreateContainer within sandbox \"5eb3b6f274cebac775de07fb077367816449cfaf3d9f3e9eacac37fd720d36b6\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Sep 4 17:51:33.896920 containerd[1564]: time="2025-09-04T17:51:33.896864344Z" level=info msg="Container 8e3e2c8f9be4f1e355c403f7f0c7a08a6000410f29d176735c0bb9d208a1f36d: CDI devices from CRI Config.CDIDevices: []" Sep 4 17:51:33.905550 containerd[1564]: time="2025-09-04T17:51:33.905498714Z" level=info msg="CreateContainer within sandbox \"5eb3b6f274cebac775de07fb077367816449cfaf3d9f3e9eacac37fd720d36b6\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"8e3e2c8f9be4f1e355c403f7f0c7a08a6000410f29d176735c0bb9d208a1f36d\"" Sep 4 17:51:33.906134 containerd[1564]: time="2025-09-04T17:51:33.906075838Z" level=info msg="StartContainer for \"8e3e2c8f9be4f1e355c403f7f0c7a08a6000410f29d176735c0bb9d208a1f36d\"" Sep 4 17:51:33.907378 containerd[1564]: time="2025-09-04T17:51:33.907353376Z" level=info msg="connecting to shim 8e3e2c8f9be4f1e355c403f7f0c7a08a6000410f29d176735c0bb9d208a1f36d" address="unix:///run/containerd/s/81a80aea8cc437347d2ae9a0c20e2731bfafcb35c29e9ecadab326960d7f8bc5" protocol=ttrpc version=3 Sep 4 17:51:33.936263 systemd[1]: Started cri-containerd-8e3e2c8f9be4f1e355c403f7f0c7a08a6000410f29d176735c0bb9d208a1f36d.scope - libcontainer container 8e3e2c8f9be4f1e355c403f7f0c7a08a6000410f29d176735c0bb9d208a1f36d. 
Sep 4 17:51:33.993126 containerd[1564]: time="2025-09-04T17:51:33.993040056Z" level=info msg="StartContainer for \"8e3e2c8f9be4f1e355c403f7f0c7a08a6000410f29d176735c0bb9d208a1f36d\" returns successfully" Sep 4 17:51:34.450305 kubelet[2704]: I0904 17:51:34.450088 2704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-54f456cb4-6p4xx" podStartSLOduration=2.511109163 podStartE2EDuration="17.450069525s" podCreationTimestamp="2025-09-04 17:51:17 +0000 UTC" firstStartedPulling="2025-09-04 17:51:18.937490256 +0000 UTC m=+41.794654113" lastFinishedPulling="2025-09-04 17:51:33.876450618 +0000 UTC m=+56.733614475" observedRunningTime="2025-09-04 17:51:34.44873538 +0000 UTC m=+57.305899237" watchObservedRunningTime="2025-09-04 17:51:34.450069525 +0000 UTC m=+57.307233382" Sep 4 17:51:35.757128 systemd[1]: Started sshd@12-10.0.0.60:22-10.0.0.1:34124.service - OpenSSH per-connection server daemon (10.0.0.1:34124). Sep 4 17:51:35.829917 sshd[5183]: Accepted publickey for core from 10.0.0.1 port 34124 ssh2: RSA SHA256:OKBSeTvmyBtsjC7tfD9ZMfOXcYBSWTNVjZSy1tvLNgs Sep 4 17:51:35.831919 sshd-session[5183]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 17:51:35.836757 systemd-logind[1545]: New session 13 of user core. Sep 4 17:51:35.843293 systemd[1]: Started session-13.scope - Session 13 of User core. Sep 4 17:51:35.999754 sshd[5186]: Connection closed by 10.0.0.1 port 34124 Sep 4 17:51:36.001687 sshd-session[5183]: pam_unix(sshd:session): session closed for user core Sep 4 17:51:36.005974 systemd[1]: sshd@12-10.0.0.60:22-10.0.0.1:34124.service: Deactivated successfully. Sep 4 17:51:36.008608 systemd[1]: session-13.scope: Deactivated successfully. Sep 4 17:51:36.009718 systemd-logind[1545]: Session 13 logged out. Waiting for processes to exit. Sep 4 17:51:36.011073 systemd-logind[1545]: Removed session 13. 
Sep 4 17:51:37.784061 containerd[1564]: time="2025-09-04T17:51:37.783996074Z" level=info msg="TaskExit event in podsandbox handler container_id:\"3860f64f4727a8203f018f667825a59270cd70b51eaf3bb87679e77646fefb86\" id:\"efbc18f7fa2d1484801014b69ed9e4df47316b0cf80a2c23e4f82b4532d6f7df\" pid:5220 exited_at:{seconds:1757008297 nanos:783607925}" Sep 4 17:51:37.999261 containerd[1564]: time="2025-09-04T17:51:37.999193276Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 17:51:37.999992 containerd[1564]: time="2025-09-04T17:51:37.999913347Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.3: active requests=0, bytes read=51277746" Sep 4 17:51:38.001212 containerd[1564]: time="2025-09-04T17:51:38.001171379Z" level=info msg="ImageCreate event name:\"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 17:51:38.003171 containerd[1564]: time="2025-09-04T17:51:38.003140154Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 17:51:38.003700 containerd[1564]: time="2025-09-04T17:51:38.003642026Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" with image id \"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\", size \"52770417\" in 4.127042478s" Sep 4 17:51:38.003700 containerd[1564]: time="2025-09-04T17:51:38.003685047Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" returns image reference \"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\"" Sep 4 17:51:38.004743 containerd[1564]: time="2025-09-04T17:51:38.004720310Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 4 17:51:38.016223 containerd[1564]: time="2025-09-04T17:51:38.015877121Z" level=info msg="CreateContainer within sandbox \"cc2db3119a36a79b6f838ff137aef9cc52de942210c2d1fafb0b89d42e28a6ce\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Sep 4 17:51:38.028416 containerd[1564]: time="2025-09-04T17:51:38.028361443Z" level=info msg="Container edfbd24146737f952467407463d734d8d5212225cdd073910085c4d2983e9a1e: CDI devices from CRI Config.CDIDevices: []" Sep 4 17:51:38.037687 containerd[1564]: time="2025-09-04T17:51:38.037586538Z" level=info msg="CreateContainer within sandbox \"cc2db3119a36a79b6f838ff137aef9cc52de942210c2d1fafb0b89d42e28a6ce\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"edfbd24146737f952467407463d734d8d5212225cdd073910085c4d2983e9a1e\"" Sep 4 17:51:38.038311 containerd[1564]: time="2025-09-04T17:51:38.038272095Z" level=info msg="StartContainer for \"edfbd24146737f952467407463d734d8d5212225cdd073910085c4d2983e9a1e\"" Sep 4 17:51:38.039280 containerd[1564]: time="2025-09-04T17:51:38.039254619Z" level=info msg="connecting to shim edfbd24146737f952467407463d734d8d5212225cdd073910085c4d2983e9a1e" address="unix:///run/containerd/s/73eda9a19a9e56dfca64f9a77e4f920d0bbea593b8bb60ded6add4186547109d" protocol=ttrpc version=3 Sep 4 17:51:38.065255 systemd[1]: 
Started cri-containerd-edfbd24146737f952467407463d734d8d5212225cdd073910085c4d2983e9a1e.scope - libcontainer container edfbd24146737f952467407463d734d8d5212225cdd073910085c4d2983e9a1e. Sep 4 17:51:38.111245 containerd[1564]: time="2025-09-04T17:51:38.111204025Z" level=info msg="StartContainer for \"edfbd24146737f952467407463d734d8d5212225cdd073910085c4d2983e9a1e\" returns successfully" Sep 4 17:51:38.463008 kubelet[2704]: I0904 17:51:38.462653 2704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-5cfb9fdb56-hqdqm" podStartSLOduration=29.158768116 podStartE2EDuration="44.462635203s" podCreationTimestamp="2025-09-04 17:50:54 +0000 UTC" firstStartedPulling="2025-09-04 17:51:22.70071837 +0000 UTC m=+45.557882227" lastFinishedPulling="2025-09-04 17:51:38.004585467 +0000 UTC m=+60.861749314" observedRunningTime="2025-09-04 17:51:38.461563922 +0000 UTC m=+61.318727779" watchObservedRunningTime="2025-09-04 17:51:38.462635203 +0000 UTC m=+61.319799060" Sep 4 17:51:38.497770 containerd[1564]: time="2025-09-04T17:51:38.497725415Z" level=info msg="TaskExit event in podsandbox handler container_id:\"edfbd24146737f952467407463d734d8d5212225cdd073910085c4d2983e9a1e\" id:\"674402a7eaa9610a02009994777a3469cf338efb8ba4f82b10456e30b4edca0f\" pid:5288 exited_at:{seconds:1757008298 nanos:497515722}" Sep 4 17:51:38.678612 containerd[1564]: time="2025-09-04T17:51:38.678548371Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 17:51:38.684209 containerd[1564]: time="2025-09-04T17:51:38.684147891Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=77" Sep 4 17:51:38.685712 containerd[1564]: time="2025-09-04T17:51:38.685685026Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"48826583\" in 680.938827ms" Sep 4 17:51:38.685786 containerd[1564]: time="2025-09-04T17:51:38.685714221Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\"" Sep 4 17:51:38.687738 containerd[1564]: time="2025-09-04T17:51:38.687498039Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\"" Sep 4 17:51:38.688370 containerd[1564]: time="2025-09-04T17:51:38.688335581Z" level=info msg="CreateContainer within sandbox \"57ddf233580b1046cdba0348fdabaf591ccad941b47632d70b4b571fda435938\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 4 17:51:38.699471 containerd[1564]: time="2025-09-04T17:51:38.699431367Z" level=info msg="Container 2cbbd702076181ff90a444a35ab24b29f97b76ef3a6de5bc29a818746eca89d6: CDI devices from CRI Config.CDIDevices: []" Sep 4 17:51:38.707639 containerd[1564]: time="2025-09-04T17:51:38.707586785Z" level=info msg="CreateContainer within sandbox \"57ddf233580b1046cdba0348fdabaf591ccad941b47632d70b4b571fda435938\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"2cbbd702076181ff90a444a35ab24b29f97b76ef3a6de5bc29a818746eca89d6\"" Sep 4 17:51:38.708125 containerd[1564]: time="2025-09-04T17:51:38.708059792Z" level=info 
msg="StartContainer for \"2cbbd702076181ff90a444a35ab24b29f97b76ef3a6de5bc29a818746eca89d6\"" Sep 4 17:51:38.712799 containerd[1564]: time="2025-09-04T17:51:38.712745357Z" level=info msg="connecting to shim 2cbbd702076181ff90a444a35ab24b29f97b76ef3a6de5bc29a818746eca89d6" address="unix:///run/containerd/s/57fb507c7484a575fe4579e3a262ceb6bec49da138aac0aa8c0fcbaa22813fc6" protocol=ttrpc version=3 Sep 4 17:51:38.740349 systemd[1]: Started cri-containerd-2cbbd702076181ff90a444a35ab24b29f97b76ef3a6de5bc29a818746eca89d6.scope - libcontainer container 2cbbd702076181ff90a444a35ab24b29f97b76ef3a6de5bc29a818746eca89d6. Sep 4 17:51:38.788410 containerd[1564]: time="2025-09-04T17:51:38.788359250Z" level=info msg="StartContainer for \"2cbbd702076181ff90a444a35ab24b29f97b76ef3a6de5bc29a818746eca89d6\" returns successfully" Sep 4 17:51:40.257905 kubelet[2704]: I0904 17:51:40.257825 2704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-5bf46546c9-6pt6w" podStartSLOduration=33.575838908 podStartE2EDuration="49.257803958s" podCreationTimestamp="2025-09-04 17:50:51 +0000 UTC" firstStartedPulling="2025-09-04 17:51:23.004568067 +0000 UTC m=+45.861731925" lastFinishedPulling="2025-09-04 17:51:38.686533118 +0000 UTC m=+61.543696975" observedRunningTime="2025-09-04 17:51:39.466659709 +0000 UTC m=+62.323823556" watchObservedRunningTime="2025-09-04 17:51:40.257803958 +0000 UTC m=+63.114967815" Sep 4 17:51:40.938804 containerd[1564]: time="2025-09-04T17:51:40.938740389Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 17:51:40.939518 containerd[1564]: time="2025-09-04T17:51:40.939472673Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3: active requests=0, bytes read=14698542" Sep 4 17:51:40.940666 containerd[1564]: time="2025-09-04T17:51:40.940635575Z" level=info msg="ImageCreate event name:\"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 17:51:40.942546 containerd[1564]: time="2025-09-04T17:51:40.942511456Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 4 17:51:40.943166 containerd[1564]: time="2025-09-04T17:51:40.943139846Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" with image id \"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\", size \"16191197\" in 2.255554733s" Sep 4 17:51:40.943207 containerd[1564]: time="2025-09-04T17:51:40.943171034Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" returns image reference \"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\"" Sep 4 17:51:40.945377 containerd[1564]: time="2025-09-04T17:51:40.945347909Z" level=info msg="CreateContainer within sandbox \"92d24b539a79a20782822bb2cea16c9c76bd45d1accd2bfe6b50b1c563d967f9\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Sep 4 17:51:40.952874 containerd[1564]: time="2025-09-04T17:51:40.952833338Z" level=info 
msg="Container 5f8ec517deb9b83c2d67e32b2c8eea1d6de6ca2fd8d63a86670b30ec835767b1: CDI devices from CRI Config.CDIDevices: []" Sep 4 17:51:40.963643 containerd[1564]: time="2025-09-04T17:51:40.963591219Z" level=info msg="CreateContainer within sandbox \"92d24b539a79a20782822bb2cea16c9c76bd45d1accd2bfe6b50b1c563d967f9\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"5f8ec517deb9b83c2d67e32b2c8eea1d6de6ca2fd8d63a86670b30ec835767b1\"" Sep 4 17:51:40.964118 containerd[1564]: time="2025-09-04T17:51:40.964073633Z" level=info msg="StartContainer for \"5f8ec517deb9b83c2d67e32b2c8eea1d6de6ca2fd8d63a86670b30ec835767b1\"" Sep 4 17:51:40.966730 containerd[1564]: time="2025-09-04T17:51:40.966399168Z" level=info msg="connecting to shim 5f8ec517deb9b83c2d67e32b2c8eea1d6de6ca2fd8d63a86670b30ec835767b1" address="unix:///run/containerd/s/0f8a0e83d16eb286758ead8e9d75040a6e80cc25c36cfdefd795a1ba1a9b02b1" protocol=ttrpc version=3 Sep 4 17:51:40.993250 systemd[1]: Started cri-containerd-5f8ec517deb9b83c2d67e32b2c8eea1d6de6ca2fd8d63a86670b30ec835767b1.scope - libcontainer container 5f8ec517deb9b83c2d67e32b2c8eea1d6de6ca2fd8d63a86670b30ec835767b1. Sep 4 17:51:41.014794 systemd[1]: Started sshd@13-10.0.0.60:22-10.0.0.1:40340.service - OpenSSH per-connection server daemon (10.0.0.1:40340). Sep 4 17:51:41.041442 containerd[1564]: time="2025-09-04T17:51:41.040831570Z" level=info msg="StartContainer for \"5f8ec517deb9b83c2d67e32b2c8eea1d6de6ca2fd8d63a86670b30ec835767b1\" returns successfully" Sep 4 17:51:41.076998 sshd[5368]: Accepted publickey for core from 10.0.0.1 port 40340 ssh2: RSA SHA256:OKBSeTvmyBtsjC7tfD9ZMfOXcYBSWTNVjZSy1tvLNgs Sep 4 17:51:41.079214 sshd-session[5368]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 17:51:41.083666 systemd-logind[1545]: New session 14 of user core. Sep 4 17:51:41.091351 systemd[1]: Started session-14.scope - Session 14 of User core. Sep 4 17:51:41.250743 sshd[5384]: Connection closed by 10.0.0.1 port 40340 Sep 4 17:51:41.250980 sshd-session[5368]: pam_unix(sshd:session): session closed for user core Sep 4 17:51:41.256543 systemd[1]: sshd@13-10.0.0.60:22-10.0.0.1:40340.service: Deactivated successfully. Sep 4 17:51:41.258676 systemd[1]: session-14.scope: Deactivated successfully. Sep 4 17:51:41.259531 systemd-logind[1545]: Session 14 logged out. Waiting for processes to exit. Sep 4 17:51:41.260662 systemd-logind[1545]: Removed session 14. 
Sep 4 17:51:41.313193 kubelet[2704]: I0904 17:51:41.313152 2704 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Sep 4 17:51:41.313667 kubelet[2704]: I0904 17:51:41.313206 2704 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Sep 4 17:51:41.473832 kubelet[2704]: I0904 17:51:41.473773 2704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-8mncz" podStartSLOduration=27.013127117 podStartE2EDuration="47.473754842s" podCreationTimestamp="2025-09-04 17:50:54 +0000 UTC" firstStartedPulling="2025-09-04 17:51:20.483265024 +0000 UTC m=+43.340428881" lastFinishedPulling="2025-09-04 17:51:40.943892749 +0000 UTC m=+63.801056606" observedRunningTime="2025-09-04 17:51:41.472451265 +0000 UTC m=+64.329615122" watchObservedRunningTime="2025-09-04 17:51:41.473754842 +0000 UTC m=+64.330918699" Sep 4 17:51:43.708396 containerd[1564]: time="2025-09-04T17:51:43.708337016Z" level=info msg="TaskExit event in podsandbox handler container_id:\"185689874e428c9987e558bee86d514a724b8f5f5f4674f5cc23d74e0fa1b3bf\" id:\"5cd4673e003b87073242c1c1e854f8b39b35dcc3047dff10dad5a2307de0322d\" pid:5411 exited_at:{seconds:1757008303 nanos:707793863}" Sep 4 17:51:46.269157 systemd[1]: Started sshd@14-10.0.0.60:22-10.0.0.1:40354.service - OpenSSH per-connection server daemon (10.0.0.1:40354). Sep 4 17:51:46.438626 sshd[5427]: Accepted publickey for core from 10.0.0.1 port 40354 ssh2: RSA SHA256:OKBSeTvmyBtsjC7tfD9ZMfOXcYBSWTNVjZSy1tvLNgs Sep 4 17:51:46.440221 sshd-session[5427]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 17:51:46.445066 systemd-logind[1545]: New session 15 of user core. Sep 4 17:51:46.457243 systemd[1]: Started session-15.scope - Session 15 of User core. Sep 4 17:51:46.603597 sshd[5430]: Connection closed by 10.0.0.1 port 40354 Sep 4 17:51:46.603990 sshd-session[5427]: pam_unix(sshd:session): session closed for user core Sep 4 17:51:46.608025 systemd[1]: sshd@14-10.0.0.60:22-10.0.0.1:40354.service: Deactivated successfully. Sep 4 17:51:46.610270 systemd[1]: session-15.scope: Deactivated successfully. Sep 4 17:51:46.611019 systemd-logind[1545]: Session 15 logged out. Waiting for processes to exit. Sep 4 17:51:46.612360 systemd-logind[1545]: Removed session 15. Sep 4 17:51:51.620695 systemd[1]: Started sshd@15-10.0.0.60:22-10.0.0.1:38858.service - OpenSSH per-connection server daemon (10.0.0.1:38858). Sep 4 17:51:51.679268 sshd[5445]: Accepted publickey for core from 10.0.0.1 port 38858 ssh2: RSA SHA256:OKBSeTvmyBtsjC7tfD9ZMfOXcYBSWTNVjZSy1tvLNgs Sep 4 17:51:51.680758 sshd-session[5445]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 17:51:51.686201 systemd-logind[1545]: New session 16 of user core. Sep 4 17:51:51.695302 systemd[1]: Started session-16.scope - Session 16 of User core. Sep 4 17:51:51.806517 sshd[5448]: Connection closed by 10.0.0.1 port 38858 Sep 4 17:51:51.806865 sshd-session[5445]: pam_unix(sshd:session): session closed for user core Sep 4 17:51:51.815588 systemd[1]: sshd@15-10.0.0.60:22-10.0.0.1:38858.service: Deactivated successfully. Sep 4 17:51:51.817486 systemd[1]: session-16.scope: Deactivated successfully. Sep 4 17:51:51.818354 systemd-logind[1545]: Session 16 logged out. Waiting for processes to exit. 
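The pod_startup_latency_tracker entries record enough to recompute their own figures: podStartE2EDuration is the observed-running time minus podCreationTimestamp, and podStartSLOduration is that value minus the image-pull window (lastFinishedPulling - firstStartedPulling). A small sketch in the same spirit, using the timestamps printed above for calico-system/csi-node-driver-8mncz (47.473754842s end to end, 27.013127117s once the pull is excluded):

    // Recompute the kubelet pod-startup figures from the timestamps printed
    // in the pod_startup_latency_tracker entry for csi-node-driver-8mncz.
    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        parse := func(s string) time.Time {
            t, err := time.Parse("2006-01-02 15:04:05.999999999 -0700 MST", s)
            if err != nil {
                panic(err)
            }
            return t
        }

        created := parse("2025-09-04 17:50:54 +0000 UTC")
        firstPull := parse("2025-09-04 17:51:20.483265024 +0000 UTC")
        lastPull := parse("2025-09-04 17:51:40.943892749 +0000 UTC")
        observed := parse("2025-09-04 17:51:41.473754842 +0000 UTC")

        e2e := observed.Sub(created)         // podStartE2EDuration
        slo := e2e - lastPull.Sub(firstPull) // E2E duration minus image-pull time

        fmt.Println("podStartE2EDuration:", e2e) // 47.473754842s
        fmt.Println("podStartSLOduration:", slo) // 27.013127117s
    }

The same arithmetic reproduces the calico-apiserver-5bf46546c9-6pt6w figures earlier in the log (49.257803958s end to end, 33.575838908s excluding the pull).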
Sep 4 17:51:51.821369 systemd[1]: Started sshd@16-10.0.0.60:22-10.0.0.1:38868.service - OpenSSH per-connection server daemon (10.0.0.1:38868). Sep 4 17:51:51.822042 systemd-logind[1545]: Removed session 16. Sep 4 17:51:51.874343 sshd[5461]: Accepted publickey for core from 10.0.0.1 port 38868 ssh2: RSA SHA256:OKBSeTvmyBtsjC7tfD9ZMfOXcYBSWTNVjZSy1tvLNgs Sep 4 17:51:51.875993 sshd-session[5461]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 17:51:51.880508 systemd-logind[1545]: New session 17 of user core. Sep 4 17:51:51.890243 systemd[1]: Started session-17.scope - Session 17 of User core. Sep 4 17:51:52.172729 sshd[5464]: Connection closed by 10.0.0.1 port 38868 Sep 4 17:51:52.173033 sshd-session[5461]: pam_unix(sshd:session): session closed for user core Sep 4 17:51:52.182869 systemd[1]: sshd@16-10.0.0.60:22-10.0.0.1:38868.service: Deactivated successfully. Sep 4 17:51:52.184914 systemd[1]: session-17.scope: Deactivated successfully. Sep 4 17:51:52.185832 systemd-logind[1545]: Session 17 logged out. Waiting for processes to exit. Sep 4 17:51:52.189135 systemd[1]: Started sshd@17-10.0.0.60:22-10.0.0.1:38870.service - OpenSSH per-connection server daemon (10.0.0.1:38870). Sep 4 17:51:52.189804 systemd-logind[1545]: Removed session 17. Sep 4 17:51:52.248460 sshd[5475]: Accepted publickey for core from 10.0.0.1 port 38870 ssh2: RSA SHA256:OKBSeTvmyBtsjC7tfD9ZMfOXcYBSWTNVjZSy1tvLNgs Sep 4 17:51:52.250832 sshd-session[5475]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 17:51:52.256043 systemd-logind[1545]: New session 18 of user core. Sep 4 17:51:52.261399 systemd[1]: Started session-18.scope - Session 18 of User core. Sep 4 17:51:54.144171 sshd[5478]: Connection closed by 10.0.0.1 port 38870 Sep 4 17:51:54.145722 sshd-session[5475]: pam_unix(sshd:session): session closed for user core Sep 4 17:51:54.159124 systemd[1]: sshd@17-10.0.0.60:22-10.0.0.1:38870.service: Deactivated successfully. Sep 4 17:51:54.165632 systemd[1]: session-18.scope: Deactivated successfully. Sep 4 17:51:54.165906 systemd[1]: session-18.scope: Consumed 598ms CPU time, 90.2M memory peak. Sep 4 17:51:54.169172 systemd-logind[1545]: Session 18 logged out. Waiting for processes to exit. Sep 4 17:51:54.170991 systemd[1]: Started sshd@18-10.0.0.60:22-10.0.0.1:38872.service - OpenSSH per-connection server daemon (10.0.0.1:38872). Sep 4 17:51:54.174436 systemd-logind[1545]: Removed session 18. Sep 4 17:51:54.233949 sshd[5498]: Accepted publickey for core from 10.0.0.1 port 38872 ssh2: RSA SHA256:OKBSeTvmyBtsjC7tfD9ZMfOXcYBSWTNVjZSy1tvLNgs Sep 4 17:51:54.236041 sshd-session[5498]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 17:51:54.241081 systemd-logind[1545]: New session 19 of user core. Sep 4 17:51:54.254355 systemd[1]: Started session-19.scope - Session 19 of User core. Sep 4 17:51:54.634952 sshd[5501]: Connection closed by 10.0.0.1 port 38872 Sep 4 17:51:54.636306 sshd-session[5498]: pam_unix(sshd:session): session closed for user core Sep 4 17:51:54.645875 systemd[1]: sshd@18-10.0.0.60:22-10.0.0.1:38872.service: Deactivated successfully. Sep 4 17:51:54.648356 systemd[1]: session-19.scope: Deactivated successfully. Sep 4 17:51:54.649354 systemd-logind[1545]: Session 19 logged out. Waiting for processes to exit. Sep 4 17:51:54.652732 systemd[1]: Started sshd@19-10.0.0.60:22-10.0.0.1:38874.service - OpenSSH per-connection server daemon (10.0.0.1:38874). Sep 4 17:51:54.653731 systemd-logind[1545]: Removed session 19. 
Sep 4 17:51:54.712912 sshd[5512]: Accepted publickey for core from 10.0.0.1 port 38874 ssh2: RSA SHA256:OKBSeTvmyBtsjC7tfD9ZMfOXcYBSWTNVjZSy1tvLNgs Sep 4 17:51:54.714252 sshd-session[5512]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 17:51:54.718920 systemd-logind[1545]: New session 20 of user core. Sep 4 17:51:54.728248 systemd[1]: Started session-20.scope - Session 20 of User core. Sep 4 17:51:54.849862 sshd[5515]: Connection closed by 10.0.0.1 port 38874 Sep 4 17:51:54.850326 sshd-session[5512]: pam_unix(sshd:session): session closed for user core Sep 4 17:51:54.855661 systemd[1]: sshd@19-10.0.0.60:22-10.0.0.1:38874.service: Deactivated successfully. Sep 4 17:51:54.858139 systemd[1]: session-20.scope: Deactivated successfully. Sep 4 17:51:54.859372 systemd-logind[1545]: Session 20 logged out. Waiting for processes to exit. Sep 4 17:51:54.861067 systemd-logind[1545]: Removed session 20. Sep 4 17:51:59.862188 systemd[1]: Started sshd@20-10.0.0.60:22-10.0.0.1:38888.service - OpenSSH per-connection server daemon (10.0.0.1:38888). Sep 4 17:51:59.906252 sshd[5537]: Accepted publickey for core from 10.0.0.1 port 38888 ssh2: RSA SHA256:OKBSeTvmyBtsjC7tfD9ZMfOXcYBSWTNVjZSy1tvLNgs Sep 4 17:51:59.907612 sshd-session[5537]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 17:51:59.911989 systemd-logind[1545]: New session 21 of user core. Sep 4 17:51:59.918232 systemd[1]: Started session-21.scope - Session 21 of User core. Sep 4 17:52:00.034161 sshd[5540]: Connection closed by 10.0.0.1 port 38888 Sep 4 17:52:00.034554 sshd-session[5537]: pam_unix(sshd:session): session closed for user core Sep 4 17:52:00.039699 systemd[1]: sshd@20-10.0.0.60:22-10.0.0.1:38888.service: Deactivated successfully. Sep 4 17:52:00.041743 systemd[1]: session-21.scope: Deactivated successfully. Sep 4 17:52:00.042549 systemd-logind[1545]: Session 21 logged out. Waiting for processes to exit. Sep 4 17:52:00.043812 systemd-logind[1545]: Removed session 21. Sep 4 17:52:05.051903 systemd[1]: Started sshd@21-10.0.0.60:22-10.0.0.1:45564.service - OpenSSH per-connection server daemon (10.0.0.1:45564). Sep 4 17:52:05.109362 sshd[5555]: Accepted publickey for core from 10.0.0.1 port 45564 ssh2: RSA SHA256:OKBSeTvmyBtsjC7tfD9ZMfOXcYBSWTNVjZSy1tvLNgs Sep 4 17:52:05.110885 sshd-session[5555]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 17:52:05.115344 systemd-logind[1545]: New session 22 of user core. Sep 4 17:52:05.131254 systemd[1]: Started session-22.scope - Session 22 of User core. Sep 4 17:52:05.252771 sshd[5558]: Connection closed by 10.0.0.1 port 45564 Sep 4 17:52:05.253420 sshd-session[5555]: pam_unix(sshd:session): session closed for user core Sep 4 17:52:05.257546 systemd[1]: sshd@21-10.0.0.60:22-10.0.0.1:45564.service: Deactivated successfully. Sep 4 17:52:05.259536 systemd[1]: session-22.scope: Deactivated successfully. Sep 4 17:52:05.260277 systemd-logind[1545]: Session 22 logged out. Waiting for processes to exit. Sep 4 17:52:05.261535 systemd-logind[1545]: Removed session 22. 
Sep 4 17:52:07.737969 containerd[1564]: time="2025-09-04T17:52:07.737575496Z" level=info msg="TaskExit event in podsandbox handler container_id:\"edfbd24146737f952467407463d734d8d5212225cdd073910085c4d2983e9a1e\" id:\"d24da8432e2b2cee76364e7f0072db9c6648d7392fab3948664e55983ac4f6fe\" pid:5583 exited_at:{seconds:1757008327 nanos:737149653}" Sep 4 17:52:07.805740 containerd[1564]: time="2025-09-04T17:52:07.805653414Z" level=info msg="TaskExit event in podsandbox handler container_id:\"3860f64f4727a8203f018f667825a59270cd70b51eaf3bb87679e77646fefb86\" id:\"59db77b729db3b4591ded7d4eac7032fe7b180571dbb41198595189b5a923873\" pid:5606 exited_at:{seconds:1757008327 nanos:805278889}" Sep 4 17:52:08.291454 containerd[1564]: time="2025-09-04T17:52:08.291347438Z" level=info msg="TaskExit event in podsandbox handler container_id:\"3860f64f4727a8203f018f667825a59270cd70b51eaf3bb87679e77646fefb86\" id:\"dc83164d0665d224ca5ac51bc72deaf3d187890eebb1e892fcf82724492eb100\" pid:5629 exited_at:{seconds:1757008328 nanos:290882482}" Sep 4 17:52:10.266538 systemd[1]: Started sshd@22-10.0.0.60:22-10.0.0.1:45202.service - OpenSSH per-connection server daemon (10.0.0.1:45202). Sep 4 17:52:10.343158 sshd[5642]: Accepted publickey for core from 10.0.0.1 port 45202 ssh2: RSA SHA256:OKBSeTvmyBtsjC7tfD9ZMfOXcYBSWTNVjZSy1tvLNgs Sep 4 17:52:10.344921 sshd-session[5642]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 17:52:10.350654 systemd-logind[1545]: New session 23 of user core. Sep 4 17:52:10.357319 systemd[1]: Started session-23.scope - Session 23 of User core. Sep 4 17:52:10.502606 sshd[5645]: Connection closed by 10.0.0.1 port 45202 Sep 4 17:52:10.502970 sshd-session[5642]: pam_unix(sshd:session): session closed for user core Sep 4 17:52:10.508185 systemd[1]: sshd@22-10.0.0.60:22-10.0.0.1:45202.service: Deactivated successfully. Sep 4 17:52:10.510497 systemd[1]: session-23.scope: Deactivated successfully. Sep 4 17:52:10.511565 systemd-logind[1545]: Session 23 logged out. Waiting for processes to exit. Sep 4 17:52:10.513083 systemd-logind[1545]: Removed session 23. Sep 4 17:52:13.651402 containerd[1564]: time="2025-09-04T17:52:13.651358154Z" level=info msg="TaskExit event in podsandbox handler container_id:\"185689874e428c9987e558bee86d514a724b8f5f5f4674f5cc23d74e0fa1b3bf\" id:\"ef61835cad28dcc19a81f09d5411e18e4971a3b8cbbae61551c8be1481f2f4f4\" pid:5669 exited_at:{seconds:1757008333 nanos:651045500}" Sep 4 17:52:15.515286 systemd[1]: Started sshd@23-10.0.0.60:22-10.0.0.1:45206.service - OpenSSH per-connection server daemon (10.0.0.1:45206). Sep 4 17:52:15.574736 sshd[5684]: Accepted publickey for core from 10.0.0.1 port 45206 ssh2: RSA SHA256:OKBSeTvmyBtsjC7tfD9ZMfOXcYBSWTNVjZSy1tvLNgs Sep 4 17:52:15.576720 sshd-session[5684]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 4 17:52:15.581550 systemd-logind[1545]: New session 24 of user core. Sep 4 17:52:15.587335 systemd[1]: Started session-24.scope - Session 24 of User core. Sep 4 17:52:15.758834 sshd[5687]: Connection closed by 10.0.0.1 port 45206 Sep 4 17:52:15.759258 sshd-session[5684]: pam_unix(sshd:session): session closed for user core Sep 4 17:52:15.764801 systemd[1]: sshd@23-10.0.0.60:22-10.0.0.1:45206.service: Deactivated successfully. Sep 4 17:52:15.767434 systemd[1]: session-24.scope: Deactivated successfully. Sep 4 17:52:15.768397 systemd-logind[1545]: Session 24 logged out. Waiting for processes to exit. Sep 4 17:52:15.771037 systemd-logind[1545]: Removed session 24.
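The TaskExit events carry exited_at as raw Unix seconds and nanoseconds; converting them back to wall-clock time lines up with the journal timestamps (for example, seconds:1757008327 corresponds to 17:52:07 UTC on Sep 4). A short sketch of that conversion, reusing the values from the entries above:

    // Convert the exited_at {seconds, nanos} pairs from the TaskExit events
    // into UTC wall-clock times for comparison with the journal timestamps.
    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        exits := []struct{ sec, nsec int64 }{
            {1757008327, 737149653}, // container edfbd241..., exec id d24da843...
            {1757008328, 290882482}, // container 3860f64f..., exec id dc83164d...
            {1757008333, 651045500}, // container 18568987..., exec id ef61835c...
        }
        for _, e := range exits {
            fmt.Println(time.Unix(e.sec, e.nsec).UTC().Format(time.RFC3339Nano))
        }
    }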