Mar 10 01:56:52.167245 kernel: Linux version 6.12.74-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.0 p8) 14.3.0, GNU ld (Gentoo 2.44 p4) 2.44.0) #1 SMP PREEMPT_DYNAMIC Mon Mar 9 23:01:22 -00 2026
Mar 10 01:56:52.167277 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected verity.usrhash=bcd0808bf4ec60436f0ff2e8373a873eb88ae42d4ac26e6e6d81129499700895
Mar 10 01:56:52.167290 kernel: BIOS-provided physical RAM map:
Mar 10 01:56:52.167302 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009ffff] usable
Mar 10 01:56:52.167311 kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000007fffff] usable
Mar 10 01:56:52.167320 kernel: BIOS-e820: [mem 0x0000000000800000-0x0000000000807fff] ACPI NVS
Mar 10 01:56:52.167330 kernel: BIOS-e820: [mem 0x0000000000808000-0x000000000080afff] usable
Mar 10 01:56:52.167339 kernel: BIOS-e820: [mem 0x000000000080b000-0x000000000080bfff] ACPI NVS
Mar 10 01:56:52.167348 kernel: BIOS-e820: [mem 0x000000000080c000-0x0000000000810fff] usable
Mar 10 01:56:52.167357 kernel: BIOS-e820: [mem 0x0000000000811000-0x00000000008fffff] ACPI NVS
Mar 10 01:56:52.167366 kernel: BIOS-e820: [mem 0x0000000000900000-0x000000009bd3efff] usable
Mar 10 01:56:52.167375 kernel: BIOS-e820: [mem 0x000000009bd3f000-0x000000009bdfffff] reserved
Mar 10 01:56:52.167388 kernel: BIOS-e820: [mem 0x000000009be00000-0x000000009c8ecfff] usable
Mar 10 01:56:52.167397 kernel: BIOS-e820: [mem 0x000000009c8ed000-0x000000009cb6cfff] reserved
Mar 10 01:56:52.167408 kernel: BIOS-e820: [mem 0x000000009cb6d000-0x000000009cb7efff] ACPI data
Mar 10 01:56:52.167418 kernel: BIOS-e820: [mem 0x000000009cb7f000-0x000000009cbfefff] ACPI NVS
Mar 10 01:56:52.167519 kernel: BIOS-e820: [mem 0x000000009cbff000-0x000000009ce90fff] usable
Mar 10 01:56:52.167536 kernel: BIOS-e820: [mem 0x000000009ce91000-0x000000009ce94fff] reserved
Mar 10 01:56:52.167546 kernel: BIOS-e820: [mem 0x000000009ce95000-0x000000009ce96fff] ACPI NVS
Mar 10 01:56:52.167555 kernel: BIOS-e820: [mem 0x000000009ce97000-0x000000009cedbfff] usable
Mar 10 01:56:52.167565 kernel: BIOS-e820: [mem 0x000000009cedc000-0x000000009cf5ffff] reserved
Mar 10 01:56:52.171773 kernel: BIOS-e820: [mem 0x000000009cf60000-0x000000009cffffff] ACPI NVS
Mar 10 01:56:52.171797 kernel: BIOS-e820: [mem 0x00000000e0000000-0x00000000efffffff] reserved
Mar 10 01:56:52.171807 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Mar 10 01:56:52.171816 kernel: BIOS-e820: [mem 0x00000000ffc00000-0x00000000ffffffff] reserved
Mar 10 01:56:52.171825 kernel: BIOS-e820: [mem 0x000000fd00000000-0x000000ffffffffff] reserved
Mar 10 01:56:52.171834 kernel: NX (Execute Disable) protection: active
Mar 10 01:56:52.171843 kernel: APIC: Static calls initialized
Mar 10 01:56:52.171861 kernel: e820: update [mem 0x9b320018-0x9b329c57] usable ==> usable
Mar 10 01:56:52.171870 kernel: e820: update [mem 0x9b2e3018-0x9b31fe57] usable ==> usable
Mar 10 01:56:52.171879 kernel: extended physical RAM map:
Mar 10 01:56:52.171888 kernel: reserve setup_data: [mem 0x0000000000000000-0x000000000009ffff] usable
Mar 10 01:56:52.171897 kernel: reserve setup_data: [mem 0x0000000000100000-0x00000000007fffff] usable
Mar 10 01:56:52.171906 kernel: reserve setup_data: [mem 0x0000000000800000-0x0000000000807fff] ACPI NVS
Mar 10 01:56:52.171915 kernel: reserve setup_data: [mem 0x0000000000808000-0x000000000080afff] usable
Mar 10 01:56:52.171924 kernel: reserve setup_data: [mem 0x000000000080b000-0x000000000080bfff] ACPI NVS
Mar 10 01:56:52.171933 kernel: reserve setup_data: [mem 0x000000000080c000-0x0000000000810fff] usable
Mar 10 01:56:52.171942 kernel: reserve setup_data: [mem 0x0000000000811000-0x00000000008fffff] ACPI NVS
Mar 10 01:56:52.171951 kernel: reserve setup_data: [mem 0x0000000000900000-0x000000009b2e3017] usable
Mar 10 01:56:52.171964 kernel: reserve setup_data: [mem 0x000000009b2e3018-0x000000009b31fe57] usable
Mar 10 01:56:52.171977 kernel: reserve setup_data: [mem 0x000000009b31fe58-0x000000009b320017] usable
Mar 10 01:56:52.171987 kernel: reserve setup_data: [mem 0x000000009b320018-0x000000009b329c57] usable
Mar 10 01:56:52.171997 kernel: reserve setup_data: [mem 0x000000009b329c58-0x000000009bd3efff] usable
Mar 10 01:56:52.172006 kernel: reserve setup_data: [mem 0x000000009bd3f000-0x000000009bdfffff] reserved
Mar 10 01:56:52.172019 kernel: reserve setup_data: [mem 0x000000009be00000-0x000000009c8ecfff] usable
Mar 10 01:56:52.172029 kernel: reserve setup_data: [mem 0x000000009c8ed000-0x000000009cb6cfff] reserved
Mar 10 01:56:52.172038 kernel: reserve setup_data: [mem 0x000000009cb6d000-0x000000009cb7efff] ACPI data
Mar 10 01:56:52.172050 kernel: reserve setup_data: [mem 0x000000009cb7f000-0x000000009cbfefff] ACPI NVS
Mar 10 01:56:52.172059 kernel: reserve setup_data: [mem 0x000000009cbff000-0x000000009ce90fff] usable
Mar 10 01:56:52.172069 kernel: reserve setup_data: [mem 0x000000009ce91000-0x000000009ce94fff] reserved
Mar 10 01:56:52.172079 kernel: reserve setup_data: [mem 0x000000009ce95000-0x000000009ce96fff] ACPI NVS
Mar 10 01:56:52.172088 kernel: reserve setup_data: [mem 0x000000009ce97000-0x000000009cedbfff] usable
Mar 10 01:56:52.172098 kernel: reserve setup_data: [mem 0x000000009cedc000-0x000000009cf5ffff] reserved
Mar 10 01:56:52.172107 kernel: reserve setup_data: [mem 0x000000009cf60000-0x000000009cffffff] ACPI NVS
Mar 10 01:56:52.172117 kernel: reserve setup_data: [mem 0x00000000e0000000-0x00000000efffffff] reserved
Mar 10 01:56:52.172129 kernel: reserve setup_data: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Mar 10 01:56:52.172139 kernel: reserve setup_data: [mem 0x00000000ffc00000-0x00000000ffffffff] reserved
Mar 10 01:56:52.172148 kernel: reserve setup_data: [mem 0x000000fd00000000-0x000000ffffffffff] reserved
Mar 10 01:56:52.172158 kernel: efi: EFI v2.7 by EDK II
Mar 10 01:56:52.172167 kernel: efi: SMBIOS=0x9c988000 ACPI=0x9cb7e000 ACPI 2.0=0x9cb7e014 MEMATTR=0x9b9e4198 RNG=0x9cb73018
Mar 10 01:56:52.172210 kernel: random: crng init done
Mar 10 01:56:52.172221 kernel: efi: Remove mem151: MMIO range=[0xffc00000-0xffffffff] (4MB) from e820 map
Mar 10 01:56:52.172251 kernel: e820: remove [mem 0xffc00000-0xffffffff] reserved
Mar 10 01:56:52.172262 kernel: secureboot: Secure boot disabled
Mar 10 01:56:52.172271 kernel: SMBIOS 2.8 present.
Mar 10 01:56:52.172281 kernel: DMI: QEMU Standard PC (Q35 + ICH9, 2009), BIOS unknown 02/02/2022
Mar 10 01:56:52.172293 kernel: DMI: Memory slots populated: 1/1
Mar 10 01:56:52.172303 kernel: Hypervisor detected: KVM
Mar 10 01:56:52.172312 kernel: last_pfn = 0x9cedc max_arch_pfn = 0x400000000
Mar 10 01:56:52.172321 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Mar 10 01:56:52.172331 kernel: kvm-clock: using sched offset of 23577392249 cycles
Mar 10 01:56:52.172342 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Mar 10 01:56:52.172352 kernel: tsc: Detected 2445.426 MHz processor
Mar 10 01:56:52.172362 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Mar 10 01:56:52.172372 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Mar 10 01:56:52.172381 kernel: last_pfn = 0x9cedc max_arch_pfn = 0x400000000
Mar 10 01:56:52.172391 kernel: MTRR map: 4 entries (2 fixed + 2 variable; max 18), built from 8 variable MTRRs
Mar 10 01:56:52.172404 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Mar 10 01:56:52.172414 kernel: Using GB pages for direct mapping
Mar 10 01:56:52.172424 kernel: ACPI: Early table checksum verification disabled
Mar 10 01:56:52.172433 kernel: ACPI: RSDP 0x000000009CB7E014 000024 (v02 BOCHS )
Mar 10 01:56:52.172505 kernel: ACPI: XSDT 0x000000009CB7D0E8 000054 (v01 BOCHS BXPC 00000001 01000013)
Mar 10 01:56:52.172518 kernel: ACPI: FACP 0x000000009CB79000 0000F4 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Mar 10 01:56:52.172528 kernel: ACPI: DSDT 0x000000009CB7A000 0021BA (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 10 01:56:52.172538 kernel: ACPI: FACS 0x000000009CBDD000 000040
Mar 10 01:56:52.172552 kernel: ACPI: APIC 0x000000009CB78000 000090 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 10 01:56:52.172562 kernel: ACPI: HPET 0x000000009CB77000 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 10 01:56:52.172572 kernel: ACPI: MCFG 0x000000009CB76000 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 10 01:56:52.175155 kernel: ACPI: WAET 0x000000009CB75000 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 10 01:56:52.175169 kernel: ACPI: BGRT 0x000000009CB74000 000038 (v01 INTEL EDK2 00000002 01000013)
Mar 10 01:56:52.175180 kernel: ACPI: Reserving FACP table memory at [mem 0x9cb79000-0x9cb790f3]
Mar 10 01:56:52.175191 kernel: ACPI: Reserving DSDT table memory at [mem 0x9cb7a000-0x9cb7c1b9]
Mar 10 01:56:52.175201 kernel: ACPI: Reserving FACS table memory at [mem 0x9cbdd000-0x9cbdd03f]
Mar 10 01:56:52.175213 kernel: ACPI: Reserving APIC table memory at [mem 0x9cb78000-0x9cb7808f]
Mar 10 01:56:52.175229 kernel: ACPI: Reserving HPET table memory at [mem 0x9cb77000-0x9cb77037]
Mar 10 01:56:52.175239 kernel: ACPI: Reserving MCFG table memory at [mem 0x9cb76000-0x9cb7603b]
Mar 10 01:56:52.175252 kernel: ACPI: Reserving WAET table memory at [mem 0x9cb75000-0x9cb75027]
Mar 10 01:56:52.175262 kernel: ACPI: Reserving BGRT table memory at [mem 0x9cb74000-0x9cb74037]
Mar 10 01:56:52.175274 kernel: No NUMA configuration found
Mar 10 01:56:52.175284 kernel: Faking a node at [mem 0x0000000000000000-0x000000009cedbfff]
Mar 10 01:56:52.175295 kernel: NODE_DATA(0) allocated [mem 0x9ce36dc0-0x9ce3dfff]
Mar 10 01:56:52.175307 kernel: Zone ranges:
Mar 10 01:56:52.175319 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Mar 10 01:56:52.175334 kernel: DMA32 [mem 0x0000000001000000-0x000000009cedbfff]
Mar 10 01:56:52.175346 kernel: Normal empty
Mar 10 01:56:52.175356 kernel: Device empty
Mar 10 01:56:52.175367 kernel: Movable zone start for each node
Mar 10 01:56:52.175380 kernel: Early memory node ranges
Mar 10 01:56:52.175390 kernel: node 0: [mem 0x0000000000001000-0x000000000009ffff]
Mar 10 01:56:52.175437 kernel: node 0: [mem 0x0000000000100000-0x00000000007fffff]
Mar 10 01:56:52.175517 kernel: node 0: [mem 0x0000000000808000-0x000000000080afff]
Mar 10 01:56:52.175530 kernel: node 0: [mem 0x000000000080c000-0x0000000000810fff]
Mar 10 01:56:52.175544 kernel: node 0: [mem 0x0000000000900000-0x000000009bd3efff]
Mar 10 01:56:52.175557 kernel: node 0: [mem 0x000000009be00000-0x000000009c8ecfff]
Mar 10 01:56:52.175567 kernel: node 0: [mem 0x000000009cbff000-0x000000009ce90fff]
Mar 10 01:56:52.175622 kernel: node 0: [mem 0x000000009ce97000-0x000000009cedbfff]
Mar 10 01:56:52.175633 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000009cedbfff]
Mar 10 01:56:52.175645 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Mar 10 01:56:52.175667 kernel: On node 0, zone DMA: 96 pages in unavailable ranges
Mar 10 01:56:52.175681 kernel: On node 0, zone DMA: 8 pages in unavailable ranges
Mar 10 01:56:52.175692 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Mar 10 01:56:52.175702 kernel: On node 0, zone DMA: 239 pages in unavailable ranges
Mar 10 01:56:52.175713 kernel: On node 0, zone DMA32: 193 pages in unavailable ranges
Mar 10 01:56:52.175725 kernel: On node 0, zone DMA32: 786 pages in unavailable ranges
Mar 10 01:56:52.175739 kernel: On node 0, zone DMA32: 6 pages in unavailable ranges
Mar 10 01:56:52.175752 kernel: On node 0, zone DMA32: 12580 pages in unavailable ranges
Mar 10 01:56:52.175764 kernel: ACPI: PM-Timer IO Port: 0x608
Mar 10 01:56:52.175775 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Mar 10 01:56:52.175787 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Mar 10 01:56:52.175801 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Mar 10 01:56:52.175814 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Mar 10 01:56:52.175824 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Mar 10 01:56:52.175836 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Mar 10 01:56:52.175847 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Mar 10 01:56:52.175859 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Mar 10 01:56:52.175871 kernel: ACPI: HPET id: 0x8086a201 base: 0xfed00000
Mar 10 01:56:52.175882 kernel: TSC deadline timer available
Mar 10 01:56:52.175894 kernel: CPU topo: Max. logical packages: 1
Mar 10 01:56:52.175909 kernel: CPU topo: Max. logical dies: 1
Mar 10 01:56:52.175921 kernel: CPU topo: Max. dies per package: 1
Mar 10 01:56:52.175934 kernel: CPU topo: Max. threads per core: 1
Mar 10 01:56:52.175945 kernel: CPU topo: Num. cores per package: 4
Mar 10 01:56:52.175958 kernel: CPU topo: Num. threads per package: 4
Mar 10 01:56:52.175969 kernel: CPU topo: Allowing 4 present CPUs plus 0 hotplug CPUs
Mar 10 01:56:52.175981 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Mar 10 01:56:52.175993 kernel: kvm-guest: KVM setup pv remote TLB flush
Mar 10 01:56:52.176004 kernel: kvm-guest: setup PV sched yield
Mar 10 01:56:52.176019 kernel: [mem 0x9d000000-0xdfffffff] available for PCI devices
Mar 10 01:56:52.176032 kernel: Booting paravirtualized kernel on KVM
Mar 10 01:56:52.176043 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Mar 10 01:56:52.176055 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:4 nr_cpu_ids:4 nr_node_ids:1
Mar 10 01:56:52.176067 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u524288
Mar 10 01:56:52.176079 kernel: pcpu-alloc: s207832 r8192 d29736 u524288 alloc=1*2097152
Mar 10 01:56:52.176091 kernel: pcpu-alloc: [0] 0 1 2 3
Mar 10 01:56:52.176102 kernel: kvm-guest: PV spinlocks enabled
Mar 10 01:56:52.176114 kernel: PV qspinlock hash table entries: 256 (order: 0, 4096 bytes, linear)
Mar 10 01:56:52.176165 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected verity.usrhash=bcd0808bf4ec60436f0ff2e8373a873eb88ae42d4ac26e6e6d81129499700895
Mar 10 01:56:52.176179 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Mar 10 01:56:52.176191 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Mar 10 01:56:52.176203 kernel: Fallback order for Node 0: 0
Mar 10 01:56:52.176215 kernel: Built 1 zonelists, mobility grouping on. Total pages: 641450
Mar 10 01:56:52.176226 kernel: Policy zone: DMA32
Mar 10 01:56:52.176237 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Mar 10 01:56:52.176248 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=4, Nodes=1
Mar 10 01:56:52.176264 kernel: ftrace: allocating 40099 entries in 157 pages
Mar 10 01:56:52.176276 kernel: ftrace: allocated 157 pages with 5 groups
Mar 10 01:56:52.176289 kernel: Dynamic Preempt: voluntary
Mar 10 01:56:52.176299 kernel: rcu: Preemptible hierarchical RCU implementation.
Mar 10 01:56:52.176313 kernel: rcu: RCU event tracing is enabled.
Mar 10 01:56:52.176324 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=4.
Mar 10 01:56:52.176336 kernel: Trampoline variant of Tasks RCU enabled.
Mar 10 01:56:52.176347 kernel: Rude variant of Tasks RCU enabled.
Mar 10 01:56:52.176358 kernel: Tracing variant of Tasks RCU enabled.
Mar 10 01:56:52.176369 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Mar 10 01:56:52.176385 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=4
Mar 10 01:56:52.176395 kernel: RCU Tasks: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
Mar 10 01:56:52.176407 kernel: RCU Tasks Rude: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
Mar 10 01:56:52.176418 kernel: RCU Tasks Trace: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
Mar 10 01:56:52.176428 kernel: NR_IRQS: 33024, nr_irqs: 456, preallocated irqs: 16
Mar 10 01:56:52.176439 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Mar 10 01:56:52.176523 kernel: Console: colour dummy device 80x25
Mar 10 01:56:52.176535 kernel: printk: legacy console [ttyS0] enabled
Mar 10 01:56:52.176547 kernel: ACPI: Core revision 20240827
Mar 10 01:56:52.176562 kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 19112604467 ns
Mar 10 01:56:52.176573 kernel: APIC: Switch to symmetric I/O mode setup
Mar 10 01:56:52.178633 kernel: x2apic enabled
Mar 10 01:56:52.178644 kernel: APIC: Switched APIC routing to: physical x2apic
Mar 10 01:56:52.178656 kernel: kvm-guest: APIC: send_IPI_mask() replaced with kvm_send_ipi_mask()
Mar 10 01:56:52.178669 kernel: kvm-guest: APIC: send_IPI_mask_allbutself() replaced with kvm_send_ipi_mask_allbutself()
Mar 10 01:56:52.178681 kernel: kvm-guest: setup PV IPIs
Mar 10 01:56:52.178691 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1
Mar 10 01:56:52.178701 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x233fd7ba1b0, max_idle_ns: 440795295779 ns
Mar 10 01:56:52.178717 kernel: Calibrating delay loop (skipped) preset value.. 4890.85 BogoMIPS (lpj=2445426)
Mar 10 01:56:52.178727 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Mar 10 01:56:52.178737 kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Mar 10 01:56:52.178747 kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Mar 10 01:56:52.178756 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Mar 10 01:56:52.178767 kernel: Spectre V2 : Mitigation: Retpolines
Mar 10 01:56:52.178780 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Mar 10 01:56:52.178789 kernel: Speculative Store Bypass: Vulnerable
Mar 10 01:56:52.178803 kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied!
Mar 10 01:56:52.178814 kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options.
Mar 10 01:56:52.178867 kernel: active return thunk: srso_alias_return_thunk
Mar 10 01:56:52.178880 kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode
Mar 10 01:56:52.178892 kernel: Transient Scheduler Attacks: Forcing mitigation on in a VM
Mar 10 01:56:52.178902 kernel: Transient Scheduler Attacks: Vulnerable: Clear CPU buffers attempted, no microcode
Mar 10 01:56:52.178912 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Mar 10 01:56:52.178922 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Mar 10 01:56:52.178931 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Mar 10 01:56:52.178946 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
Mar 10 01:56:52.178955 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Mar 10 01:56:52.178965 kernel: Freeing SMP alternatives memory: 32K
Mar 10 01:56:52.178975 kernel: pid_max: default: 32768 minimum: 301
Mar 10 01:56:52.178988 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
Mar 10 01:56:52.178998 kernel: landlock: Up and running.
Mar 10 01:56:52.179008 kernel: SELinux: Initializing.
Mar 10 01:56:52.179018 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Mar 10 01:56:52.179028 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Mar 10 01:56:52.179042 kernel: smpboot: CPU0: AMD EPYC 7763 64-Core Processor (family: 0x19, model: 0x1, stepping: 0x1)
Mar 10 01:56:52.179051 kernel: Performance Events: PMU not available due to virtualization, using software events only.
Mar 10 01:56:52.179061 kernel: signal: max sigframe size: 1776
Mar 10 01:56:52.179071 kernel: rcu: Hierarchical SRCU implementation.
Mar 10 01:56:52.180158 kernel: rcu: Max phase no-delay instances is 400.
Mar 10 01:56:52.180179 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level
Mar 10 01:56:52.180190 kernel: NMI watchdog: Perf NMI watchdog permanently disabled
Mar 10 01:56:52.180201 kernel: smp: Bringing up secondary CPUs ...
Mar 10 01:56:52.180212 kernel: smpboot: x86: Booting SMP configuration:
Mar 10 01:56:52.180227 kernel: .... node #0, CPUs: #1 #2 #3
Mar 10 01:56:52.180238 kernel: smp: Brought up 1 node, 4 CPUs
Mar 10 01:56:52.180250 kernel: smpboot: Total of 4 processors activated (19563.40 BogoMIPS)
Mar 10 01:56:52.180264 kernel: Memory: 2414476K/2565800K available (14336K kernel code, 2445K rwdata, 26064K rodata, 46204K init, 2556K bss, 145388K reserved, 0K cma-reserved)
Mar 10 01:56:52.180274 kernel: devtmpfs: initialized
Mar 10 01:56:52.180284 kernel: x86/mm: Memory block size: 128MB
Mar 10 01:56:52.180294 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x00800000-0x00807fff] (32768 bytes)
Mar 10 01:56:52.180303 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x0080b000-0x0080bfff] (4096 bytes)
Mar 10 01:56:52.180318 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x00811000-0x008fffff] (978944 bytes)
Mar 10 01:56:52.180327 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x9cb7f000-0x9cbfefff] (524288 bytes)
Mar 10 01:56:52.180337 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x9ce95000-0x9ce96fff] (8192 bytes)
Mar 10 01:56:52.180347 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x9cf60000-0x9cffffff] (655360 bytes)
Mar 10 01:56:52.180358 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Mar 10 01:56:52.180371 kernel: futex hash table entries: 1024 (order: 4, 65536 bytes, linear)
Mar 10 01:56:52.180381 kernel: pinctrl core: initialized pinctrl subsystem
Mar 10 01:56:52.180391 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Mar 10 01:56:52.180400 kernel: audit: initializing netlink subsys (disabled)
Mar 10 01:56:52.180414 kernel: thermal_sys: Registered thermal governor 'step_wise'
Mar 10 01:56:52.180424 kernel: thermal_sys: Registered thermal governor 'user_space'
Mar 10 01:56:52.180434 kernel: audit: type=2000 audit(1773107800.433:1): state=initialized audit_enabled=0 res=1
Mar 10 01:56:52.181770 kernel: cpuidle: using governor menu
Mar 10 01:56:52.181787 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Mar 10 01:56:52.181838 kernel: dca service started, version 1.12.1
Mar 10 01:56:52.181852 kernel: PCI: ECAM [mem 0xe0000000-0xefffffff] (base 0xe0000000) for domain 0000 [bus 00-ff]
Mar 10 01:56:52.181866 kernel: PCI: Using configuration type 1 for base access
Mar 10 01:56:52.181876 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Mar 10 01:56:52.181891 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Mar 10 01:56:52.181901 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Mar 10 01:56:52.181911 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Mar 10 01:56:52.181920 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Mar 10 01:56:52.181930 kernel: ACPI: Added _OSI(Module Device)
Mar 10 01:56:52.181939 kernel: ACPI: Added _OSI(Processor Device)
Mar 10 01:56:52.181949 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Mar 10 01:56:52.181962 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Mar 10 01:56:52.181974 kernel: ACPI: Interpreter enabled
Mar 10 01:56:52.181989 kernel: ACPI: PM: (supports S0 S3 S5)
Mar 10 01:56:52.182000 kernel: ACPI: Using IOAPIC for interrupt routing
Mar 10 01:56:52.182013 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Mar 10 01:56:52.182025 kernel: PCI: Using E820 reservations for host bridge windows
Mar 10 01:56:52.182036 kernel: ACPI: Enabled 2 GPEs in block 00 to 3F
Mar 10 01:56:52.182046 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Mar 10 01:56:52.184151 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Mar 10 01:56:52.184348 kernel: acpi PNP0A08:00: _OSC: platform does not support [PCIeHotplug LTR]
Mar 10 01:56:52.185808 kernel: acpi PNP0A08:00: _OSC: OS now controls [PME AER PCIeCapability]
Mar 10 01:56:52.185829 kernel: PCI host bridge to bus 0000:00
Mar 10 01:56:52.186130 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
Mar 10 01:56:52.186293 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window]
Mar 10 01:56:52.187669 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Mar 10 01:56:52.187829 kernel: pci_bus 0000:00: root bus resource [mem 0x9d000000-0xdfffffff window]
Mar 10 01:56:52.188034 kernel: pci_bus 0000:00: root bus resource [mem 0xf0000000-0xfebfffff window]
Mar 10 01:56:52.188188 kernel: pci_bus 0000:00: root bus resource [mem 0x380000000000-0x3807ffffffff window]
Mar 10 01:56:52.188328 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Mar 10 01:56:52.190968 kernel: pci 0000:00:00.0: [8086:29c0] type 00 class 0x060000 conventional PCI endpoint
Mar 10 01:56:52.191265 kernel: pci 0000:00:01.0: [1234:1111] type 00 class 0x030000 conventional PCI endpoint
Mar 10 01:56:52.191438 kernel: pci 0000:00:01.0: BAR 0 [mem 0xc0000000-0xc0ffffff pref]
Mar 10 01:56:52.193840 kernel: pci 0000:00:01.0: BAR 2 [mem 0xc1044000-0xc1044fff]
Mar 10 01:56:52.194128 kernel: pci 0000:00:01.0: ROM [mem 0xffff0000-0xffffffff pref]
Mar 10 01:56:52.194386 kernel: pci 0000:00:01.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Mar 10 01:56:52.195012 kernel: pci 0000:00:02.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Mar 10 01:56:52.195204 kernel: pci 0000:00:02.0: BAR 0 [io 0x6100-0x611f]
Mar 10 01:56:52.195404 kernel: pci 0000:00:02.0: BAR 1 [mem 0xc1043000-0xc1043fff]
Mar 10 01:56:52.195713 kernel: pci 0000:00:02.0: BAR 4 [mem 0x380000000000-0x380000003fff 64bit pref]
Mar 10 01:56:52.196027 kernel: pci 0000:00:03.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
Mar 10 01:56:52.196269 kernel: pci 0000:00:03.0: BAR 0 [io 0x6000-0x607f]
Mar 10 01:56:52.196436 kernel: pci 0000:00:03.0: BAR 1 [mem 0xc1042000-0xc1042fff]
Mar 10 01:56:52.203257 kernel: pci 0000:00:03.0: BAR 4 [mem 0x380000004000-0x380000007fff 64bit pref]
Mar 10 01:56:52.203685 kernel: pci 0000:00:04.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Mar 10 01:56:52.203863 kernel: pci 0000:00:04.0: BAR 0 [io 0x60e0-0x60ff]
Mar 10 01:56:52.204082 kernel: pci 0000:00:04.0: BAR 1 [mem 0xc1041000-0xc1041fff]
Mar 10 01:56:52.204258 kernel: pci 0000:00:04.0: BAR 4 [mem 0x380000008000-0x38000000bfff 64bit pref]
Mar 10 01:56:52.204444 kernel: pci 0000:00:04.0: ROM [mem 0xfffc0000-0xffffffff pref]
Mar 10 01:56:52.204878 kernel: pci 0000:00:1f.0: [8086:2918] type 00 class 0x060100 conventional PCI endpoint
Mar 10 01:56:52.205062 kernel: pci 0000:00:1f.0: quirk: [io 0x0600-0x067f] claimed by ICH6 ACPI/GPIO/TCO
Mar 10 01:56:52.205416 kernel: pci 0000:00:1f.2: [8086:2922] type 00 class 0x010601 conventional PCI endpoint
Mar 10 01:56:52.205809 kernel: pci 0000:00:1f.2: BAR 4 [io 0x60c0-0x60df]
Mar 10 01:56:52.206000 kernel: pci 0000:00:1f.2: BAR 5 [mem 0xc1040000-0xc1040fff]
Mar 10 01:56:52.206316 kernel: pci 0000:00:1f.3: [8086:2930] type 00 class 0x0c0500 conventional PCI endpoint
Mar 10 01:56:52.207145 kernel: pci 0000:00:1f.3: BAR 4 [io 0x6080-0x60bf]
Mar 10 01:56:52.207169 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Mar 10 01:56:52.207181 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Mar 10 01:56:52.207193 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Mar 10 01:56:52.207205 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Mar 10 01:56:52.207216 kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 10
Mar 10 01:56:52.207229 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 10
Mar 10 01:56:52.207247 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 11
Mar 10 01:56:52.207259 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 11
Mar 10 01:56:52.207271 kernel: ACPI: PCI: Interrupt link GSIA configured for IRQ 16
Mar 10 01:56:52.207285 kernel: ACPI: PCI: Interrupt link GSIB configured for IRQ 17
Mar 10 01:56:52.207382 kernel: ACPI: PCI: Interrupt link GSIC configured for IRQ 18
Mar 10 01:56:52.207396 kernel: ACPI: PCI: Interrupt link GSID configured for IRQ 19
Mar 10 01:56:52.207408 kernel: ACPI: PCI: Interrupt link GSIE configured for IRQ 20
Mar 10 01:56:52.207420 kernel: ACPI: PCI: Interrupt link GSIF configured for IRQ 21
Mar 10 01:56:52.207431 kernel: ACPI: PCI: Interrupt link GSIG configured for IRQ 22
Mar 10 01:56:52.207612 kernel: ACPI: PCI: Interrupt link GSIH configured for IRQ 23
Mar 10 01:56:52.207628 kernel: iommu: Default domain type: Translated
Mar 10 01:56:52.207639 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Mar 10 01:56:52.207651 kernel: efivars: Registered efivars operations
Mar 10 01:56:52.207662 kernel: PCI: Using ACPI for IRQ routing
Mar 10 01:56:52.207674 kernel: PCI: pci_cache_line_size set to 64 bytes
Mar 10 01:56:52.207686 kernel: e820: reserve RAM buffer [mem 0x0080b000-0x008fffff]
Mar 10 01:56:52.207697 kernel: e820: reserve RAM buffer [mem 0x00811000-0x008fffff]
Mar 10 01:56:52.207708 kernel: e820: reserve RAM buffer [mem 0x9b2e3018-0x9bffffff]
Mar 10 01:56:52.207718 kernel: e820: reserve RAM buffer [mem 0x9b320018-0x9bffffff]
Mar 10 01:56:52.207733 kernel: e820: reserve RAM buffer [mem 0x9bd3f000-0x9bffffff]
Mar 10 01:56:52.207744 kernel: e820: reserve RAM buffer [mem 0x9c8ed000-0x9fffffff]
Mar 10 01:56:52.207756 kernel: e820: reserve RAM buffer [mem 0x9ce91000-0x9fffffff]
Mar 10 01:56:52.207767 kernel: e820: reserve RAM buffer [mem 0x9cedc000-0x9fffffff]
Mar 10 01:56:52.207947 kernel: pci 0000:00:01.0: vgaarb: setting as boot VGA device
Mar 10 01:56:52.208110 kernel: pci 0000:00:01.0: vgaarb: bridge control possible
Mar 10 01:56:52.208267 kernel: pci 0000:00:01.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Mar 10 01:56:52.208285 kernel: vgaarb: loaded
Mar 10 01:56:52.208297 kernel: hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0
Mar 10 01:56:52.208309 kernel: hpet0: 3 comparators, 64-bit 100.000000 MHz counter
Mar 10 01:56:52.208320 kernel: clocksource: Switched to clocksource kvm-clock
Mar 10 01:56:52.208332 kernel: VFS: Disk quotas dquot_6.6.0
Mar 10 01:56:52.208382 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Mar 10 01:56:52.208395 kernel: pnp: PnP ACPI init
Mar 10 01:56:52.209543 kernel: system 00:05: [mem 0xe0000000-0xefffffff window] has been reserved
Mar 10 01:56:52.209564 kernel: pnp: PnP ACPI: found 6 devices
Mar 10 01:56:52.209628 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Mar 10 01:56:52.209640 kernel: NET: Registered PF_INET protocol family
Mar 10 01:56:52.209650 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear)
Mar 10 01:56:52.209661 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear)
Mar 10 01:56:52.209671 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Mar 10 01:56:52.209682 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear)
Mar 10 01:56:52.209713 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear)
Mar 10 01:56:52.209727 kernel: TCP: Hash tables configured (established 32768 bind 32768)
Mar 10 01:56:52.209740 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear)
Mar 10 01:56:52.209753 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear)
Mar 10 01:56:52.209855 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Mar 10 01:56:52.209868 kernel: NET: Registered PF_XDP protocol family
Mar 10 01:56:52.210055 kernel: pci 0000:00:04.0: ROM [mem 0xfffc0000-0xffffffff pref]: can't claim; no compatible bridge window
Mar 10 01:56:52.210231 kernel: pci 0000:00:04.0: ROM [mem 0x9d000000-0x9d03ffff pref]: assigned
Mar 10 01:56:52.210389 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window]
Mar 10 01:56:52.210749 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window]
Mar 10 01:56:52.210928 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Mar 10 01:56:52.211223 kernel: pci_bus 0000:00: resource 7 [mem 0x9d000000-0xdfffffff window]
Mar 10 01:56:52.211404 kernel: pci_bus 0000:00: resource 8 [mem 0xf0000000-0xfebfffff window]
Mar 10 01:56:52.211812 kernel: pci_bus 0000:00: resource 9 [mem 0x380000000000-0x3807ffffffff window]
Mar 10 01:56:52.211833 kernel: PCI: CLS 0 bytes, default 64
Mar 10 01:56:52.211847 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x233fd7ba1b0, max_idle_ns: 440795295779 ns
Mar 10 01:56:52.211861 kernel: Initialise system trusted keyrings
Mar 10 01:56:52.211874 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0
Mar 10 01:56:52.211886 kernel: Key type asymmetric registered
Mar 10 01:56:52.211904 kernel: Asymmetric key parser 'x509' registered
Mar 10 01:56:52.211916 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250)
Mar 10 01:56:52.211929 kernel: io scheduler mq-deadline registered
Mar 10 01:56:52.211942 kernel: io scheduler kyber registered
Mar 10 01:56:52.211954 kernel: io scheduler bfq registered
Mar 10 01:56:52.211967 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00
Mar 10 01:56:52.211980 kernel: ACPI: \_SB_.GSIG: Enabled at IRQ 22
Mar 10 01:56:52.211993 kernel: ACPI: \_SB_.GSIH: Enabled at IRQ 23
Mar 10 01:56:52.212005 kernel: ACPI: \_SB_.GSIE: Enabled at IRQ 20
Mar 10 01:56:52.212023 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Mar 10 01:56:52.212035 kernel: 00:03: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Mar 10 01:56:52.212047 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Mar 10 01:56:52.212059 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Mar 10 01:56:52.212071 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Mar 10 01:56:52.212083 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0
Mar 10 01:56:52.212443 kernel: rtc_cmos 00:04: RTC can wake from S4
Mar 10 01:56:52.213042 kernel: rtc_cmos 00:04: registered as rtc0
Mar 10 01:56:52.213289 kernel: rtc_cmos 00:04: setting system clock to 2026-03-10T01:56:48 UTC (1773107808)
Mar 10 01:56:52.216713 kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Mar 10 01:56:52.216742 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Mar 10 01:56:52.216756 kernel: efifb: probing for efifb
Mar 10 01:56:52.216768 kernel: efifb: framebuffer at 0xc0000000, using 4000k, total 4000k
Mar 10 01:56:52.216786 kernel: efifb: mode is 1280x800x32, linelength=5120, pages=1
Mar 10 01:56:52.216798 kernel: efifb: scrolling: redraw
Mar 10 01:56:52.216810 kernel: efifb: Truecolor: size=8:8:8:8, shift=24:16:8:0
Mar 10 01:56:52.216825 kernel: Console: switching to colour frame buffer device 160x50
Mar 10 01:56:52.216837 kernel: fb0: EFI VGA frame buffer device
Mar 10 01:56:52.216849 kernel: pstore: Using crash dump compression: deflate
Mar 10 01:56:52.216861 kernel: pstore: Registered efi_pstore as persistent store backend
Mar 10 01:56:52.216873 kernel: NET: Registered PF_INET6 protocol family
Mar 10 01:56:52.216885 kernel: Segment Routing with IPv6
Mar 10 01:56:52.216896 kernel: In-situ OAM (IOAM) with IPv6
Mar 10 01:56:52.216911 kernel: NET: Registered PF_PACKET protocol family
Mar 10 01:56:52.216922 kernel: Key type dns_resolver registered
Mar 10 01:56:52.216935 kernel: IPI shorthand broadcast: enabled
Mar 10 01:56:52.216947 kernel: sched_clock: Marking stable (8378034705,
2785929306)->(12323843401, -1159879390) Mar 10 01:56:52.216959 kernel: registered taskstats version 1 Mar 10 01:56:52.216971 kernel: Loading compiled-in X.509 certificates Mar 10 01:56:52.216984 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.74-flatcar: 64a6e3ad023f02465a8c66e81554b4b2e64fb972' Mar 10 01:56:52.216995 kernel: Demotion targets for Node 0: null Mar 10 01:56:52.217007 kernel: Key type .fscrypt registered Mar 10 01:56:52.217022 kernel: Key type fscrypt-provisioning registered Mar 10 01:56:52.217033 kernel: ima: No TPM chip found, activating TPM-bypass! Mar 10 01:56:52.217048 kernel: ima: Allocated hash algorithm: sha1 Mar 10 01:56:52.217060 kernel: ima: No architecture policies found Mar 10 01:56:52.217072 kernel: clk: Disabling unused clocks Mar 10 01:56:52.217083 kernel: Warning: unable to open an initial console. Mar 10 01:56:52.217096 kernel: Freeing unused kernel image (initmem) memory: 46204K Mar 10 01:56:52.217107 kernel: Write protecting the kernel read-only data: 40960k Mar 10 01:56:52.217122 kernel: Freeing unused kernel image (rodata/data gap) memory: 560K Mar 10 01:56:52.217133 kernel: Run /init as init process Mar 10 01:56:52.217145 kernel: with arguments: Mar 10 01:56:52.217157 kernel: /init Mar 10 01:56:52.217169 kernel: with environment: Mar 10 01:56:52.217180 kernel: HOME=/ Mar 10 01:56:52.217191 kernel: TERM=linux Mar 10 01:56:52.217205 systemd[1]: Successfully made /usr/ read-only. Mar 10 01:56:52.217222 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Mar 10 01:56:52.217238 systemd[1]: Detected virtualization kvm. Mar 10 01:56:52.217250 systemd[1]: Detected architecture x86-64. 
Mar 10 01:56:52.217262 systemd[1]: Running in initrd.
Mar 10 01:56:52.217273 systemd[1]: No hostname configured, using default hostname.
Mar 10 01:56:52.217287 systemd[1]: Hostname set to .
Mar 10 01:56:52.217298 systemd[1]: Initializing machine ID from VM UUID.
Mar 10 01:56:52.217311 systemd[1]: Queued start job for default target initrd.target.
Mar 10 01:56:52.217326 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Mar 10 01:56:52.217339 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Mar 10 01:56:52.217352 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
Mar 10 01:56:52.217365 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Mar 10 01:56:52.217377 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
Mar 10 01:56:52.217391 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
Mar 10 01:56:52.217404 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132...
Mar 10 01:56:52.217420 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr...
Mar 10 01:56:52.217432 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Mar 10 01:56:52.217444 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Mar 10 01:56:52.217532 systemd[1]: Reached target paths.target - Path Units.
Mar 10 01:56:52.217545 systemd[1]: Reached target slices.target - Slice Units.
Mar 10 01:56:52.217557 systemd[1]: Reached target swap.target - Swaps.
Mar 10 01:56:52.217569 systemd[1]: Reached target timers.target - Timer Units.
Mar 10 01:56:52.217624 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
Mar 10 01:56:52.217637 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Mar 10 01:56:52.217656 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
Mar 10 01:56:52.217669 systemd[1]: Listening on systemd-journald.socket - Journal Sockets.
Mar 10 01:56:52.217682 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Mar 10 01:56:52.217694 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Mar 10 01:56:52.217706 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Mar 10 01:56:52.217719 systemd[1]: Reached target sockets.target - Socket Units.
Mar 10 01:56:52.217731 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
Mar 10 01:56:52.217744 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Mar 10 01:56:52.217761 systemd[1]: Finished network-cleanup.service - Network Cleanup.
Mar 10 01:56:52.217774 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply).
Mar 10 01:56:52.217785 systemd[1]: Starting systemd-fsck-usr.service...
Mar 10 01:56:52.217796 systemd[1]: Starting systemd-journald.service - Journal Service...
Mar 10 01:56:52.217809 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Mar 10 01:56:52.217821 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 10 01:56:52.217834 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup.
Mar 10 01:56:52.217852 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Mar 10 01:56:52.217865 systemd[1]: Finished systemd-fsck-usr.service.
Mar 10 01:56:52.217970 systemd-journald[203]: Collecting audit messages is disabled.
Mar 10 01:56:52.218011 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Mar 10 01:56:52.218025 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Mar 10 01:56:52.218040 systemd-journald[203]: Journal started
Mar 10 01:56:52.218068 systemd-journald[203]: Runtime Journal (/run/log/journal/53d474e57ce94272990f6cc9b859a4f4) is 6M, max 48.1M, 42.1M free.
Mar 10 01:56:52.205979 systemd-modules-load[204]: Inserted module 'overlay'
Mar 10 01:56:52.260977 systemd[1]: Started systemd-journald.service - Journal Service.
Mar 10 01:56:52.285110 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Mar 10 01:56:52.312851 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Mar 10 01:56:52.374725 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Mar 10 01:56:52.375865 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Mar 10 01:56:52.413190 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Mar 10 01:56:52.448940 kernel: Bridge firewalling registered
Mar 10 01:56:52.444148 systemd-modules-load[204]: Inserted module 'br_netfilter'
Mar 10 01:56:52.464880 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Mar 10 01:56:52.473632 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Mar 10 01:56:52.474301 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Mar 10 01:56:52.494223 systemd-tmpfiles[227]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring.
Mar 10 01:56:52.502932 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook...
Mar 10 01:56:52.566932 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Mar 10 01:56:52.577072 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Mar 10 01:56:52.619443 dracut-cmdline[237]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected verity.usrhash=bcd0808bf4ec60436f0ff2e8373a873eb88ae42d4ac26e6e6d81129499700895
Mar 10 01:56:52.654687 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Mar 10 01:56:52.681900 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Mar 10 01:56:52.811812 systemd-resolved[267]: Positive Trust Anchors:
Mar 10 01:56:52.811854 systemd-resolved[267]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Mar 10 01:56:52.811896 systemd-resolved[267]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Mar 10 01:56:52.823329 systemd-resolved[267]: Defaulting to hostname 'linux'.
Mar 10 01:56:52.830882 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Mar 10 01:56:52.945676 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Mar 10 01:56:53.242649 kernel: SCSI subsystem initialized
Mar 10 01:56:53.274764 kernel: Loading iSCSI transport class v2.0-870.
Mar 10 01:56:53.326964 kernel: iscsi: registered transport (tcp)
Mar 10 01:56:53.396283 kernel: iscsi: registered transport (qla4xxx)
Mar 10 01:56:53.396367 kernel: QLogic iSCSI HBA Driver
Mar 10 01:56:53.522242 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Mar 10 01:56:53.586372 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Mar 10 01:56:53.606337 systemd[1]: Reached target network-pre.target - Preparation for Network.
Mar 10 01:56:53.852199 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
Mar 10 01:56:53.866927 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
Mar 10 01:56:54.037697 kernel: raid6: avx2x4 gen() 13482 MB/s
Mar 10 01:56:54.059701 kernel: raid6: avx2x2 gen() 14359 MB/s
Mar 10 01:56:54.089701 kernel: raid6: avx2x1 gen() 8951 MB/s
Mar 10 01:56:54.089795 kernel: raid6: using algorithm avx2x2 gen() 14359 MB/s
Mar 10 01:56:54.106040 kernel: raid6: .... xor() 11493 MB/s, rmw enabled
Mar 10 01:56:54.106124 kernel: raid6: using avx2x2 recovery algorithm
Mar 10 01:56:54.147334 kernel: xor: automatically using best checksumming function avx
Mar 10 01:56:54.812726 kernel: Btrfs loaded, zoned=no, fsverity=no
Mar 10 01:56:54.844415 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
Mar 10 01:56:54.866948 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Mar 10 01:56:54.967880 systemd-udevd[452]: Using default interface naming scheme 'v255'.
Mar 10 01:56:54.986718 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Mar 10 01:56:54.991854 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
Mar 10 01:56:55.135841 dracut-pre-trigger[454]: rd.md=0: removing MD RAID activation
Mar 10 01:56:55.324082 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
Mar 10 01:56:55.350423 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Mar 10 01:56:55.629221 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Mar 10 01:56:55.656014 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook...
Mar 10 01:56:55.916106 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Mar 10 01:56:55.916255 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Mar 10 01:56:55.949263 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup...
Mar 10 01:56:55.981174 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 10 01:56:55.997993 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully.
Mar 10 01:56:56.032994 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Mar 10 01:56:56.033176 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Mar 10 01:56:56.107106 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 10 01:56:56.136910 kernel: virtio_blk virtio1: 4/0/0 default/read/poll queues
Mar 10 01:56:56.180025 kernel: virtio_blk virtio1: [vda] 19775488 512-byte logical blocks (10.1 GB/9.43 GiB)
Mar 10 01:56:56.194531 kernel: cryptd: max_cpu_qlen set to 1000
Mar 10 01:56:56.215333 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk.
Mar 10 01:56:56.215402 kernel: GPT:9289727 != 19775487
Mar 10 01:56:56.215418 kernel: GPT:Alternate GPT header not at the end of the disk.
Mar 10 01:56:56.215432 kernel: GPT:9289727 != 19775487
Mar 10 01:56:56.215445 kernel: GPT: Use GNU Parted to correct GPT errors.
Mar 10 01:56:56.215533 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
Mar 10 01:56:56.316270 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Mar 10 01:56:56.402527 kernel: libata version 3.00 loaded.
Mar 10 01:56:56.550824 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input2
Mar 10 01:56:56.598555 kernel: AES CTR mode by8 optimization enabled
Mar 10 01:56:56.641644 kernel: ahci 0000:00:1f.2: version 3.0
Mar 10 01:56:56.656540 kernel: ACPI: \_SB_.GSIA: Enabled at IRQ 16
Mar 10 01:56:56.669638 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132.
Mar 10 01:56:56.720358 kernel: ahci 0000:00:1f.2: AHCI vers 0001.0000, 32 command slots, 1.5 Gbps, SATA mode
Mar 10 01:56:56.721555 kernel: ahci 0000:00:1f.2: 6/6 ports implemented (port mask 0x3f)
Mar 10 01:56:56.722197 kernel: ahci 0000:00:1f.2: flags: 64bit ncq only
Mar 10 01:56:56.744557 kernel: scsi host0: ahci
Mar 10 01:56:56.764954 kernel: scsi host1: ahci
Mar 10 01:56:56.771715 kernel: scsi host2: ahci
Mar 10 01:56:56.776579 kernel: scsi host3: ahci
Mar 10 01:56:56.780867 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A.
Mar 10 01:56:56.877901 kernel: scsi host4: ahci
Mar 10 01:56:56.878259 kernel: scsi host5: ahci
Mar 10 01:56:56.878655 kernel: ata1: SATA max UDMA/133 abar m4096@0xc1040000 port 0xc1040100 irq 34 lpm-pol 1
Mar 10 01:56:56.878676 kernel: ata2: SATA max UDMA/133 abar m4096@0xc1040000 port 0xc1040180 irq 34 lpm-pol 1
Mar 10 01:56:56.878691 kernel: ata3: SATA max UDMA/133 abar m4096@0xc1040000 port 0xc1040200 irq 34 lpm-pol 1
Mar 10 01:56:56.878704 kernel: ata4: SATA max UDMA/133 abar m4096@0xc1040000 port 0xc1040280 irq 34 lpm-pol 1
Mar 10 01:56:56.878716 kernel: ata5: SATA max UDMA/133 abar m4096@0xc1040000 port 0xc1040300 irq 34 lpm-pol 1
Mar 10 01:56:56.878741 kernel: ata6: SATA max UDMA/133 abar m4096@0xc1040000 port 0xc1040380 irq 34 lpm-pol 1
Mar 10 01:56:56.928028 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM.
Mar 10 01:56:56.963201 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT.
Mar 10 01:56:57.012436 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM.
Mar 10 01:56:57.037914 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary...
Mar 10 01:56:57.148138 disk-uuid[620]: Primary Header is updated.
Mar 10 01:56:57.148138 disk-uuid[620]: Secondary Entries is updated.
Mar 10 01:56:57.148138 disk-uuid[620]: Secondary Header is updated.
Mar 10 01:56:57.205750 kernel: ata1: SATA link down (SStatus 0 SControl 300)
Mar 10 01:56:57.206415 kernel: GPT:disk_guids don't match.
Mar 10 01:56:57.206589 kernel: ata3: SATA link up 1.5 Gbps (SStatus 113 SControl 300)
Mar 10 01:56:57.209545 kernel: GPT: Use GNU Parted to correct GPT errors.
Mar 10 01:56:57.209861 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
Mar 10 01:56:57.256704 kernel: ata4: SATA link down (SStatus 0 SControl 300)
Mar 10 01:56:57.256852 kernel: ata5: SATA link down (SStatus 0 SControl 300)
Mar 10 01:56:57.263717 kernel: ata2: SATA link down (SStatus 0 SControl 300)
Mar 10 01:56:57.282114 kernel: ata6: SATA link down (SStatus 0 SControl 300)
Mar 10 01:56:57.290508 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
Mar 10 01:56:57.293778 kernel: ata3.00: LPM support broken, forcing max_power
Mar 10 01:56:57.293798 kernel: ata3.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Mar 10 01:56:57.293812 kernel: ata3.00: applying bridge limits
Mar 10 01:56:57.299719 kernel: ata3.00: LPM support broken, forcing max_power
Mar 10 01:56:57.319225 kernel: ata3.00: configured for UDMA/100
Mar 10 01:56:57.331806 kernel: scsi 2:0:0:0: CD-ROM QEMU QEMU DVD-ROM 2.5+ PQ: 0 ANSI: 5
Mar 10 01:56:57.566218 kernel: sr 2:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Mar 10 01:56:57.572681 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Mar 10 01:56:57.628691 kernel: sr 2:0:0:0: Attached scsi CD-ROM sr0
Mar 10 01:56:58.274016 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
Mar 10 01:56:58.280957 disk-uuid[621]: The operation has completed successfully.
Mar 10 01:56:58.557094 systemd[1]: disk-uuid.service: Deactivated successfully.
Mar 10 01:56:58.557377 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary.
Mar 10 01:56:58.594437 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook.
Mar 10 01:56:58.802762 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems.
Mar 10 01:56:58.810314 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Mar 10 01:56:58.819933 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Mar 10 01:56:58.831106 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook...
Mar 10 01:56:58.836879 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr...
Mar 10 01:56:58.897735 sh[649]: Success
Mar 10 01:56:58.912574 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook.
Mar 10 01:56:58.953036 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Mar 10 01:56:58.953129 kernel: device-mapper: uevent: version 1.0.3
Mar 10 01:56:58.957264 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev
Mar 10 01:56:59.014330 kernel: device-mapper: verity: sha256 using shash "sha256-ni"
Mar 10 01:56:59.144996 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr.
Mar 10 01:56:59.168826 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr...
Mar 10 01:56:59.183734 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr.
Mar 10 01:56:59.307591 kernel: BTRFS: device fsid 91a17919-8e0b-4e39-b5e3-1547b6175986 devid 1 transid 37 /dev/mapper/usr (253:0) scanned by mount (668)
Mar 10 01:56:59.325736 kernel: BTRFS info (device dm-0): first mount of filesystem 91a17919-8e0b-4e39-b5e3-1547b6175986
Mar 10 01:56:59.325816 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm
Mar 10 01:56:59.430594 kernel: BTRFS info (device dm-0 state E): disabling log replay at mount time
Mar 10 01:56:59.431879 kernel: BTRFS info (device dm-0 state E): enabling free space tree
Mar 10 01:56:59.441900 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr.
Mar 10 01:56:59.452264 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System.
Mar 10 01:56:59.474984 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met.
Mar 10 01:56:59.481963 systemd[1]: Starting ignition-setup.service - Ignition (setup)...
Mar 10 01:56:59.558933 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline...
Mar 10 01:56:59.661730 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (691)
Mar 10 01:56:59.694894 kernel: BTRFS info (device vda6): first mount of filesystem ee81d5fa-b10d-48ad-a53f-95a2476266f6
Mar 10 01:56:59.694981 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm
Mar 10 01:56:59.742335 kernel: BTRFS info (device vda6): turning on async discard
Mar 10 01:56:59.742417 kernel: BTRFS info (device vda6): enabling free space tree
Mar 10 01:56:59.776793 kernel: BTRFS info (device vda6): last unmount of filesystem ee81d5fa-b10d-48ad-a53f-95a2476266f6
Mar 10 01:56:59.805773 systemd[1]: Finished ignition-setup.service - Ignition (setup).
Mar 10 01:56:59.833023 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)...
Mar 10 01:57:00.264937 ignition[750]: Ignition 2.22.0
Mar 10 01:57:00.264992 ignition[750]: Stage: fetch-offline
Mar 10 01:57:00.265044 ignition[750]: no configs at "/usr/lib/ignition/base.d"
Mar 10 01:57:00.265058 ignition[750]: no config dir at "/usr/lib/ignition/base.platform.d/qemu"
Mar 10 01:57:00.265177 ignition[750]: parsed url from cmdline: ""
Mar 10 01:57:00.265183 ignition[750]: no config URL provided
Mar 10 01:57:00.265191 ignition[750]: reading system config file "/usr/lib/ignition/user.ign"
Mar 10 01:57:00.265204 ignition[750]: no config at "/usr/lib/ignition/user.ign"
Mar 10 01:57:00.265245 ignition[750]: op(1): [started] loading QEMU firmware config module
Mar 10 01:57:00.265252 ignition[750]: op(1): executing: "modprobe" "qemu_fw_cfg"
Mar 10 01:57:00.316864 ignition[750]: op(1): [finished] loading QEMU firmware config module
Mar 10 01:57:00.331879 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Mar 10 01:57:00.375867 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Mar 10 01:57:00.517121 systemd-networkd[846]: lo: Link UP
Mar 10 01:57:00.517183 systemd-networkd[846]: lo: Gained carrier
Mar 10 01:57:00.524173 systemd-networkd[846]: Enumeration completed
Mar 10 01:57:00.524709 systemd[1]: Started systemd-networkd.service - Network Configuration.
Mar 10 01:57:00.527940 systemd-networkd[846]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 10 01:57:00.527947 systemd-networkd[846]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Mar 10 01:57:00.533332 systemd-networkd[846]: eth0: Link UP
Mar 10 01:57:00.535718 systemd-networkd[846]: eth0: Gained carrier
Mar 10 01:57:00.535737 systemd-networkd[846]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 10 01:57:00.633947 systemd[1]: Reached target network.target - Network.
Mar 10 01:57:00.676680 systemd-networkd[846]: eth0: DHCPv4 address 10.0.0.74/16, gateway 10.0.0.1 acquired from 10.0.0.1
Mar 10 01:57:00.830916 systemd-resolved[267]: Detected conflict on linux IN A 10.0.0.74
Mar 10 01:57:00.830979 systemd-resolved[267]: Hostname conflict, changing published hostname from 'linux' to 'linux2'.
Mar 10 01:57:00.871817 ignition[750]: parsing config with SHA512: 89ae9471b1d212ea33e0fc6d4299fdd9fe101659a8f91499303accd8855bce54985418143c3518dcee1b74857467d0230847d3dc720e26107fea42a84a957c76
Mar 10 01:57:00.908318 unknown[750]: fetched base config from "system"
Mar 10 01:57:00.908330 unknown[750]: fetched user config from "qemu"
Mar 10 01:57:00.914423 ignition[750]: fetch-offline: fetch-offline passed
Mar 10 01:57:00.914657 ignition[750]: Ignition finished successfully
Mar 10 01:57:00.957810 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline).
Mar 10 01:57:00.969025 systemd[1]: ignition-fetch.service - Ignition (fetch) was skipped because of an unmet condition check (ConditionPathExists=!/run/ignition.json). Mar 10 01:57:00.972406 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Mar 10 01:57:01.122307 ignition[851]: Ignition 2.22.0 Mar 10 01:57:01.122962 ignition[851]: Stage: kargs Mar 10 01:57:01.126997 ignition[851]: no configs at "/usr/lib/ignition/base.d" Mar 10 01:57:01.150186 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Mar 10 01:57:01.127015 ignition[851]: no config dir at "/usr/lib/ignition/base.platform.d/qemu" Mar 10 01:57:01.132335 ignition[851]: kargs: kargs passed Mar 10 01:57:01.132405 ignition[851]: Ignition finished successfully Mar 10 01:57:01.202408 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Mar 10 01:57:01.295253 ignition[860]: Ignition 2.22.0 Mar 10 01:57:01.295344 ignition[860]: Stage: disks Mar 10 01:57:01.295707 ignition[860]: no configs at "/usr/lib/ignition/base.d" Mar 10 01:57:01.295723 ignition[860]: no config dir at "/usr/lib/ignition/base.platform.d/qemu" Mar 10 01:57:01.301082 ignition[860]: disks: disks passed Mar 10 01:57:01.301154 ignition[860]: Ignition finished successfully Mar 10 01:57:01.345710 systemd[1]: Finished ignition-disks.service - Ignition (disks). Mar 10 01:57:01.370039 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Mar 10 01:57:01.376055 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Mar 10 01:57:01.415202 systemd[1]: Reached target local-fs.target - Local File Systems. Mar 10 01:57:01.434114 systemd[1]: Reached target sysinit.target - System Initialization. Mar 10 01:57:01.446728 systemd[1]: Reached target basic.target - Basic System. Mar 10 01:57:01.470720 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... 
Mar 10 01:57:01.597307 systemd-fsck[869]: ROOT: clean, 15/553520 files, 52789/553472 blocks Mar 10 01:57:01.613138 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Mar 10 01:57:01.634104 systemd[1]: Mounting sysroot.mount - /sysroot... Mar 10 01:57:01.785844 systemd-networkd[846]: eth0: Gained IPv6LL Mar 10 01:57:02.477846 kernel: EXT4-fs (vda9): mounted filesystem 494bf987-03e9-4980-9fc3-4af435e63ebe r/w with ordered data mode. Quota mode: none. Mar 10 01:57:02.480357 systemd[1]: Mounted sysroot.mount - /sysroot. Mar 10 01:57:02.498947 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Mar 10 01:57:02.542279 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Mar 10 01:57:02.576838 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Mar 10 01:57:02.597393 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. Mar 10 01:57:02.623836 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Mar 10 01:57:02.623917 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Mar 10 01:57:02.676538 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (877) Mar 10 01:57:02.687574 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Mar 10 01:57:02.747703 kernel: BTRFS info (device vda6): first mount of filesystem ee81d5fa-b10d-48ad-a53f-95a2476266f6 Mar 10 01:57:02.747740 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Mar 10 01:57:02.747758 kernel: BTRFS info (device vda6): turning on async discard Mar 10 01:57:02.747773 kernel: BTRFS info (device vda6): enabling free space tree Mar 10 01:57:02.727380 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
Mar 10 01:57:02.753063 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
Mar 10 01:57:03.772872 kernel: clocksource: Long readout interval, skipping watchdog check: cs_nsec: 1130559466 wd_nsec: 1130559404
Mar 10 01:57:03.829418 initrd-setup-root[901]: cut: /sysroot/etc/passwd: No such file or directory
Mar 10 01:57:03.870840 initrd-setup-root[908]: cut: /sysroot/etc/group: No such file or directory
Mar 10 01:57:03.889749 initrd-setup-root[915]: cut: /sysroot/etc/shadow: No such file or directory
Mar 10 01:57:03.982209 initrd-setup-root[922]: cut: /sysroot/etc/gshadow: No such file or directory
Mar 10 01:57:04.580987 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
Mar 10 01:57:04.609116 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
Mar 10 01:57:04.664877 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
Mar 10 01:57:04.709333 systemd[1]: sysroot-oem.mount: Deactivated successfully.
Mar 10 01:57:04.723364 kernel: BTRFS info (device vda6): last unmount of filesystem ee81d5fa-b10d-48ad-a53f-95a2476266f6
Mar 10 01:57:04.772717 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
Mar 10 01:57:04.847259 ignition[990]: INFO : Ignition 2.22.0
Mar 10 01:57:04.859907 ignition[990]: INFO : Stage: mount
Mar 10 01:57:04.859907 ignition[990]: INFO : no configs at "/usr/lib/ignition/base.d"
Mar 10 01:57:04.859907 ignition[990]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu"
Mar 10 01:57:04.894229 ignition[990]: INFO : mount: mount passed
Mar 10 01:57:04.894229 ignition[990]: INFO : Ignition finished successfully
Mar 10 01:57:04.893239 systemd[1]: Finished ignition-mount.service - Ignition (mount).
Mar 10 01:57:04.910269 systemd[1]: Starting ignition-files.service - Ignition (files)...
Mar 10 01:57:04.984771 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Mar 10 01:57:05.027356 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (1003)
Mar 10 01:57:05.038918 kernel: BTRFS info (device vda6): first mount of filesystem ee81d5fa-b10d-48ad-a53f-95a2476266f6
Mar 10 01:57:05.038984 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm
Mar 10 01:57:05.074001 kernel: BTRFS info (device vda6): turning on async discard
Mar 10 01:57:05.074133 kernel: BTRFS info (device vda6): enabling free space tree
Mar 10 01:57:05.079131 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Mar 10 01:57:05.198704 ignition[1020]: INFO : Ignition 2.22.0
Mar 10 01:57:05.198704 ignition[1020]: INFO : Stage: files
Mar 10 01:57:05.210152 ignition[1020]: INFO : no configs at "/usr/lib/ignition/base.d"
Mar 10 01:57:05.210152 ignition[1020]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu"
Mar 10 01:57:05.210152 ignition[1020]: DEBUG : files: compiled without relabeling support, skipping
Mar 10 01:57:05.210152 ignition[1020]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Mar 10 01:57:05.210152 ignition[1020]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Mar 10 01:57:05.256720 ignition[1020]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Mar 10 01:57:05.256720 ignition[1020]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Mar 10 01:57:05.256720 ignition[1020]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Mar 10 01:57:05.256720 ignition[1020]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.3-linux-amd64.tar.gz"
Mar 10 01:57:05.256720 ignition[1020]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.3-linux-amd64.tar.gz: attempt #1
Mar 10 01:57:05.232692 unknown[1020]: wrote ssh authorized keys file for user: core
Mar 10 01:57:05.355571 ignition[1020]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Mar 10 01:57:05.739545 ignition[1020]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.3-linux-amd64.tar.gz"
Mar 10 01:57:05.767566 ignition[1020]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
Mar 10 01:57:05.787535 ignition[1020]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Mar 10 01:57:05.787535 ignition[1020]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
Mar 10 01:57:05.787535 ignition[1020]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Mar 10 01:57:05.787535 ignition[1020]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Mar 10 01:57:05.895783 ignition[1020]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Mar 10 01:57:05.895783 ignition[1020]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Mar 10 01:57:05.895783 ignition[1020]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Mar 10 01:57:05.895783 ignition[1020]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
Mar 10 01:57:05.895783 ignition[1020]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Mar 10 01:57:05.895783 ignition[1020]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.34.4-x86-64.raw"
Mar 10 01:57:05.895783 ignition[1020]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.34.4-x86-64.raw"
Mar 10 01:57:05.895783 ignition[1020]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.34.4-x86-64.raw"
Mar 10 01:57:05.895783 ignition[1020]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.34.4-x86-64.raw: attempt #1
Mar 10 01:57:06.165599 ignition[1020]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Mar 10 01:57:09.290580 ignition[1020]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.34.4-x86-64.raw"
Mar 10 01:57:09.290580 ignition[1020]: INFO : files: op(b): [started] processing unit "prepare-helm.service"
Mar 10 01:57:09.318221 ignition[1020]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Mar 10 01:57:09.337380 ignition[1020]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Mar 10 01:57:09.337380 ignition[1020]: INFO : files: op(b): [finished] processing unit "prepare-helm.service"
Mar 10 01:57:09.337380 ignition[1020]: INFO : files: op(d): [started] processing unit "coreos-metadata.service"
Mar 10 01:57:09.337380 ignition[1020]: INFO : files: op(d): op(e): [started] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service"
Mar 10 01:57:09.337380 ignition[1020]: INFO : files: op(d): op(e): [finished] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service"
Mar 10 01:57:09.337380 ignition[1020]: INFO : files: op(d): [finished] processing unit "coreos-metadata.service"
Mar 10 01:57:09.337380 ignition[1020]: INFO : files: op(f): [started] setting preset to disabled for "coreos-metadata.service"
Mar 10 01:57:09.509566 ignition[1020]: INFO : files: op(f): op(10): [started] removing enablement symlink(s) for "coreos-metadata.service"
Mar 10 01:57:09.534236 ignition[1020]: INFO : files: op(f): op(10): [finished] removing enablement symlink(s) for "coreos-metadata.service"
Mar 10 01:57:09.547354 ignition[1020]: INFO : files: op(f): [finished] setting preset to disabled for "coreos-metadata.service"
Mar 10 01:57:09.547354 ignition[1020]: INFO : files: op(11): [started] setting preset to enabled for "prepare-helm.service"
Mar 10 01:57:09.547354 ignition[1020]: INFO : files: op(11): [finished] setting preset to enabled for "prepare-helm.service"
Mar 10 01:57:09.547354 ignition[1020]: INFO : files: createResultFile: createFiles: op(12): [started] writing file "/sysroot/etc/.ignition-result.json"
Mar 10 01:57:09.547354 ignition[1020]: INFO : files: createResultFile: createFiles: op(12): [finished] writing file "/sysroot/etc/.ignition-result.json"
Mar 10 01:57:09.547354 ignition[1020]: INFO : files: files passed
Mar 10 01:57:09.547354 ignition[1020]: INFO : Ignition finished successfully
Mar 10 01:57:09.542379 systemd[1]: Finished ignition-files.service - Ignition (files).
Mar 10 01:57:09.624741 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Mar 10 01:57:09.629270 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Mar 10 01:57:09.688609 systemd[1]: ignition-quench.service: Deactivated successfully.
Mar 10 01:57:09.688818 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Mar 10 01:57:09.711923 initrd-setup-root-after-ignition[1048]: grep: /sysroot/oem/oem-release: No such file or directory
Mar 10 01:57:09.718070 initrd-setup-root-after-ignition[1051]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Mar 10 01:57:09.718070 initrd-setup-root-after-ignition[1051]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Mar 10 01:57:09.734942 initrd-setup-root-after-ignition[1055]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Mar 10 01:57:09.726550 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Mar 10 01:57:09.735569 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Mar 10 01:57:09.762418 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Mar 10 01:57:10.055731 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Mar 10 01:57:10.056076 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Mar 10 01:57:10.074520 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Mar 10 01:57:10.083384 systemd[1]: Reached target initrd.target - Initrd Default Target.
Mar 10 01:57:10.090317 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Mar 10 01:57:10.107167 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Mar 10 01:57:10.363557 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Mar 10 01:57:10.385222 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Mar 10 01:57:10.601053 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Mar 10 01:57:10.621729 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Mar 10 01:57:10.660685 systemd[1]: Stopped target timers.target - Timer Units.
Mar 10 01:57:10.670357 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Mar 10 01:57:10.675314 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Mar 10 01:57:10.704630 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Mar 10 01:57:10.710099 systemd[1]: Stopped target basic.target - Basic System.
Mar 10 01:57:10.727559 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Mar 10 01:57:10.736925 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Mar 10 01:57:10.745093 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Mar 10 01:57:10.775273 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System.
Mar 10 01:57:10.804362 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Mar 10 01:57:10.817027 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Mar 10 01:57:10.847405 systemd[1]: Stopped target sysinit.target - System Initialization.
Mar 10 01:57:10.869329 systemd[1]: Stopped target local-fs.target - Local File Systems.
Mar 10 01:57:10.890562 systemd[1]: Stopped target swap.target - Swaps.
Mar 10 01:57:10.897693 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Mar 10 01:57:10.901396 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Mar 10 01:57:10.923808 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Mar 10 01:57:10.954956 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Mar 10 01:57:10.969827 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Mar 10 01:57:10.971740 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Mar 10 01:57:10.982321 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Mar 10 01:57:10.982730 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Mar 10 01:57:10.997997 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Mar 10 01:57:10.999890 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Mar 10 01:57:11.027536 systemd[1]: Stopped target paths.target - Path Units.
Mar 10 01:57:11.042727 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Mar 10 01:57:11.045139 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Mar 10 01:57:11.056315 systemd[1]: Stopped target slices.target - Slice Units.
Mar 10 01:57:11.075893 systemd[1]: Stopped target sockets.target - Socket Units.
Mar 10 01:57:11.099924 systemd[1]: iscsid.socket: Deactivated successfully.
Mar 10 01:57:11.100179 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Mar 10 01:57:11.103619 systemd[1]: iscsiuio.socket: Deactivated successfully.
Mar 10 01:57:11.103845 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Mar 10 01:57:11.134315 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Mar 10 01:57:11.134614 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Mar 10 01:57:11.158175 systemd[1]: ignition-files.service: Deactivated successfully.
Mar 10 01:57:11.162117 systemd[1]: Stopped ignition-files.service - Ignition (files).
Mar 10 01:57:11.209195 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Mar 10 01:57:11.238890 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Mar 10 01:57:11.281294 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Mar 10 01:57:11.281742 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Mar 10 01:57:11.306135 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Mar 10 01:57:11.306371 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Mar 10 01:57:11.377559 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Mar 10 01:57:11.377968 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Mar 10 01:57:11.429808 ignition[1075]: INFO : Ignition 2.22.0
Mar 10 01:57:11.429808 ignition[1075]: INFO : Stage: umount
Mar 10 01:57:11.429808 ignition[1075]: INFO : no configs at "/usr/lib/ignition/base.d"
Mar 10 01:57:11.429808 ignition[1075]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu"
Mar 10 01:57:11.461183 ignition[1075]: INFO : umount: umount passed
Mar 10 01:57:11.461183 ignition[1075]: INFO : Ignition finished successfully
Mar 10 01:57:11.445885 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Mar 10 01:57:11.476368 systemd[1]: ignition-mount.service: Deactivated successfully.
Mar 10 01:57:11.476987 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Mar 10 01:57:11.494319 systemd[1]: sysroot-boot.service: Deactivated successfully.
Mar 10 01:57:11.494578 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Mar 10 01:57:11.518183 systemd[1]: Stopped target network.target - Network.
Mar 10 01:57:11.531769 systemd[1]: ignition-disks.service: Deactivated successfully.
Mar 10 01:57:11.531938 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Mar 10 01:57:11.539080 systemd[1]: ignition-kargs.service: Deactivated successfully.
Mar 10 01:57:11.539179 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Mar 10 01:57:11.544890 systemd[1]: ignition-setup.service: Deactivated successfully.
Mar 10 01:57:11.544986 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Mar 10 01:57:11.603074 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Mar 10 01:57:11.603373 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Mar 10 01:57:11.609369 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Mar 10 01:57:11.609543 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Mar 10 01:57:11.628715 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Mar 10 01:57:11.670126 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Mar 10 01:57:11.674393 systemd[1]: systemd-networkd.service: Deactivated successfully.
Mar 10 01:57:11.674829 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Mar 10 01:57:11.711934 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully.
Mar 10 01:57:11.728593 systemd[1]: Stopped target network-pre.target - Preparation for Network.
Mar 10 01:57:11.740089 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Mar 10 01:57:11.740183 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Mar 10 01:57:11.744870 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Mar 10 01:57:11.747015 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Mar 10 01:57:11.747103 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Mar 10 01:57:11.747300 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Mar 10 01:57:11.747884 systemd[1]: systemd-resolved.service: Deactivated successfully.
Mar 10 01:57:11.750778 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Mar 10 01:57:11.864242 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully.
Mar 10 01:57:11.877312 systemd[1]: systemd-udevd.service: Deactivated successfully.
Mar 10 01:57:11.878045 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Mar 10 01:57:11.932352 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Mar 10 01:57:11.932543 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Mar 10 01:57:11.969811 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Mar 10 01:57:11.970065 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Mar 10 01:57:11.978788 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Mar 10 01:57:11.978882 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Mar 10 01:57:11.993163 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Mar 10 01:57:11.993244 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Mar 10 01:57:12.001807 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Mar 10 01:57:12.001915 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Mar 10 01:57:12.017523 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Mar 10 01:57:12.021881 systemd[1]: systemd-network-generator.service: Deactivated successfully.
Mar 10 01:57:12.021997 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line.
Mar 10 01:57:12.102830 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Mar 10 01:57:12.102963 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Mar 10 01:57:12.136229 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Mar 10 01:57:12.136332 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Mar 10 01:57:12.169388 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Mar 10 01:57:12.169626 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Mar 10 01:57:12.212171 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully.
Mar 10 01:57:12.212262 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Mar 10 01:57:12.255227 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Mar 10 01:57:12.255317 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Mar 10 01:57:12.272228 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Mar 10 01:57:12.277299 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Mar 10 01:57:12.357865 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Mar 10 01:57:12.357963 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Mar 10 01:57:12.490606 systemd[1]: run-credentials-systemd\x2dnetwork\x2dgenerator.service.mount: Deactivated successfully.
Mar 10 01:57:12.491089 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Mar 10 01:57:12.491212 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev\x2dearly.service.mount: Deactivated successfully.
Mar 10 01:57:12.491326 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Mar 10 01:57:12.492325 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully.
Mar 10 01:57:12.492411 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Mar 10 01:57:12.502441 systemd[1]: network-cleanup.service: Deactivated successfully.
Mar 10 01:57:12.506398 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Mar 10 01:57:12.583938 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Mar 10 01:57:12.584277 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Mar 10 01:57:12.613376 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Mar 10 01:57:12.642137 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Mar 10 01:57:12.731842 systemd[1]: Switching root.
Mar 10 01:57:12.816425 systemd-journald[203]: Journal stopped
Mar 10 01:57:18.140835 systemd-journald[203]: Received SIGTERM from PID 1 (systemd).
Mar 10 01:57:18.140916 kernel: SELinux: policy capability network_peer_controls=1
Mar 10 01:57:18.140935 kernel: SELinux: policy capability open_perms=1
Mar 10 01:57:18.140956 kernel: SELinux: policy capability extended_socket_class=1
Mar 10 01:57:18.140971 kernel: SELinux: policy capability always_check_network=0
Mar 10 01:57:18.140986 kernel: SELinux: policy capability cgroup_seclabel=1
Mar 10 01:57:18.141007 kernel: SELinux: policy capability nnp_nosuid_transition=1
Mar 10 01:57:18.141021 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Mar 10 01:57:18.141035 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Mar 10 01:57:18.141057 kernel: SELinux: policy capability userspace_initial_context=0
Mar 10 01:57:18.141081 kernel: audit: type=1403 audit(1773107833.507:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
Mar 10 01:57:18.141104 systemd[1]: Successfully loaded SELinux policy in 277.172ms.
Mar 10 01:57:18.141132 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 39.177ms.
Mar 10 01:57:18.141150 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Mar 10 01:57:18.141166 systemd[1]: Detected virtualization kvm.
Mar 10 01:57:18.141183 systemd[1]: Detected architecture x86-64.
Mar 10 01:57:18.141198 systemd[1]: Detected first boot.
Mar 10 01:57:18.141215 systemd[1]: Initializing machine ID from VM UUID.
Mar 10 01:57:18.141236 zram_generator::config[1126]: No configuration found.
Mar 10 01:57:18.141254 kernel: Guest personality initialized and is inactive
Mar 10 01:57:18.141272 kernel: VMCI host device registered (name=vmci, major=10, minor=258)
Mar 10 01:57:18.141288 kernel: Initialized host personality
Mar 10 01:57:18.141303 kernel: NET: Registered PF_VSOCK protocol family
Mar 10 01:57:18.141319 systemd[1]: Populated /etc with preset unit settings.
Mar 10 01:57:18.141336 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully.
Mar 10 01:57:18.141352 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Mar 10 01:57:18.141367 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
Mar 10 01:57:18.141383 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Mar 10 01:57:18.141399 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Mar 10 01:57:18.141419 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Mar 10 01:57:18.141436 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Mar 10 01:57:18.141531 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Mar 10 01:57:18.141550 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Mar 10 01:57:18.141575 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Mar 10 01:57:18.141592 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Mar 10 01:57:18.141608 systemd[1]: Created slice user.slice - User and Session Slice.
Mar 10 01:57:18.141623 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Mar 10 01:57:18.141640 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Mar 10 01:57:18.143800 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Mar 10 01:57:18.143825 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Mar 10 01:57:18.143844 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Mar 10 01:57:18.143862 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Mar 10 01:57:18.143878 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0...
Mar 10 01:57:18.143894 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Mar 10 01:57:18.143910 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Mar 10 01:57:18.143931 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Mar 10 01:57:18.143948 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Mar 10 01:57:18.143964 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Mar 10 01:57:18.143980 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Mar 10 01:57:18.143997 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Mar 10 01:57:18.144013 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Mar 10 01:57:18.144029 systemd[1]: Reached target slices.target - Slice Units.
Mar 10 01:57:18.144049 systemd[1]: Reached target swap.target - Swaps.
Mar 10 01:57:18.144065 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Mar 10 01:57:18.144085 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Mar 10 01:57:18.144102 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption.
Mar 10 01:57:18.144118 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Mar 10 01:57:18.144134 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Mar 10 01:57:18.144150 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Mar 10 01:57:18.144167 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Mar 10 01:57:18.144183 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Mar 10 01:57:18.144198 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Mar 10 01:57:18.144214 systemd[1]: Mounting media.mount - External Media Directory...
Mar 10 01:57:18.144233 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 10 01:57:18.144249 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Mar 10 01:57:18.144264 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Mar 10 01:57:18.144280 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Mar 10 01:57:18.144303 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Mar 10 01:57:18.144319 systemd[1]: Reached target machines.target - Containers.
Mar 10 01:57:18.144334 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Mar 10 01:57:18.144351 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Mar 10 01:57:18.144370 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Mar 10 01:57:18.144386 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Mar 10 01:57:18.144405 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Mar 10 01:57:18.144421 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Mar 10 01:57:18.144436 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Mar 10 01:57:18.144527 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Mar 10 01:57:18.144547 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Mar 10 01:57:18.144563 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Mar 10 01:57:18.144579 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Mar 10 01:57:18.144600 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Mar 10 01:57:18.144615 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Mar 10 01:57:18.144631 systemd[1]: Stopped systemd-fsck-usr.service.
Mar 10 01:57:18.144647 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Mar 10 01:57:18.144711 systemd[1]: Starting systemd-journald.service - Journal Service...
Mar 10 01:57:18.144729 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Mar 10 01:57:18.144745 kernel: fuse: init (API version 7.41)
Mar 10 01:57:18.144761 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Mar 10 01:57:18.144778 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Mar 10 01:57:18.144798 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials...
Mar 10 01:57:18.144818 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Mar 10 01:57:18.144837 systemd[1]: verity-setup.service: Deactivated successfully.
Mar 10 01:57:18.144854 systemd[1]: Stopped verity-setup.service.
Mar 10 01:57:18.144871 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 10 01:57:18.144890 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Mar 10 01:57:18.144906 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Mar 10 01:57:18.144922 systemd[1]: Mounted media.mount - External Media Directory.
Mar 10 01:57:18.144968 systemd-journald[1211]: Collecting audit messages is disabled.
Mar 10 01:57:18.148791 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Mar 10 01:57:18.148824 systemd-journald[1211]: Journal started
Mar 10 01:57:18.148859 systemd-journald[1211]: Runtime Journal (/run/log/journal/53d474e57ce94272990f6cc9b859a4f4) is 6M, max 48.1M, 42.1M free.
Mar 10 01:57:16.278235 systemd[1]: Queued start job for default target multi-user.target.
Mar 10 01:57:16.312945 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6.
Mar 10 01:57:16.314422 systemd[1]: systemd-journald.service: Deactivated successfully.
Mar 10 01:57:16.316377 systemd[1]: systemd-journald.service: Consumed 1.537s CPU time.
Mar 10 01:57:18.168777 systemd[1]: Started systemd-journald.service - Journal Service.
Mar 10 01:57:18.182332 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Mar 10 01:57:18.191519 kernel: ACPI: bus type drm_connector registered
Mar 10 01:57:18.196217 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Mar 10 01:57:18.206546 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Mar 10 01:57:18.220604 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Mar 10 01:57:18.230329 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Mar 10 01:57:18.230950 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Mar 10 01:57:18.249600 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Mar 10 01:57:18.250188 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Mar 10 01:57:18.257890 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Mar 10 01:57:18.258567 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Mar 10 01:57:18.267292 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Mar 10 01:57:18.275706 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Mar 10 01:57:18.287891 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Mar 10 01:57:18.288235 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Mar 10 01:57:18.313418 systemd[1]: modprobe@drm.service: Deactivated successfully.
Mar 10 01:57:18.331923 kernel: loop: module loaded
Mar 10 01:57:18.315592 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Mar 10 01:57:18.336421 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Mar 10 01:57:18.350416 systemd[1]: modprobe@loop.service: Deactivated successfully.
Mar 10 01:57:18.352632 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Mar 10 01:57:18.370207 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials.
Mar 10 01:57:18.387318 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Mar 10 01:57:18.428088 systemd[1]: Reached target network-pre.target - Preparation for Network.
Mar 10 01:57:18.440881 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Mar 10 01:57:18.457073 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Mar 10 01:57:18.468916 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Mar 10 01:57:18.469018 systemd[1]: Reached target local-fs.target - Local File Systems.
Mar 10 01:57:18.486789 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management.
Mar 10 01:57:18.508796 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Mar 10 01:57:18.521004 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 10 01:57:18.548739 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Mar 10 01:57:18.571726 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Mar 10 01:57:18.583940 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Mar 10 01:57:18.607075 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Mar 10 01:57:18.617991 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Mar 10 01:57:18.624700 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Mar 10 01:57:18.658213 systemd-journald[1211]: Time spent on flushing to /var/log/journal/53d474e57ce94272990f6cc9b859a4f4 is 65.242ms for 1070 entries.
Mar 10 01:57:18.658213 systemd-journald[1211]: System Journal (/var/log/journal/53d474e57ce94272990f6cc9b859a4f4) is 8M, max 195.6M, 187.6M free.
Mar 10 01:57:18.751724 systemd-journald[1211]: Received client request to flush runtime journal.
Mar 10 01:57:18.751779 kernel: loop0: detected capacity change from 0 to 110984
Mar 10 01:57:18.647104 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Mar 10 01:57:18.675832 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Mar 10 01:57:18.696612 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Mar 10 01:57:18.708968 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Mar 10 01:57:18.715637 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Mar 10 01:57:18.729009 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Mar 10 01:57:18.737744 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk...
Mar 10 01:57:18.749588 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Mar 10 01:57:18.757435 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Mar 10 01:57:18.783650 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
Mar 10 01:57:18.790601 systemd-tmpfiles[1247]: ACLs are not supported, ignoring.
Mar 10 01:57:18.790628 systemd-tmpfiles[1247]: ACLs are not supported, ignoring.
Mar 10 01:57:18.800041 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Mar 10 01:57:18.818227 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Mar 10 01:57:18.840576 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Mar 10 01:57:18.842203 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk.
Mar 10 01:57:18.850081 kernel: loop1: detected capacity change from 0 to 128560
Mar 10 01:57:18.923587 kernel: loop2: detected capacity change from 0 to 219192
Mar 10 01:57:18.941261 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Mar 10 01:57:18.958771 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Mar 10 01:57:19.026966 systemd-tmpfiles[1269]: ACLs are not supported, ignoring.
Mar 10 01:57:19.031399 kernel: loop3: detected capacity change from 0 to 110984
Mar 10 01:57:19.026995 systemd-tmpfiles[1269]: ACLs are not supported, ignoring.
Mar 10 01:57:19.045943 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Mar 10 01:57:19.110894 kernel: loop4: detected capacity change from 0 to 128560
Mar 10 01:57:19.165561 kernel: loop5: detected capacity change from 0 to 219192
Mar 10 01:57:19.209801 (sd-merge)[1272]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes'.
Mar 10 01:57:19.210879 (sd-merge)[1272]: Merged extensions into '/usr'.
Mar 10 01:57:19.226139 systemd[1]: Reload requested from client PID 1246 ('systemd-sysext') (unit systemd-sysext.service)...
Mar 10 01:57:19.226191 systemd[1]: Reloading...
Mar 10 01:57:19.343718 zram_generator::config[1305]: No configuration found.
Mar 10 01:57:19.612813 ldconfig[1241]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
Mar 10 01:57:19.624977 systemd[1]: Reloading finished in 397 ms.
Mar 10 01:57:19.667085 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
Mar 10 01:57:19.672712 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Mar 10 01:57:19.682746 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Mar 10 01:57:19.725256 systemd[1]: Starting ensure-sysext.service...
Mar 10 01:57:19.731861 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Mar 10 01:57:19.737599 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Mar 10 01:57:19.756897 systemd[1]: Reload requested from client PID 1337 ('systemctl') (unit ensure-sysext.service)...
Mar 10 01:57:19.756948 systemd[1]: Reloading...
Mar 10 01:57:19.757642 systemd-tmpfiles[1338]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring.
Mar 10 01:57:19.757769 systemd-tmpfiles[1338]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring.
Mar 10 01:57:19.758216 systemd-tmpfiles[1338]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Mar 10 01:57:19.758764 systemd-tmpfiles[1338]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Mar 10 01:57:19.760427 systemd-tmpfiles[1338]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Mar 10 01:57:19.761196 systemd-tmpfiles[1338]: ACLs are not supported, ignoring.
Mar 10 01:57:19.762167 systemd-tmpfiles[1338]: ACLs are not supported, ignoring.
Mar 10 01:57:19.770346 systemd-tmpfiles[1338]: Detected autofs mount point /boot during canonicalization of boot.
Mar 10 01:57:19.770362 systemd-tmpfiles[1338]: Skipping /boot
Mar 10 01:57:19.785079 systemd-tmpfiles[1338]: Detected autofs mount point /boot during canonicalization of boot.
Mar 10 01:57:19.785116 systemd-tmpfiles[1338]: Skipping /boot
Mar 10 01:57:19.807427 systemd-udevd[1339]: Using default interface naming scheme 'v255'.
Mar 10 01:57:19.843560 zram_generator::config[1365]: No configuration found.
Mar 10 01:57:19.999528 kernel: mousedev: PS/2 mouse device common for all mice
Mar 10 01:57:20.039911 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input3
Mar 10 01:57:20.049542 kernel: ACPI: button: Power Button [PWRF]
Mar 10 01:57:20.071564 kernel: i801_smbus 0000:00:1f.3: Enabling SMBus device
Mar 10 01:57:20.072112 kernel: i801_smbus 0000:00:1f.3: SMBus using PCI interrupt
Mar 10 01:57:20.073863 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Mar 10 01:57:20.189736 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped.
Mar 10 01:57:20.190332 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM.
Mar 10 01:57:20.195614 systemd[1]: Reloading finished in 438 ms.
Mar 10 01:57:20.207875 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Mar 10 01:57:20.227097 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Mar 10 01:57:20.349308 systemd[1]: Finished ensure-sysext.service.
Mar 10 01:57:20.370787 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 10 01:57:20.374256 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Mar 10 01:57:20.456872 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Mar 10 01:57:20.461349 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Mar 10 01:57:20.463716 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Mar 10 01:57:20.472427 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Mar 10 01:57:20.482853 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Mar 10 01:57:20.495767 kernel: kvm_amd: TSC scaling supported
Mar 10 01:57:20.495839 kernel: kvm_amd: Nested Virtualization enabled
Mar 10 01:57:20.495881 kernel: kvm_amd: Nested Paging enabled
Mar 10 01:57:20.496527 kernel: kvm_amd: Virtual VMLOAD VMSAVE supported
Mar 10 01:57:20.496575 kernel: kvm_amd: PMU virtualization is disabled
Mar 10 01:57:20.499990 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Mar 10 01:57:20.519775 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 10 01:57:20.534780 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM...
Mar 10 01:57:20.540891 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Mar 10 01:57:20.546105 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Mar 10 01:57:20.595574 augenrules[1475]: No rules
Mar 10 01:57:20.595754 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Mar 10 01:57:20.607429 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Mar 10 01:57:20.619074 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization...
Mar 10 01:57:20.626865 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Mar 10 01:57:20.629845 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 10 01:57:20.630210 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 10 01:57:20.632042 systemd[1]: audit-rules.service: Deactivated successfully.
Mar 10 01:57:20.632432 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Mar 10 01:57:20.639252 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Mar 10 01:57:20.639728 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Mar 10 01:57:20.640274 systemd[1]: modprobe@drm.service: Deactivated successfully.
Mar 10 01:57:20.640609 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Mar 10 01:57:20.655789 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
Mar 10 01:57:20.662728 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Mar 10 01:57:20.663885 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Mar 10 01:57:20.670893 systemd[1]: modprobe@loop.service: Deactivated successfully.
Mar 10 01:57:20.671437 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Mar 10 01:57:20.676554 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM.
Mar 10 01:57:20.686596 kernel: EDAC MC: Ver: 3.0.0
Mar 10 01:57:20.688069 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
Mar 10 01:57:20.708164 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Mar 10 01:57:20.708435 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Mar 10 01:57:20.711372 systemd[1]: Starting systemd-update-done.service - Update is Completed...
Mar 10 01:57:20.721156 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Mar 10 01:57:20.721302 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Mar 10 01:57:20.724738 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
Mar 10 01:57:20.766315 systemd[1]: Finished systemd-update-done.service - Update is Completed.
Mar 10 01:57:20.828181 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Mar 10 01:57:20.847646 systemd[1]: Started systemd-userdbd.service - User Database Manager.
Mar 10 01:57:21.004742 systemd-networkd[1476]: lo: Link UP
Mar 10 01:57:21.004756 systemd-networkd[1476]: lo: Gained carrier
Mar 10 01:57:21.007747 systemd-networkd[1476]: Enumeration completed
Mar 10 01:57:21.007908 systemd[1]: Started systemd-networkd.service - Network Configuration.
Mar 10 01:57:21.009216 systemd-networkd[1476]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 10 01:57:21.009258 systemd-networkd[1476]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Mar 10 01:57:21.016068 systemd-networkd[1476]: eth0: Link UP
Mar 10 01:57:21.016327 systemd-networkd[1476]: eth0: Gained carrier
Mar 10 01:57:21.016351 systemd-networkd[1476]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 10 01:57:21.017391 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd...
Mar 10 01:57:21.018702 systemd-resolved[1481]: Positive Trust Anchors:
Mar 10 01:57:21.018744 systemd-resolved[1481]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Mar 10 01:57:21.018789 systemd-resolved[1481]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Mar 10 01:57:21.026812 systemd-resolved[1481]: Defaulting to hostname 'linux'.
Mar 10 01:57:21.029709 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
Mar 10 01:57:21.039585 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Mar 10 01:57:21.044275 systemd[1]: Reached target network.target - Network.
Mar 10 01:57:21.044629 systemd-networkd[1476]: eth0: DHCPv4 address 10.0.0.74/16, gateway 10.0.0.1 acquired from 10.0.0.1
Mar 10 01:57:21.047743 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Mar 10 01:57:21.061074 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization.
Mar 10 01:57:21.064638 systemd-timesyncd[1483]: Contacted time server 10.0.0.1:123 (10.0.0.1).
Mar 10 01:57:21.064749 systemd-timesyncd[1483]: Initial clock synchronization to Tue 2026-03-10 01:57:21.374401 UTC.
Mar 10 01:57:21.067414 systemd[1]: Reached target sysinit.target - System Initialization.
Mar 10 01:57:21.073814 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
Mar 10 01:57:21.079114 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
Mar 10 01:57:21.085191 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer.
Mar 10 01:57:21.090988 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
Mar 10 01:57:21.097032 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
Mar 10 01:57:21.097285 systemd[1]: Reached target paths.target - Path Units.
Mar 10 01:57:21.102269 systemd[1]: Reached target time-set.target - System Time Set.
Mar 10 01:57:21.105859 systemd[1]: Started logrotate.timer - Daily rotation of log files.
Mar 10 01:57:21.109552 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
Mar 10 01:57:21.113936 systemd[1]: Reached target timers.target - Timer Units.
Mar 10 01:57:21.119556 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket.
Mar 10 01:57:21.129268 systemd[1]: Starting docker.socket - Docker Socket for the API...
Mar 10 01:57:21.143158 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local).
Mar 10 01:57:21.149766 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK).
Mar 10 01:57:21.155966 systemd[1]: Reached target ssh-access.target - SSH Access Available.
Mar 10 01:57:21.167162 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket.
Mar 10 01:57:21.172280 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket.
Mar 10 01:57:21.181378 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd.
Mar 10 01:57:21.187590 systemd[1]: Listening on docker.socket - Docker Socket for the API.
Mar 10 01:57:21.196127 systemd[1]: Reached target sockets.target - Socket Units.
Mar 10 01:57:21.200781 systemd[1]: Reached target basic.target - Basic System.
Mar 10 01:57:21.205241 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met.
Mar 10 01:57:21.205377 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met.
Mar 10 01:57:21.211622 systemd[1]: Starting containerd.service - containerd container runtime...
Mar 10 01:57:21.221042 systemd[1]: Starting dbus.service - D-Bus System Message Bus...
Mar 10 01:57:21.240508 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
Mar 10 01:57:21.248410 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit...
Mar 10 01:57:21.255907 systemd[1]: Starting extend-filesystems.service - Extend Filesystems...
Mar 10 01:57:21.260260 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment).
Mar 10 01:57:21.262927 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh...
Mar 10 01:57:21.263823 jq[1522]: false
Mar 10 01:57:21.269747 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd...
Mar 10 01:57:21.277363 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin...
Mar 10 01:57:21.287733 extend-filesystems[1523]: Found /dev/vda6
Mar 10 01:57:21.296428 extend-filesystems[1523]: Found /dev/vda9
Mar 10 01:57:21.287890 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline...
Mar 10 01:57:21.297211 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys...
Mar 10 01:57:21.302590 oslogin_cache_refresh[1524]: Refreshing passwd entry cache
Mar 10 01:57:21.305190 google_oslogin_nss_cache[1524]: oslogin_cache_refresh[1524]: Refreshing passwd entry cache
Mar 10 01:57:21.309985 extend-filesystems[1523]: Checking size of /dev/vda9
Mar 10 01:57:21.312886 systemd[1]: Starting systemd-logind.service - User Login Management...
Mar 10 01:57:21.324863 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0).
Mar 10 01:57:21.325145 oslogin_cache_refresh[1524]: Failure getting users, quitting
Mar 10 01:57:21.326821 google_oslogin_nss_cache[1524]: oslogin_cache_refresh[1524]: Failure getting users, quitting
Mar 10 01:57:21.326821 google_oslogin_nss_cache[1524]: oslogin_cache_refresh[1524]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak.
Mar 10 01:57:21.326821 google_oslogin_nss_cache[1524]: oslogin_cache_refresh[1524]: Refreshing group entry cache
Mar 10 01:57:21.325175 oslogin_cache_refresh[1524]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak.
Mar 10 01:57:21.325233 oslogin_cache_refresh[1524]: Refreshing group entry cache
Mar 10 01:57:21.327002 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details.
Mar 10 01:57:21.330171 systemd[1]: Starting update-engine.service - Update Engine...
Mar 10 01:57:21.342217 google_oslogin_nss_cache[1524]: oslogin_cache_refresh[1524]: Failure getting groups, quitting
Mar 10 01:57:21.343916 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition...
Mar 10 01:57:21.345646 google_oslogin_nss_cache[1524]: oslogin_cache_refresh[1524]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak.
Mar 10 01:57:21.343981 oslogin_cache_refresh[1524]: Failure getting groups, quitting
Mar 10 01:57:21.344007 oslogin_cache_refresh[1524]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak.
Mar 10 01:57:21.347440 extend-filesystems[1523]: Resized partition /dev/vda9
Mar 10 01:57:21.352230 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown.
Mar 10 01:57:21.358184 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'.
Mar 10 01:57:21.363543 extend-filesystems[1550]: resize2fs 1.47.3 (8-Jul-2025)
Mar 10 01:57:21.365349 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped.
Mar 10 01:57:21.365977 systemd[1]: google-oslogin-cache.service: Deactivated successfully.
Mar 10 01:57:21.366314 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh.
Mar 10 01:57:21.373742 systemd[1]: motdgen.service: Deactivated successfully.
Mar 10 01:57:21.374108 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd.
Mar 10 01:57:21.377016 jq[1548]: true
Mar 10 01:57:21.380540 kernel: EXT4-fs (vda9): resizing filesystem from 553472 to 1864699 blocks
Mar 10 01:57:21.389127 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully.
Mar 10 01:57:21.389555 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline.
Mar 10 01:57:21.408853 update_engine[1544]: I20260310 01:57:21.408716 1544 main.cc:92] Flatcar Update Engine starting
Mar 10 01:57:21.417560 (ntainerd)[1555]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR
Mar 10 01:57:21.428574 jq[1554]: true
Mar 10 01:57:21.450644 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd.
Mar 10 01:57:21.478875 tar[1551]: linux-amd64/LICENSE
Mar 10 01:57:21.479285 tar[1551]: linux-amd64/helm
Mar 10 01:57:21.484626 kernel: EXT4-fs (vda9): resized filesystem to 1864699
Mar 10 01:57:21.499079 dbus-daemon[1520]: [system] SELinux support is enabled
Mar 10 01:57:21.502836 update_engine[1544]: I20260310 01:57:21.501973 1544 update_check_scheduler.cc:74] Next update check in 5m31s
Mar 10 01:57:21.504246 systemd[1]: Started dbus.service - D-Bus System Message Bus.
Mar 10 01:57:21.512517 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml).
Mar 10 01:57:21.512565 systemd[1]: Reached target system-config.target - Load system-provided cloud configs.
Mar 10 01:57:21.519288 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url).
Mar 10 01:57:21.519322 systemd[1]: Reached target user-config.target - Load user-provided cloud configs.
Mar 10 01:57:21.524365 systemd-logind[1537]: Watching system buttons on /dev/input/event2 (Power Button)
Mar 10 01:57:21.524439 systemd-logind[1537]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard)
Mar 10 01:57:21.528910 systemd[1]: Started update-engine.service - Update Engine.
Mar 10 01:57:21.529269 systemd-logind[1537]: New seat seat0.
Mar 10 01:57:21.533350 systemd[1]: Started systemd-logind.service - User Login Management.
Mar 10 01:57:21.536120 extend-filesystems[1550]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required
Mar 10 01:57:21.536120 extend-filesystems[1550]: old_desc_blocks = 1, new_desc_blocks = 1
Mar 10 01:57:21.536120 extend-filesystems[1550]: The filesystem on /dev/vda9 is now 1864699 (4k) blocks long.
Mar 10 01:57:21.564183 extend-filesystems[1523]: Resized filesystem in /dev/vda9
Mar 10 01:57:21.539167 systemd[1]: extend-filesystems.service: Deactivated successfully.
Mar 10 01:57:21.539607 systemd[1]: Finished extend-filesystems.service - Extend Filesystems.
Mar 10 01:57:21.567012 systemd[1]: Started locksmithd.service - Cluster reboot manager.
Mar 10 01:57:21.581213 bash[1582]: Updated "/home/core/.ssh/authorized_keys"
Mar 10 01:57:21.585524 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition.
Mar 10 01:57:21.601077 systemd[1]: sshkeys.service was skipped because no trigger condition checks were met.
Mar 10 01:57:21.973888 locksmithd[1585]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot"
Mar 10 01:57:22.098538 sshd_keygen[1545]: ssh-keygen: generating new host keys: RSA ECDSA ED25519
Mar 10 01:57:22.161410 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys.
Mar 10 01:57:22.186443 systemd[1]: Starting issuegen.service - Generate /run/issue...
Mar 10 01:57:22.192630 systemd[1]: Started sshd@0-10.0.0.74:22-10.0.0.1:39938.service - OpenSSH per-connection server daemon (10.0.0.1:39938).
Mar 10 01:57:22.226892 systemd[1]: issuegen.service: Deactivated successfully.
Mar 10 01:57:22.227402 systemd[1]: Finished issuegen.service - Generate /run/issue.
Mar 10 01:57:22.238841 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions...
Mar 10 01:57:22.388778 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions.
Mar 10 01:57:22.400061 systemd[1]: Started getty@tty1.service - Getty on tty1.
Mar 10 01:57:22.409102 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0.
Mar 10 01:57:22.417403 systemd[1]: Reached target getty.target - Login Prompts.
Mar 10 01:57:22.522940 sshd[1605]: Accepted publickey for core from 10.0.0.1 port 39938 ssh2: RSA SHA256:d2FUFdel+KP9pqsSrlp8nTsY/4RJTtu7ZDVkbTKQqjY
Mar 10 01:57:22.548632 sshd-session[1605]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 10 01:57:22.589153 containerd[1555]: time="2026-03-10T01:57:22Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8
Mar 10 01:57:22.605915 containerd[1555]: time="2026-03-10T01:57:22.605548506Z" level=info msg="starting containerd" revision=4ac6c20c7bbf8177f29e46bbdc658fec02ffb8ad version=v2.0.7
Mar 10 01:57:22.615774 systemd[1]: Created slice user-500.slice - User Slice of UID 500.
Mar 10 01:57:22.626055 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500...
Mar 10 01:57:22.634614 containerd[1555]: time="2026-03-10T01:57:22.632859237Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="293.169µs"
Mar 10 01:57:22.634614 containerd[1555]: time="2026-03-10T01:57:22.632939161Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1
Mar 10 01:57:22.634614 containerd[1555]: time="2026-03-10T01:57:22.632967810Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1
Mar 10 01:57:22.634614 containerd[1555]: time="2026-03-10T01:57:22.633656052Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1
Mar 10 01:57:22.634614 containerd[1555]: time="2026-03-10T01:57:22.633687084Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1
Mar 10 01:57:22.634614 containerd[1555]: time="2026-03-10T01:57:22.633724674Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
Mar 10 01:57:22.634614 containerd[1555]: time="2026-03-10T01:57:22.633869058Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
Mar 10 01:57:22.634614 containerd[1555]: time="2026-03-10T01:57:22.633887727Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Mar 10 01:57:22.634614 containerd[1555]: time="2026-03-10T01:57:22.634342148Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Mar 10 01:57:22.634614 containerd[1555]: time="2026-03-10T01:57:22.634365657Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
Mar 10 01:57:22.634614 containerd[1555]: time="2026-03-10T01:57:22.634382162Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
Mar 10 01:57:22.634614 containerd[1555]: time="2026-03-10T01:57:22.634400291Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1
Mar 10 01:57:22.635005 containerd[1555]: time="2026-03-10T01:57:22.634677662Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1
Mar 10 01:57:22.635231 containerd[1555]: time="2026-03-10T01:57:22.635129960Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
Mar 10 01:57:22.635277 containerd[1555]: time="2026-03-10T01:57:22.635221915Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
Mar 10 01:57:22.635277 containerd[1555]: time="2026-03-10T01:57:22.635245183Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1
Mar 10 01:57:22.635405 containerd[1555]: time="2026-03-10T01:57:22.635385873Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1
Mar 10 01:57:22.636259 containerd[1555]: time="2026-03-10T01:57:22.636163930Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1
Mar 10 01:57:22.637324 containerd[1555]: time="2026-03-10T01:57:22.636341949Z" level=info msg="metadata content store policy set" policy=shared
Mar 10 01:57:22.650618 containerd[1555]: time="2026-03-10T01:57:22.650171564Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1
Mar 10 01:57:22.650618 containerd[1555]: time="2026-03-10T01:57:22.650233723Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1
Mar 10 01:57:22.650618 containerd[1555]: time="2026-03-10T01:57:22.650248729Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1
Mar 10 01:57:22.650618 containerd[1555]: time="2026-03-10T01:57:22.650415737Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1
Mar 10 01:57:22.650618 containerd[1555]: time="2026-03-10T01:57:22.650441015Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1
Mar 10 01:57:22.650618 containerd[1555]: time="2026-03-10T01:57:22.650455355Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1
Mar 10 01:57:22.650618 containerd[1555]: time="2026-03-10T01:57:22.650470747Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1
Mar 10 01:57:22.650618 containerd[1555]: time="2026-03-10T01:57:22.650529077Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1
Mar 10 01:57:22.650618 containerd[1555]: time="2026-03-10T01:57:22.650544364Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1
Mar 10 01:57:22.650618 containerd[1555]: time="2026-03-10T01:57:22.650555343Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1
Mar 10 01:57:22.650618 containerd[1555]: time="2026-03-10T01:57:22.650565199Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1
Mar 10 01:57:22.650618 containerd[1555]: time="2026-03-10T01:57:22.650580060Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task
type=io.containerd.runtime.v2 Mar 10 01:57:22.649694 systemd-logind[1537]: New session 1 of user core. Mar 10 01:57:22.651428 containerd[1555]: time="2026-03-10T01:57:22.650879368Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Mar 10 01:57:22.651428 containerd[1555]: time="2026-03-10T01:57:22.650911535Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Mar 10 01:57:22.651428 containerd[1555]: time="2026-03-10T01:57:22.650983248Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Mar 10 01:57:22.651428 containerd[1555]: time="2026-03-10T01:57:22.651005446Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Mar 10 01:57:22.651428 containerd[1555]: time="2026-03-10T01:57:22.651020453Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Mar 10 01:57:22.651428 containerd[1555]: time="2026-03-10T01:57:22.651036884Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Mar 10 01:57:22.651428 containerd[1555]: time="2026-03-10T01:57:22.651056013Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Mar 10 01:57:22.651428 containerd[1555]: time="2026-03-10T01:57:22.651070894Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Mar 10 01:57:22.651428 containerd[1555]: time="2026-03-10T01:57:22.651085973Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Mar 10 01:57:22.651428 containerd[1555]: time="2026-03-10T01:57:22.651196515Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Mar 10 01:57:22.651428 containerd[1555]: time="2026-03-10T01:57:22.651241492Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images 
type=io.containerd.cri.v1 Mar 10 01:57:22.651428 containerd[1555]: time="2026-03-10T01:57:22.651356580Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Mar 10 01:57:22.651428 containerd[1555]: time="2026-03-10T01:57:22.651388748Z" level=info msg="Start snapshots syncer" Mar 10 01:57:22.652153 containerd[1555]: time="2026-03-10T01:57:22.651554319Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Mar 10 01:57:22.652153 containerd[1555]: time="2026-03-10T01:57:22.652074428Z" level=info msg="starting cri plugin" config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiS
pecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Mar 10 01:57:22.652368 containerd[1555]: time="2026-03-10T01:57:22.652181378Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Mar 10 01:57:22.652368 containerd[1555]: time="2026-03-10T01:57:22.652244267Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Mar 10 01:57:22.652428 containerd[1555]: time="2026-03-10T01:57:22.652400264Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Mar 10 01:57:22.652428 containerd[1555]: time="2026-03-10T01:57:22.652422087Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Mar 10 01:57:22.652566 containerd[1555]: time="2026-03-10T01:57:22.652432504Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Mar 10 01:57:22.652566 containerd[1555]: time="2026-03-10T01:57:22.652441610Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Mar 10 01:57:22.652566 containerd[1555]: time="2026-03-10T01:57:22.652455210Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Mar 10 01:57:22.652566 containerd[1555]: time="2026-03-10T01:57:22.652464765Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Mar 10 01:57:22.652566 containerd[1555]: time="2026-03-10T01:57:22.652474432Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Mar 10 01:57:22.652777 
containerd[1555]: time="2026-03-10T01:57:22.652729262Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Mar 10 01:57:22.652815 containerd[1555]: time="2026-03-10T01:57:22.652776832Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Mar 10 01:57:22.652815 containerd[1555]: time="2026-03-10T01:57:22.652794242Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Mar 10 01:57:22.652937 containerd[1555]: time="2026-03-10T01:57:22.652895000Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Mar 10 01:57:22.652937 containerd[1555]: time="2026-03-10T01:57:22.652933662Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Mar 10 01:57:22.653048 containerd[1555]: time="2026-03-10T01:57:22.652942903Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Mar 10 01:57:22.653048 containerd[1555]: time="2026-03-10T01:57:22.652952061Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Mar 10 01:57:22.653048 containerd[1555]: time="2026-03-10T01:57:22.652959179Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Mar 10 01:57:22.653048 containerd[1555]: time="2026-03-10T01:57:22.652968264Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Mar 10 01:57:22.653048 containerd[1555]: time="2026-03-10T01:57:22.652986976Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Mar 10 01:57:22.653192 containerd[1555]: time="2026-03-10T01:57:22.653095434Z" level=info msg="runtime 
interface created" Mar 10 01:57:22.653192 containerd[1555]: time="2026-03-10T01:57:22.653105727Z" level=info msg="created NRI interface" Mar 10 01:57:22.653192 containerd[1555]: time="2026-03-10T01:57:22.653115270Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Mar 10 01:57:22.653192 containerd[1555]: time="2026-03-10T01:57:22.653149373Z" level=info msg="Connect containerd service" Mar 10 01:57:22.655806 containerd[1555]: time="2026-03-10T01:57:22.655705039Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Mar 10 01:57:22.657471 containerd[1555]: time="2026-03-10T01:57:22.657388571Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Mar 10 01:57:22.692670 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Mar 10 01:57:22.704768 systemd[1]: Starting user@500.service - User Manager for UID 500... Mar 10 01:57:22.731364 (systemd)[1622]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Mar 10 01:57:22.736360 systemd-logind[1537]: New session c1 of user core. Mar 10 01:57:22.973963 systemd-networkd[1476]: eth0: Gained IPv6LL Mar 10 01:57:22.982472 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Mar 10 01:57:22.994383 systemd[1]: Reached target network-online.target - Network is Online. Mar 10 01:57:23.005834 systemd[1]: Starting coreos-metadata.service - QEMU metadata agent... Mar 10 01:57:23.149772 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 10 01:57:23.173191 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Mar 10 01:57:23.324079 systemd[1622]: Queued start job for default target default.target. 
Mar 10 01:57:23.331356 systemd[1622]: Created slice app.slice - User Application Slice. Mar 10 01:57:23.331433 systemd[1622]: Reached target paths.target - Paths. Mar 10 01:57:23.331597 systemd[1622]: Reached target timers.target - Timers. Mar 10 01:57:23.337662 systemd[1622]: Starting dbus.socket - D-Bus User Message Bus Socket... Mar 10 01:57:23.344641 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Mar 10 01:57:23.361639 systemd[1]: coreos-metadata.service: Deactivated successfully. Mar 10 01:57:23.362021 systemd[1]: Finished coreos-metadata.service - QEMU metadata agent. Mar 10 01:57:23.427382 kernel: hrtimer: interrupt took 5297270 ns Mar 10 01:57:23.470692 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Mar 10 01:57:23.473255 systemd[1622]: Listening on dbus.socket - D-Bus User Message Bus Socket. Mar 10 01:57:23.475306 systemd[1622]: Reached target sockets.target - Sockets. Mar 10 01:57:23.475683 systemd[1622]: Reached target basic.target - Basic System. Mar 10 01:57:23.475760 systemd[1622]: Reached target default.target - Main User Target. Mar 10 01:57:23.475815 systemd[1622]: Startup finished in 607ms. Mar 10 01:57:23.478690 systemd[1]: Started user@500.service - User Manager for UID 500. 
Mar 10 01:57:23.488151 containerd[1555]: time="2026-03-10T01:57:23.488025371Z" level=info msg="Start subscribing containerd event" Mar 10 01:57:23.488238 containerd[1555]: time="2026-03-10T01:57:23.488154994Z" level=info msg="Start recovering state" Mar 10 01:57:23.489268 containerd[1555]: time="2026-03-10T01:57:23.488937290Z" level=info msg="Start event monitor" Mar 10 01:57:23.489373 containerd[1555]: time="2026-03-10T01:57:23.489331059Z" level=info msg="Start cni network conf syncer for default" Mar 10 01:57:23.489768 containerd[1555]: time="2026-03-10T01:57:23.489707095Z" level=info msg="Start streaming server" Mar 10 01:57:23.490777 containerd[1555]: time="2026-03-10T01:57:23.490716196Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Mar 10 01:57:23.490871 containerd[1555]: time="2026-03-10T01:57:23.490816815Z" level=info msg=serving... address=/run/containerd/containerd.sock Mar 10 01:57:23.493032 containerd[1555]: time="2026-03-10T01:57:23.492972274Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Mar 10 01:57:23.493032 containerd[1555]: time="2026-03-10T01:57:23.493010434Z" level=info msg="runtime interface starting up..." Mar 10 01:57:23.493032 containerd[1555]: time="2026-03-10T01:57:23.493018835Z" level=info msg="starting plugins..." Mar 10 01:57:23.493203 containerd[1555]: time="2026-03-10T01:57:23.493059604Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Mar 10 01:57:23.493357 containerd[1555]: time="2026-03-10T01:57:23.493294436Z" level=info msg="containerd successfully booted in 0.941336s" Mar 10 01:57:23.494696 systemd[1]: Started session-1.scope - Session 1 of User core. Mar 10 01:57:23.499734 systemd[1]: Started containerd.service - containerd container runtime. Mar 10 01:57:23.512353 tar[1551]: linux-amd64/README.md Mar 10 01:57:23.543053 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. 
Mar 10 01:57:23.562285 systemd[1]: Started sshd@1-10.0.0.74:22-10.0.0.1:39954.service - OpenSSH per-connection server daemon (10.0.0.1:39954). Mar 10 01:57:23.649943 sshd[1664]: Accepted publickey for core from 10.0.0.1 port 39954 ssh2: RSA SHA256:d2FUFdel+KP9pqsSrlp8nTsY/4RJTtu7ZDVkbTKQqjY Mar 10 01:57:23.652014 sshd-session[1664]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 10 01:57:23.665377 systemd-logind[1537]: New session 2 of user core. Mar 10 01:57:23.680848 systemd[1]: Started session-2.scope - Session 2 of User core. Mar 10 01:57:23.721128 sshd[1667]: Connection closed by 10.0.0.1 port 39954 Mar 10 01:57:23.723163 sshd-session[1664]: pam_unix(sshd:session): session closed for user core Mar 10 01:57:23.735819 systemd[1]: sshd@1-10.0.0.74:22-10.0.0.1:39954.service: Deactivated successfully. Mar 10 01:57:23.739819 systemd[1]: session-2.scope: Deactivated successfully. Mar 10 01:57:23.741799 systemd-logind[1537]: Session 2 logged out. Waiting for processes to exit. Mar 10 01:57:23.747867 systemd[1]: Started sshd@2-10.0.0.74:22-10.0.0.1:39956.service - OpenSSH per-connection server daemon (10.0.0.1:39956). Mar 10 01:57:23.755948 systemd-logind[1537]: Removed session 2. Mar 10 01:57:23.837766 sshd[1673]: Accepted publickey for core from 10.0.0.1 port 39956 ssh2: RSA SHA256:d2FUFdel+KP9pqsSrlp8nTsY/4RJTtu7ZDVkbTKQqjY Mar 10 01:57:23.840083 sshd-session[1673]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 10 01:57:23.848216 systemd-logind[1537]: New session 3 of user core. Mar 10 01:57:23.859825 systemd[1]: Started session-3.scope - Session 3 of User core. Mar 10 01:57:23.889335 sshd[1676]: Connection closed by 10.0.0.1 port 39956 Mar 10 01:57:23.889078 sshd-session[1673]: pam_unix(sshd:session): session closed for user core Mar 10 01:57:23.897682 systemd[1]: sshd@2-10.0.0.74:22-10.0.0.1:39956.service: Deactivated successfully. 
Mar 10 01:57:23.901109 systemd[1]: session-3.scope: Deactivated successfully. Mar 10 01:57:23.902934 systemd-logind[1537]: Session 3 logged out. Waiting for processes to exit. Mar 10 01:57:23.905345 systemd-logind[1537]: Removed session 3. Mar 10 01:57:25.493238 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 10 01:57:25.501985 systemd[1]: Reached target multi-user.target - Multi-User System. Mar 10 01:57:25.502419 (kubelet)[1685]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 10 01:57:25.514885 systemd[1]: Startup finished in 9.705s (kernel) + 23.251s (initrd) + 12.280s (userspace) = 45.238s. Mar 10 01:57:27.568867 kubelet[1685]: E0310 01:57:27.568340 1685 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 10 01:57:27.575927 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 10 01:57:27.576209 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 10 01:57:27.578978 systemd[1]: kubelet.service: Consumed 2.632s CPU time, 259.3M memory peak. Mar 10 01:57:34.133233 systemd[1]: Started sshd@3-10.0.0.74:22-10.0.0.1:39614.service - OpenSSH per-connection server daemon (10.0.0.1:39614). Mar 10 01:57:34.305980 sshd[1700]: Accepted publickey for core from 10.0.0.1 port 39614 ssh2: RSA SHA256:d2FUFdel+KP9pqsSrlp8nTsY/4RJTtu7ZDVkbTKQqjY Mar 10 01:57:34.310782 sshd-session[1700]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 10 01:57:34.342842 systemd-logind[1537]: New session 4 of user core. Mar 10 01:57:34.360975 systemd[1]: Started session-4.scope - Session 4 of User core. 
Mar 10 01:57:34.416153 sshd[1703]: Connection closed by 10.0.0.1 port 39614 Mar 10 01:57:34.417003 sshd-session[1700]: pam_unix(sshd:session): session closed for user core Mar 10 01:57:34.438988 systemd[1]: sshd@3-10.0.0.74:22-10.0.0.1:39614.service: Deactivated successfully. Mar 10 01:57:34.455411 systemd[1]: session-4.scope: Deactivated successfully. Mar 10 01:57:34.460608 systemd-logind[1537]: Session 4 logged out. Waiting for processes to exit. Mar 10 01:57:34.468886 systemd[1]: Started sshd@4-10.0.0.74:22-10.0.0.1:39630.service - OpenSSH per-connection server daemon (10.0.0.1:39630). Mar 10 01:57:34.476793 systemd-logind[1537]: Removed session 4. Mar 10 01:57:34.647253 sshd[1709]: Accepted publickey for core from 10.0.0.1 port 39630 ssh2: RSA SHA256:d2FUFdel+KP9pqsSrlp8nTsY/4RJTtu7ZDVkbTKQqjY Mar 10 01:57:34.648331 sshd-session[1709]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 10 01:57:34.686395 systemd-logind[1537]: New session 5 of user core. Mar 10 01:57:34.698949 systemd[1]: Started session-5.scope - Session 5 of User core. Mar 10 01:57:34.751647 sshd[1712]: Connection closed by 10.0.0.1 port 39630 Mar 10 01:57:34.747688 sshd-session[1709]: pam_unix(sshd:session): session closed for user core Mar 10 01:57:34.783617 systemd[1]: sshd@4-10.0.0.74:22-10.0.0.1:39630.service: Deactivated successfully. Mar 10 01:57:34.788528 systemd[1]: session-5.scope: Deactivated successfully. Mar 10 01:57:34.798038 systemd-logind[1537]: Session 5 logged out. Waiting for processes to exit. Mar 10 01:57:34.802944 systemd-logind[1537]: Removed session 5. Mar 10 01:57:34.807839 systemd[1]: Started sshd@5-10.0.0.74:22-10.0.0.1:39632.service - OpenSSH per-connection server daemon (10.0.0.1:39632). 
Mar 10 01:57:34.968051 sshd[1718]: Accepted publickey for core from 10.0.0.1 port 39632 ssh2: RSA SHA256:d2FUFdel+KP9pqsSrlp8nTsY/4RJTtu7ZDVkbTKQqjY Mar 10 01:57:34.974155 sshd-session[1718]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 10 01:57:35.001666 systemd-logind[1537]: New session 6 of user core. Mar 10 01:57:35.023979 systemd[1]: Started session-6.scope - Session 6 of User core. Mar 10 01:57:35.074094 sshd[1721]: Connection closed by 10.0.0.1 port 39632 Mar 10 01:57:35.077304 sshd-session[1718]: pam_unix(sshd:session): session closed for user core Mar 10 01:57:35.111395 systemd[1]: sshd@5-10.0.0.74:22-10.0.0.1:39632.service: Deactivated successfully. Mar 10 01:57:35.121745 systemd[1]: session-6.scope: Deactivated successfully. Mar 10 01:57:35.128918 systemd-logind[1537]: Session 6 logged out. Waiting for processes to exit. Mar 10 01:57:35.136678 systemd[1]: Started sshd@6-10.0.0.74:22-10.0.0.1:39646.service - OpenSSH per-connection server daemon (10.0.0.1:39646). Mar 10 01:57:35.143899 systemd-logind[1537]: Removed session 6. Mar 10 01:57:35.286390 sshd[1727]: Accepted publickey for core from 10.0.0.1 port 39646 ssh2: RSA SHA256:d2FUFdel+KP9pqsSrlp8nTsY/4RJTtu7ZDVkbTKQqjY Mar 10 01:57:35.287286 sshd-session[1727]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 10 01:57:35.319677 systemd-logind[1537]: New session 7 of user core. Mar 10 01:57:35.346419 systemd[1]: Started session-7.scope - Session 7 of User core. 
Mar 10 01:57:35.449898 sudo[1731]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Mar 10 01:57:35.452696 sudo[1731]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 10 01:57:35.510834 sudo[1731]: pam_unix(sudo:session): session closed for user root Mar 10 01:57:35.518277 sshd[1730]: Connection closed by 10.0.0.1 port 39646 Mar 10 01:57:35.521250 sshd-session[1727]: pam_unix(sshd:session): session closed for user core Mar 10 01:57:35.555186 systemd[1]: sshd@6-10.0.0.74:22-10.0.0.1:39646.service: Deactivated successfully. Mar 10 01:57:35.564999 systemd[1]: session-7.scope: Deactivated successfully. Mar 10 01:57:35.570868 systemd-logind[1537]: Session 7 logged out. Waiting for processes to exit. Mar 10 01:57:35.577782 systemd[1]: Started sshd@7-10.0.0.74:22-10.0.0.1:39648.service - OpenSSH per-connection server daemon (10.0.0.1:39648). Mar 10 01:57:35.586688 systemd-logind[1537]: Removed session 7. Mar 10 01:57:35.739424 sshd[1737]: Accepted publickey for core from 10.0.0.1 port 39648 ssh2: RSA SHA256:d2FUFdel+KP9pqsSrlp8nTsY/4RJTtu7ZDVkbTKQqjY Mar 10 01:57:35.742163 sshd-session[1737]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 10 01:57:35.769597 systemd-logind[1537]: New session 8 of user core. Mar 10 01:57:35.783050 systemd[1]: Started session-8.scope - Session 8 of User core. 
Mar 10 01:57:35.826661 sudo[1742]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Mar 10 01:57:35.827216 sudo[1742]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 10 01:57:35.853076 sudo[1742]: pam_unix(sudo:session): session closed for user root Mar 10 01:57:35.883915 sudo[1741]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Mar 10 01:57:35.884580 sudo[1741]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 10 01:57:35.929095 systemd[1]: Starting audit-rules.service - Load Audit Rules... Mar 10 01:57:36.088384 augenrules[1764]: No rules Mar 10 01:57:36.095731 systemd[1]: audit-rules.service: Deactivated successfully. Mar 10 01:57:36.096132 systemd[1]: Finished audit-rules.service - Load Audit Rules. Mar 10 01:57:36.102715 sudo[1741]: pam_unix(sudo:session): session closed for user root Mar 10 01:57:36.108052 sshd[1740]: Connection closed by 10.0.0.1 port 39648 Mar 10 01:57:36.109105 sshd-session[1737]: pam_unix(sshd:session): session closed for user core Mar 10 01:57:36.131212 systemd[1]: sshd@7-10.0.0.74:22-10.0.0.1:39648.service: Deactivated successfully. Mar 10 01:57:36.138624 systemd[1]: session-8.scope: Deactivated successfully. Mar 10 01:57:36.142295 systemd-logind[1537]: Session 8 logged out. Waiting for processes to exit. Mar 10 01:57:36.149818 systemd[1]: Started sshd@8-10.0.0.74:22-10.0.0.1:39652.service - OpenSSH per-connection server daemon (10.0.0.1:39652). Mar 10 01:57:36.157968 systemd-logind[1537]: Removed session 8. Mar 10 01:57:36.398586 sshd[1773]: Accepted publickey for core from 10.0.0.1 port 39652 ssh2: RSA SHA256:d2FUFdel+KP9pqsSrlp8nTsY/4RJTtu7ZDVkbTKQqjY Mar 10 01:57:36.398887 sshd-session[1773]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 10 01:57:36.423684 systemd-logind[1537]: New session 9 of user core. 
Mar 10 01:57:36.447387 systemd[1]: Started session-9.scope - Session 9 of User core. Mar 10 01:57:36.507767 sudo[1777]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Mar 10 01:57:36.511274 sudo[1777]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 10 01:57:37.836290 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Mar 10 01:57:37.858034 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 10 01:57:38.508921 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 10 01:57:38.543340 (kubelet)[1805]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 10 01:57:38.674387 systemd[1]: Starting docker.service - Docker Application Container Engine... Mar 10 01:57:38.730721 (dockerd)[1814]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Mar 10 01:57:38.776952 kubelet[1805]: E0310 01:57:38.776371 1805 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 10 01:57:38.794519 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 10 01:57:38.794872 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 10 01:57:38.797926 systemd[1]: kubelet.service: Consumed 432ms CPU time, 110.9M memory peak. 
Mar 10 01:57:39.324959 dockerd[1814]: time="2026-03-10T01:57:39.322969288Z" level=info msg="Starting up" Mar 10 01:57:39.330790 dockerd[1814]: time="2026-03-10T01:57:39.330705923Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Mar 10 01:57:39.375576 dockerd[1814]: time="2026-03-10T01:57:39.375522326Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s Mar 10 01:57:39.597379 dockerd[1814]: time="2026-03-10T01:57:39.596805321Z" level=info msg="Loading containers: start." Mar 10 01:57:39.621735 kernel: Initializing XFRM netlink socket Mar 10 01:57:40.796821 systemd-networkd[1476]: docker0: Link UP Mar 10 01:57:40.813250 dockerd[1814]: time="2026-03-10T01:57:40.811860661Z" level=info msg="Loading containers: done." Mar 10 01:57:40.868658 dockerd[1814]: time="2026-03-10T01:57:40.868357609Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Mar 10 01:57:40.871023 dockerd[1814]: time="2026-03-10T01:57:40.868870975Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Mar 10 01:57:40.871023 dockerd[1814]: time="2026-03-10T01:57:40.869029181Z" level=info msg="Initializing buildkit" Mar 10 01:57:40.978293 dockerd[1814]: time="2026-03-10T01:57:40.974785726Z" level=info msg="Completed buildkit initialization" Mar 10 01:57:40.996397 dockerd[1814]: time="2026-03-10T01:57:40.992924029Z" level=info msg="Daemon has completed initialization" Mar 10 01:57:40.996397 dockerd[1814]: time="2026-03-10T01:57:40.993022204Z" level=info msg="API listen on /run/docker.sock" Mar 10 01:57:40.993656 systemd[1]: Started docker.service - Docker Application Container Engine. 
Mar 10 01:57:42.568546 containerd[1555]: time="2026-03-10T01:57:42.567828158Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.34.5\"" Mar 10 01:57:43.836834 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1506278045.mount: Deactivated successfully. Mar 10 01:57:48.844525 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Mar 10 01:57:48.887960 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 10 01:57:50.669747 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 10 01:57:50.766663 (kubelet)[2097]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 10 01:57:51.495094 kubelet[2097]: E0310 01:57:51.491846 2097 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 10 01:57:51.504309 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 10 01:57:51.511867 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 10 01:57:51.514759 systemd[1]: kubelet.service: Consumed 1.215s CPU time, 110.8M memory peak. 
Mar 10 01:57:57.771423 containerd[1555]: time="2026-03-10T01:57:57.770589605Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.34.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 10 01:57:57.779045 containerd[1555]: time="2026-03-10T01:57:57.778516560Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.34.5: active requests=0, bytes read=27074497"
Mar 10 01:57:57.783610 containerd[1555]: time="2026-03-10T01:57:57.783463830Z" level=info msg="ImageCreate event name:\"sha256:364ea2876e41b29691964751b6217cd2e343433690fbe16a5c6a236042684df3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 10 01:57:57.796199 containerd[1555]: time="2026-03-10T01:57:57.795829463Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:c548633fcd3b4aad59b70815be4c8be54a0fddaddc3fcffa9371eedb0e96417a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 10 01:57:57.803075 containerd[1555]: time="2026-03-10T01:57:57.802807112Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.34.5\" with image id \"sha256:364ea2876e41b29691964751b6217cd2e343433690fbe16a5c6a236042684df3\", repo tag \"registry.k8s.io/kube-apiserver:v1.34.5\", repo digest \"registry.k8s.io/kube-apiserver@sha256:c548633fcd3b4aad59b70815be4c8be54a0fddaddc3fcffa9371eedb0e96417a\", size \"27071096\" in 15.234639224s"
Mar 10 01:57:57.803075 containerd[1555]: time="2026-03-10T01:57:57.802949985Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.34.5\" returns image reference \"sha256:364ea2876e41b29691964751b6217cd2e343433690fbe16a5c6a236042684df3\""
Mar 10 01:57:57.807366 containerd[1555]: time="2026-03-10T01:57:57.807261316Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.34.5\""
Mar 10 01:58:01.582029 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3.
Mar 10 01:58:01.608243 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 10 01:58:02.457391 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 10 01:58:02.487914 (kubelet)[2118]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 10 01:58:03.120220 containerd[1555]: time="2026-03-10T01:58:03.116950143Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.34.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 10 01:58:03.138297 containerd[1555]: time="2026-03-10T01:58:03.137974543Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.34.5: active requests=0, bytes read=21165823"
Mar 10 01:58:03.167039 containerd[1555]: time="2026-03-10T01:58:03.160140935Z" level=info msg="ImageCreate event name:\"sha256:8926c34822743bb97f9003f92c30127bfeaad8bed71cd36f1c861ed8fda2c154\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 10 01:58:03.906230 containerd[1555]: time="2026-03-10T01:58:03.896300106Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:f0426100c873816560c520d542fa28999a98dad909edd04365f3b0eead790da3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 10 01:58:04.412907 containerd[1555]: time="2026-03-10T01:58:04.187742277Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.34.5\" with image id \"sha256:8926c34822743bb97f9003f92c30127bfeaad8bed71cd36f1c861ed8fda2c154\", repo tag \"registry.k8s.io/kube-controller-manager:v1.34.5\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:f0426100c873816560c520d542fa28999a98dad909edd04365f3b0eead790da3\", size \"22822771\" in 6.347661234s"
Mar 10 01:58:04.412907 containerd[1555]: time="2026-03-10T01:58:04.403181759Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.34.5\" returns image reference \"sha256:8926c34822743bb97f9003f92c30127bfeaad8bed71cd36f1c861ed8fda2c154\""
Mar 10 01:58:04.509783 containerd[1555]: time="2026-03-10T01:58:04.493783032Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.34.5\""
Mar 10 01:58:04.822741 kubelet[2118]: E0310 01:58:04.822396 2118 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 10 01:58:04.883328 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 10 01:58:05.004579 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 10 01:58:05.051260 systemd[1]: kubelet.service: Consumed 2.424s CPU time, 110.3M memory peak.
Mar 10 01:58:06.323430 update_engine[1544]: I20260310 01:58:06.318266 1544 update_attempter.cc:509] Updating boot flags...
Mar 10 01:58:11.015677 containerd[1555]: time="2026-03-10T01:58:11.009905733Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.34.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 10 01:58:11.015677 containerd[1555]: time="2026-03-10T01:58:11.016038204Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.34.5: active requests=0, bytes read=15729824"
Mar 10 01:58:11.019807 containerd[1555]: time="2026-03-10T01:58:11.019375211Z" level=info msg="ImageCreate event name:\"sha256:f6b3520b1732b4980b2528fe5622e62be26bb6a8d38da81349cb6ccd3a1e6d65\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 10 01:58:11.056525 containerd[1555]: time="2026-03-10T01:58:11.055787582Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:b67b0d627c8e99ffa362bd4d9a60ca9a6c449e363a5f88d2aa8c224bd84ca51d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 10 01:58:11.073051 containerd[1555]: time="2026-03-10T01:58:11.071798765Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.34.5\" with image id \"sha256:f6b3520b1732b4980b2528fe5622e62be26bb6a8d38da81349cb6ccd3a1e6d65\", repo tag \"registry.k8s.io/kube-scheduler:v1.34.5\", repo digest \"registry.k8s.io/kube-scheduler@sha256:b67b0d627c8e99ffa362bd4d9a60ca9a6c449e363a5f88d2aa8c224bd84ca51d\", size \"17386790\" in 6.559500954s"
Mar 10 01:58:11.073051 containerd[1555]: time="2026-03-10T01:58:11.071888792Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.34.5\" returns image reference \"sha256:f6b3520b1732b4980b2528fe5622e62be26bb6a8d38da81349cb6ccd3a1e6d65\""
Mar 10 01:58:11.080064 containerd[1555]: time="2026-03-10T01:58:11.079989164Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.34.5\""
Mar 10 01:58:15.081434 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3453038205.mount: Deactivated successfully.
Mar 10 01:58:15.098078 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4.
Mar 10 01:58:15.110680 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 10 01:58:16.388605 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 10 01:58:16.430125 (kubelet)[2162]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 10 01:58:16.714126 kubelet[2162]: E0310 01:58:16.712857 2162 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 10 01:58:16.726760 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 10 01:58:16.727015 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 10 01:58:16.727612 systemd[1]: kubelet.service: Consumed 851ms CPU time, 111.3M memory peak.
Mar 10 01:58:19.081794 containerd[1555]: time="2026-03-10T01:58:19.081147686Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.34.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 10 01:58:19.085957 containerd[1555]: time="2026-03-10T01:58:19.085143273Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.34.5: active requests=0, bytes read=25861770"
Mar 10 01:58:19.091210 containerd[1555]: time="2026-03-10T01:58:19.091096885Z" level=info msg="ImageCreate event name:\"sha256:38728cde323c302ed9eca4f1b7c0080d17db50144e39398fcf901d9df13f0c3e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 10 01:58:19.098158 containerd[1555]: time="2026-03-10T01:58:19.097831112Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:8a22a3bf452d07af3b5a3064b089d2ad6579d5dd3b850386e05cc0f36dc3f4cf\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 10 01:58:19.105052 containerd[1555]: time="2026-03-10T01:58:19.100069495Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.34.5\" with image id \"sha256:38728cde323c302ed9eca4f1b7c0080d17db50144e39398fcf901d9df13f0c3e\", repo tag \"registry.k8s.io/kube-proxy:v1.34.5\", repo digest \"registry.k8s.io/kube-proxy@sha256:8a22a3bf452d07af3b5a3064b089d2ad6579d5dd3b850386e05cc0f36dc3f4cf\", size \"25860789\" in 8.01983832s"
Mar 10 01:58:19.105052 containerd[1555]: time="2026-03-10T01:58:19.100147849Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.34.5\" returns image reference \"sha256:38728cde323c302ed9eca4f1b7c0080d17db50144e39398fcf901d9df13f0c3e\""
Mar 10 01:58:19.105734 containerd[1555]: time="2026-03-10T01:58:19.105700691Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.1\""
Mar 10 01:58:20.352014 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3053690812.mount: Deactivated successfully.
Mar 10 01:58:26.661062 containerd[1555]: time="2026-03-10T01:58:26.658135262Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.12.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 10 01:58:26.673772 containerd[1555]: time="2026-03-10T01:58:26.659556636Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.12.1: active requests=0, bytes read=22388007"
Mar 10 01:58:26.675663 containerd[1555]: time="2026-03-10T01:58:26.674610738Z" level=info msg="ImageCreate event name:\"sha256:52546a367cc9e0d924aa3b190596a9167fa6e53245023b5b5baf0f07e5443969\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 10 01:58:26.680285 containerd[1555]: time="2026-03-10T01:58:26.679202266Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:e8c262566636e6bc340ece6473b0eed193cad045384401529721ddbe6463d31c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 10 01:58:26.682533 containerd[1555]: time="2026-03-10T01:58:26.681878984Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.12.1\" with image id \"sha256:52546a367cc9e0d924aa3b190596a9167fa6e53245023b5b5baf0f07e5443969\", repo tag \"registry.k8s.io/coredns/coredns:v1.12.1\", repo digest \"registry.k8s.io/coredns/coredns@sha256:e8c262566636e6bc340ece6473b0eed193cad045384401529721ddbe6463d31c\", size \"22384805\" in 7.576052821s"
Mar 10 01:58:26.682533 containerd[1555]: time="2026-03-10T01:58:26.681990072Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.1\" returns image reference \"sha256:52546a367cc9e0d924aa3b190596a9167fa6e53245023b5b5baf0f07e5443969\""
Mar 10 01:58:26.685412 containerd[1555]: time="2026-03-10T01:58:26.684935562Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10.1\""
Mar 10 01:58:26.825234 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 5.
Mar 10 01:58:26.842433 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 10 01:58:28.192807 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1231891256.mount: Deactivated successfully.
Mar 10 01:58:28.288568 containerd[1555]: time="2026-03-10T01:58:28.287959769Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 10 01:58:28.293690 containerd[1555]: time="2026-03-10T01:58:28.293255584Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10.1: active requests=0, bytes read=321218"
Mar 10 01:58:28.298541 containerd[1555]: time="2026-03-10T01:58:28.298082623Z" level=info msg="ImageCreate event name:\"sha256:cd073f4c5f6a8e9dc6f3125ba00cf60819cae95c1ec84a1f146ee4a9cf9e803f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 10 01:58:28.305814 containerd[1555]: time="2026-03-10T01:58:28.305673617Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 10 01:58:28.307410 containerd[1555]: time="2026-03-10T01:58:28.306698146Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10.1\" with image id \"sha256:cd073f4c5f6a8e9dc6f3125ba00cf60819cae95c1ec84a1f146ee4a9cf9e803f\", repo tag \"registry.k8s.io/pause:3.10.1\", repo digest \"registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c\", size \"320448\" in 1.621692871s"
Mar 10 01:58:28.307410 containerd[1555]: time="2026-03-10T01:58:28.306781375Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10.1\" returns image reference \"sha256:cd073f4c5f6a8e9dc6f3125ba00cf60819cae95c1ec84a1f146ee4a9cf9e803f\""
Mar 10 01:58:28.318007 containerd[1555]: time="2026-03-10T01:58:28.312544053Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.6.5-0\""
Mar 10 01:58:28.453179 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 10 01:58:28.493671 (kubelet)[2235]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 10 01:58:29.843577 kubelet[2235]: E0310 01:58:29.837717 2235 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 10 01:58:29.849445 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 10 01:58:29.850664 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 10 01:58:29.857577 systemd[1]: kubelet.service: Consumed 1.907s CPU time, 112.4M memory peak.
Mar 10 01:58:30.087920 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount454727359.mount: Deactivated successfully.
Mar 10 01:58:38.601132 containerd[1555]: time="2026-03-10T01:58:38.599394296Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.6.5-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 10 01:58:38.604797 containerd[1555]: time="2026-03-10T01:58:38.604537553Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.6.5-0: active requests=0, bytes read=22860674"
Mar 10 01:58:38.620094 containerd[1555]: time="2026-03-10T01:58:38.619707322Z" level=info msg="ImageCreate event name:\"sha256:a3e246e9556e93d71e2850085ba581b376c76a9187b4b8a01c120f86579ef2b1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 10 01:58:38.634021 containerd[1555]: time="2026-03-10T01:58:38.632805311Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:042ef9c02799eb9303abf1aa99b09f09d94b8ee3ba0c2dd3f42dc4e1d3dce534\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 10 01:58:38.647993 containerd[1555]: time="2026-03-10T01:58:38.646634325Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.6.5-0\" with image id \"sha256:a3e246e9556e93d71e2850085ba581b376c76a9187b4b8a01c120f86579ef2b1\", repo tag \"registry.k8s.io/etcd:3.6.5-0\", repo digest \"registry.k8s.io/etcd@sha256:042ef9c02799eb9303abf1aa99b09f09d94b8ee3ba0c2dd3f42dc4e1d3dce534\", size \"22871747\" in 10.334024136s"
Mar 10 01:58:38.647993 containerd[1555]: time="2026-03-10T01:58:38.646779986Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.6.5-0\" returns image reference \"sha256:a3e246e9556e93d71e2850085ba581b376c76a9187b4b8a01c120f86579ef2b1\""
Mar 10 01:58:40.073335 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 6.
Mar 10 01:58:40.083744 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 10 01:58:40.762355 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 10 01:58:40.789411 (kubelet)[2338]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 10 01:58:41.248812 kubelet[2338]: E0310 01:58:41.248409 2338 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 10 01:58:41.260868 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 10 01:58:41.261155 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 10 01:58:41.261884 systemd[1]: kubelet.service: Consumed 824ms CPU time, 110.6M memory peak.
Mar 10 01:58:45.771868 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 10 01:58:45.773734 systemd[1]: kubelet.service: Consumed 824ms CPU time, 110.6M memory peak.
Mar 10 01:58:45.786019 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 10 01:58:45.848156 systemd[1]: Reload requested from client PID 2354 ('systemctl') (unit session-9.scope)...
Mar 10 01:58:45.848278 systemd[1]: Reloading...
Mar 10 01:58:46.060651 zram_generator::config[2397]: No configuration found.
Mar 10 01:58:47.745098 systemd[1]: Reloading finished in 1895 ms.
Mar 10 01:58:47.984828 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM
Mar 10 01:58:47.985081 systemd[1]: kubelet.service: Failed with result 'signal'.
Mar 10 01:58:47.985747 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 10 01:58:47.985819 systemd[1]: kubelet.service: Consumed 1.352s CPU time, 98.1M memory peak.
Mar 10 01:58:47.996600 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 10 01:58:48.409695 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 10 01:58:48.448347 (kubelet)[2444]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Mar 10 01:58:49.025244 kubelet[2444]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Mar 10 01:58:49.025244 kubelet[2444]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 10 01:58:49.025244 kubelet[2444]: I0310 01:58:49.025068 2444 server.go:213] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Mar 10 01:58:51.041006 kubelet[2444]: I0310 01:58:51.040147 2444 server.go:529] "Kubelet version" kubeletVersion="v1.34.4"
Mar 10 01:58:51.041006 kubelet[2444]: I0310 01:58:51.040269 2444 server.go:531] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 10 01:58:51.041006 kubelet[2444]: I0310 01:58:51.040585 2444 watchdog_linux.go:95] "Systemd watchdog is not enabled"
Mar 10 01:58:51.041006 kubelet[2444]: I0310 01:58:51.040612 2444 watchdog_linux.go:137] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Mar 10 01:58:51.043315 kubelet[2444]: I0310 01:58:51.043239 2444 server.go:956] "Client rotation is on, will bootstrap in background"
Mar 10 01:58:51.088304 kubelet[2444]: E0310 01:58:51.087906 2444 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://10.0.0.74:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.0.0.74:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError"
Mar 10 01:58:51.092395 kubelet[2444]: I0310 01:58:51.091673 2444 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Mar 10 01:58:51.124865 kubelet[2444]: I0310 01:58:51.124598 2444 server.go:1423] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Mar 10 01:58:51.140876 kubelet[2444]: I0310 01:58:51.139368 2444 server.go:781] "--cgroups-per-qos enabled, but --cgroup-root was not specified. Defaulting to /"
Mar 10 01:58:51.142249 kubelet[2444]: I0310 01:58:51.141275 2444 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 10 01:58:51.142249 kubelet[2444]: I0310 01:58:51.141649 2444 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 10 01:58:51.143374 kubelet[2444]: I0310 01:58:51.143064 2444 topology_manager.go:138] "Creating topology manager with none policy"
Mar 10 01:58:51.144272 kubelet[2444]: I0310 01:58:51.143710 2444 container_manager_linux.go:306] "Creating device plugin manager"
Mar 10 01:58:51.144272 kubelet[2444]: I0310 01:58:51.143907 2444 container_manager_linux.go:315] "Creating Dynamic Resource Allocation (DRA) manager"
Mar 10 01:58:51.153501 kubelet[2444]: I0310 01:58:51.151097 2444 state_mem.go:36] "Initialized new in-memory state store"
Mar 10 01:58:51.153501 kubelet[2444]: I0310 01:58:51.151762 2444 kubelet.go:475] "Attempting to sync node with API server"
Mar 10 01:58:51.155169 kubelet[2444]: I0310 01:58:51.154120 2444 kubelet.go:376] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 10 01:58:51.155169 kubelet[2444]: I0310 01:58:51.154241 2444 kubelet.go:387] "Adding apiserver pod source"
Mar 10 01:58:51.155169 kubelet[2444]: I0310 01:58:51.154526 2444 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 10 01:58:51.163338 kubelet[2444]: E0310 01:58:51.163051 2444 reflector.go:205] "Failed to watch" err="failed to list *v1.Service: Get \"https://10.0.0.74:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.0.74:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Mar 10 01:58:51.165032 kubelet[2444]: E0310 01:58:51.161693 2444 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.0.0.74:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 10.0.0.74:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Mar 10 01:58:51.173135 kubelet[2444]: I0310 01:58:51.172977 2444 kuberuntime_manager.go:291] "Container runtime initialized" containerRuntime="containerd" version="v2.0.7" apiVersion="v1"
Mar 10 01:58:51.174068 kubelet[2444]: I0310 01:58:51.173961 2444 kubelet.go:940] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Mar 10 01:58:51.174068 kubelet[2444]: I0310 01:58:51.174046 2444 kubelet.go:964] "Not starting PodCertificateRequest manager because we are in static kubelet mode or the PodCertificateProjection feature gate is disabled"
Mar 10 01:58:51.177270 kubelet[2444]: W0310 01:58:51.174385 2444 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating.
Mar 10 01:58:52.193385 kubelet[2444]: I0310 01:58:52.193356 2444 server.go:1262] "Started kubelet"
Mar 10 01:58:52.199402 kubelet[2444]: I0310 01:58:52.199260 2444 ratelimit.go:56] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Mar 10 01:58:52.200026 kubelet[2444]: I0310 01:58:52.199637 2444 server_v1.go:49] "podresources" method="list" useActivePods=true
Mar 10 01:58:52.208004 kubelet[2444]: I0310 01:58:52.207722 2444 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Mar 10 01:58:52.211735 kubelet[2444]: E0310 01:58:52.210183 2444 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.0.0.74:6443/api/v1/namespaces/default/events\": dial tcp 10.0.0.74:6443: connect: connection refused" event="&Event{ObjectMeta:{localhost.189b58451ecd6ef3 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:localhost,UID:localhost,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:localhost,},FirstTimestamp:2026-03-10 01:58:52.193189619 +0000 UTC m=+3.706851930,LastTimestamp:2026-03-10 01:58:52.193189619 +0000 UTC m=+3.706851930,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:localhost,}"
Mar 10 01:58:52.211978 kubelet[2444]: I0310 01:58:52.211741 2444 server.go:249] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Mar 10 01:58:52.212025 kubelet[2444]: I0310 01:58:52.211940 2444 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Mar 10 01:58:52.213549 kubelet[2444]: I0310 01:58:52.213406 2444 server.go:310] "Adding debug handlers to kubelet server"
Mar 10 01:58:52.216715 kubelet[2444]: I0310 01:58:52.214309 2444 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Mar 10 01:58:52.216715 kubelet[2444]: I0310 01:58:52.216032 2444 volume_manager.go:313] "Starting Kubelet Volume Manager"
Mar 10 01:58:52.216715 kubelet[2444]: E0310 01:58:52.216222 2444 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"localhost\" not found"
Mar 10 01:58:52.218721 kubelet[2444]: I0310 01:58:52.218660 2444 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Mar 10 01:58:52.220030 kubelet[2444]: I0310 01:58:52.218980 2444 reconciler.go:29] "Reconciler: start to sync state"
Mar 10 01:58:52.220250 kubelet[2444]: E0310 01:58:52.220224 2444 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.74:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.74:6443: connect: connection refused" interval="200ms"
Mar 10 01:58:52.221564 kubelet[2444]: E0310 01:58:52.221531 2444 reflector.go:205] "Failed to watch" err="failed to list *v1.Service: Get \"https://10.0.0.74:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.0.74:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Mar 10 01:58:52.224145 kubelet[2444]: E0310 01:58:52.223426 2444 reflector.go:205] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://10.0.0.74:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.0.74:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Mar 10 01:58:52.227686 kubelet[2444]: I0310 01:58:52.226144 2444 factory.go:223] Registration of the systemd container factory successfully
Mar 10 01:58:52.227686 kubelet[2444]: I0310 01:58:52.226240 2444 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Mar 10 01:58:52.231852 kubelet[2444]: E0310 01:58:52.231229 2444 kubelet.go:1615] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Mar 10 01:58:52.232253 kubelet[2444]: I0310 01:58:52.232165 2444 factory.go:223] Registration of the containerd container factory successfully
Mar 10 01:58:52.263233 kubelet[2444]: I0310 01:58:52.262601 2444 cpu_manager.go:221] "Starting CPU manager" policy="none"
Mar 10 01:58:52.263233 kubelet[2444]: I0310 01:58:52.262653 2444 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s"
Mar 10 01:58:52.263233 kubelet[2444]: I0310 01:58:52.262679 2444 state_mem.go:36] "Initialized new in-memory state store"
Mar 10 01:58:52.278815 kubelet[2444]: I0310 01:58:52.277790 2444 policy_none.go:49] "None policy: Start"
Mar 10 01:58:52.278815 kubelet[2444]: I0310 01:58:52.278178 2444 memory_manager.go:187] "Starting memorymanager" policy="None"
Mar 10 01:58:52.278815 kubelet[2444]: I0310 01:58:52.278781 2444 state_mem.go:36] "Initializing new in-memory state store" logger="Memory Manager state checkpoint"
Mar 10 01:58:52.284083 kubelet[2444]: I0310 01:58:52.283138 2444 policy_none.go:47] "Start"
Mar 10 01:58:52.306676 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice.
Mar 10 01:58:52.319409 kubelet[2444]: E0310 01:58:52.316875 2444 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"localhost\" not found"
Mar 10 01:58:52.355268 kubelet[2444]: I0310 01:58:52.353314 2444 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv4"
Mar 10 01:58:52.379527 kubelet[2444]: E0310 01:58:52.376716 2444 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.0.0.74:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 10.0.0.74:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Mar 10 01:58:52.379527 kubelet[2444]: I0310 01:58:52.377894 2444 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv6"
Mar 10 01:58:52.379527 kubelet[2444]: I0310 01:58:52.378062 2444 status_manager.go:244] "Starting to sync pod status with apiserver"
Mar 10 01:58:52.379527 kubelet[2444]: I0310 01:58:52.378200 2444 kubelet.go:2428] "Starting kubelet main sync loop"
Mar 10 01:58:52.380332 kubelet[2444]: E0310 01:58:52.380191 2444 kubelet.go:2452] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Mar 10 01:58:52.380973 kubelet[2444]: E0310 01:58:52.380885 2444 reflector.go:205] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://10.0.0.74:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.0.74:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass"
Mar 10 01:58:52.384632 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice.
Mar 10 01:58:52.406106 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice.
Mar 10 01:58:52.421905 kubelet[2444]: E0310 01:58:52.420336 2444 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"localhost\" not found" Mar 10 01:58:52.441976 kubelet[2444]: E0310 01:58:52.440298 2444 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.74:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.74:6443: connect: connection refused" interval="400ms" Mar 10 01:58:52.452833 kubelet[2444]: E0310 01:58:52.452240 2444 manager.go:513] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Mar 10 01:58:52.458315 kubelet[2444]: I0310 01:58:52.457300 2444 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 10 01:58:52.458315 kubelet[2444]: I0310 01:58:52.457354 2444 container_log_manager.go:146] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 10 01:58:52.458614 kubelet[2444]: I0310 01:58:52.458550 2444 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 10 01:58:52.486575 kubelet[2444]: E0310 01:58:52.486260 2444 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" Mar 10 01:58:52.486575 kubelet[2444]: E0310 01:58:52.486402 2444 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"localhost\" not found" Mar 10 01:58:52.523873 kubelet[2444]: I0310 01:58:52.523360 2444 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/08e735c55be756ff8bb8189e244e30b0-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"08e735c55be756ff8bb8189e244e30b0\") " pod="kube-system/kube-apiserver-localhost" Mar 10 01:58:52.523873 kubelet[2444]: I0310 01:58:52.523832 2444 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/db0989cdb653dfec284dd4f35625e9e7-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"db0989cdb653dfec284dd4f35625e9e7\") " pod="kube-system/kube-controller-manager-localhost" Mar 10 01:58:52.523873 kubelet[2444]: I0310 01:58:52.523911 2444 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/db0989cdb653dfec284dd4f35625e9e7-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"db0989cdb653dfec284dd4f35625e9e7\") " pod="kube-system/kube-controller-manager-localhost" Mar 10 01:58:52.523873 kubelet[2444]: I0310 01:58:52.524129 2444 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/db0989cdb653dfec284dd4f35625e9e7-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"db0989cdb653dfec284dd4f35625e9e7\") " pod="kube-system/kube-controller-manager-localhost" Mar 10 01:58:52.536783 kubelet[2444]: I0310 01:58:52.524325 2444 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/db0989cdb653dfec284dd4f35625e9e7-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"db0989cdb653dfec284dd4f35625e9e7\") " pod="kube-system/kube-controller-manager-localhost" Mar 10 01:58:52.536783 kubelet[2444]: I0310 01:58:52.524541 2444 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/db0989cdb653dfec284dd4f35625e9e7-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"db0989cdb653dfec284dd4f35625e9e7\") " pod="kube-system/kube-controller-manager-localhost" Mar 10 01:58:52.536783 kubelet[2444]: I0310 01:58:52.524652 2444 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/08e735c55be756ff8bb8189e244e30b0-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"08e735c55be756ff8bb8189e244e30b0\") " pod="kube-system/kube-apiserver-localhost" Mar 10 01:58:52.536783 kubelet[2444]: I0310 01:58:52.524815 2444 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/08e735c55be756ff8bb8189e244e30b0-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"08e735c55be756ff8bb8189e244e30b0\") " pod="kube-system/kube-apiserver-localhost" Mar 10 01:58:52.628592 kubelet[2444]: I0310 01:58:52.626435 2444 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/89efda49e166906783d8d868d41ebb86-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"89efda49e166906783d8d868d41ebb86\") " pod="kube-system/kube-scheduler-localhost" Mar 10 01:58:52.630039 kubelet[2444]: I0310 01:58:52.629863 2444 kubelet_node_status.go:75] "Attempting to register 
node" node="localhost" Mar 10 01:58:52.632099 kubelet[2444]: E0310 01:58:52.631727 2444 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.0.74:6443/api/v1/nodes\": dial tcp 10.0.0.74:6443: connect: connection refused" node="localhost" Mar 10 01:58:52.651676 systemd[1]: Created slice kubepods-burstable-pod08e735c55be756ff8bb8189e244e30b0.slice - libcontainer container kubepods-burstable-pod08e735c55be756ff8bb8189e244e30b0.slice. Mar 10 01:58:52.701299 kubelet[2444]: E0310 01:58:52.701217 2444 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Mar 10 01:58:52.720232 systemd[1]: Created slice kubepods-burstable-poddb0989cdb653dfec284dd4f35625e9e7.slice - libcontainer container kubepods-burstable-poddb0989cdb653dfec284dd4f35625e9e7.slice. Mar 10 01:58:52.724770 containerd[1555]: time="2026-03-10T01:58:52.722633862Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:08e735c55be756ff8bb8189e244e30b0,Namespace:kube-system,Attempt:0,}" Mar 10 01:58:52.751744 kubelet[2444]: E0310 01:58:52.751637 2444 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Mar 10 01:58:52.765340 systemd[1]: Created slice kubepods-burstable-pod89efda49e166906783d8d868d41ebb86.slice - libcontainer container kubepods-burstable-pod89efda49e166906783d8d868d41ebb86.slice. 
Mar 10 01:58:52.777262 containerd[1555]: time="2026-03-10T01:58:52.776250750Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:db0989cdb653dfec284dd4f35625e9e7,Namespace:kube-system,Attempt:0,}" Mar 10 01:58:52.813296 kubelet[2444]: E0310 01:58:52.813138 2444 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Mar 10 01:58:52.845712 kubelet[2444]: I0310 01:58:52.842564 2444 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Mar 10 01:58:52.845712 kubelet[2444]: E0310 01:58:52.845002 2444 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.0.74:6443/api/v1/nodes\": dial tcp 10.0.0.74:6443: connect: connection refused" node="localhost" Mar 10 01:58:52.846926 containerd[1555]: time="2026-03-10T01:58:52.846334418Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:89efda49e166906783d8d868d41ebb86,Namespace:kube-system,Attempt:0,}" Mar 10 01:58:52.865287 kubelet[2444]: E0310 01:58:52.864746 2444 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.74:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.74:6443: connect: connection refused" interval="800ms" Mar 10 01:58:53.254970 kubelet[2444]: E0310 01:58:53.254604 2444 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://10.0.0.74:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.0.0.74:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError" Mar 10 01:58:53.261074 kubelet[2444]: I0310 01:58:53.260952 2444 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Mar 10 01:58:53.274426 
kubelet[2444]: E0310 01:58:53.273144 2444 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.0.74:6443/api/v1/nodes\": dial tcp 10.0.0.74:6443: connect: connection refused" node="localhost" Mar 10 01:58:53.582217 kubelet[2444]: E0310 01:58:53.580823 2444 reflector.go:205] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://10.0.0.74:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.0.74:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Mar 10 01:58:53.703900 kubelet[2444]: E0310 01:58:53.702421 2444 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.74:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.74:6443: connect: connection refused" interval="1.6s" Mar 10 01:58:53.826406 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3070880056.mount: Deactivated successfully. 
Mar 10 01:58:53.897633 containerd[1555]: time="2026-03-10T01:58:53.895432657Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 10 01:58:53.903380 containerd[1555]: time="2026-03-10T01:58:53.903277588Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321138" Mar 10 01:58:53.909870 kubelet[2444]: E0310 01:58:53.909761 2444 reflector.go:205] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://10.0.0.74:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.0.74:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Mar 10 01:58:53.913661 containerd[1555]: time="2026-03-10T01:58:53.913228085Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 10 01:58:53.919920 containerd[1555]: time="2026-03-10T01:58:53.919322079Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 10 01:58:53.922985 containerd[1555]: time="2026-03-10T01:58:53.922839311Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 10 01:58:53.927167 containerd[1555]: time="2026-03-10T01:58:53.926314341Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0" Mar 10 01:58:53.930534 containerd[1555]: time="2026-03-10T01:58:53.930375995Z" level=info msg="stop pulling image 
registry.k8s.io/pause:3.10: active requests=0, bytes read=0" Mar 10 01:58:53.944840 containerd[1555]: time="2026-03-10T01:58:53.944231630Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 10 01:58:53.957200 containerd[1555]: time="2026-03-10T01:58:53.957038642Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 1.221979173s" Mar 10 01:58:53.960927 containerd[1555]: time="2026-03-10T01:58:53.960157731Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 1.091477487s" Mar 10 01:58:53.998914 containerd[1555]: time="2026-03-10T01:58:53.997838990Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 1.211428774s" Mar 10 01:58:54.114737 kubelet[2444]: I0310 01:58:54.084997 2444 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Mar 10 01:58:54.114737 kubelet[2444]: E0310 01:58:54.086024 2444 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.0.74:6443/api/v1/nodes\": dial tcp 10.0.0.74:6443: connect: 
connection refused" node="localhost" Mar 10 01:58:54.114737 kubelet[2444]: E0310 01:58:54.107787 2444 reflector.go:205] "Failed to watch" err="failed to list *v1.Service: Get \"https://10.0.0.74:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.0.74:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Mar 10 01:58:54.155992 kubelet[2444]: E0310 01:58:54.155377 2444 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.0.0.74:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 10.0.0.74:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Mar 10 01:58:54.629400 containerd[1555]: time="2026-03-10T01:58:54.629211401Z" level=info msg="connecting to shim 6265293fefd8ec4ede52e937f569c2634da75b6da0e5fe4d79c006d2f1de913b" address="unix:///run/containerd/s/43a09ad5e60ab68ad3b3572b261b56851e385c5e545441f6e8ede32eaacddaee" namespace=k8s.io protocol=ttrpc version=3 Mar 10 01:58:54.655526 containerd[1555]: time="2026-03-10T01:58:54.654665649Z" level=info msg="connecting to shim cf7f0c0955503763fe527a42a9d3b435c706951bfe0958401972d71ab517db59" address="unix:///run/containerd/s/2dce3ca54ec9c87229b06762f2a2ef1150160b88b05f5df8f9c40e346d72b329" namespace=k8s.io protocol=ttrpc version=3 Mar 10 01:58:54.690971 containerd[1555]: time="2026-03-10T01:58:54.690919360Z" level=info msg="connecting to shim 564936535e353ff643864d3414f9936851d3088f2b11c01dac01bb7e838e4525" address="unix:///run/containerd/s/fa2a2b8184b0f587b501165a4fcf3832aa7cfbbbe01b17a5fc0ee1c9998a2528" namespace=k8s.io protocol=ttrpc version=3 Mar 10 01:58:55.100045 systemd[1]: Started cri-containerd-564936535e353ff643864d3414f9936851d3088f2b11c01dac01bb7e838e4525.scope - libcontainer container 
564936535e353ff643864d3414f9936851d3088f2b11c01dac01bb7e838e4525. Mar 10 01:58:55.109937 systemd[1]: Started cri-containerd-6265293fefd8ec4ede52e937f569c2634da75b6da0e5fe4d79c006d2f1de913b.scope - libcontainer container 6265293fefd8ec4ede52e937f569c2634da75b6da0e5fe4d79c006d2f1de913b. Mar 10 01:58:55.143385 systemd[1]: Started cri-containerd-cf7f0c0955503763fe527a42a9d3b435c706951bfe0958401972d71ab517db59.scope - libcontainer container cf7f0c0955503763fe527a42a9d3b435c706951bfe0958401972d71ab517db59. Mar 10 01:58:55.303679 containerd[1555]: time="2026-03-10T01:58:55.303604769Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:89efda49e166906783d8d868d41ebb86,Namespace:kube-system,Attempt:0,} returns sandbox id \"564936535e353ff643864d3414f9936851d3088f2b11c01dac01bb7e838e4525\"" Mar 10 01:58:55.307522 kubelet[2444]: E0310 01:58:55.306770 2444 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.74:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.74:6443: connect: connection refused" interval="3.2s" Mar 10 01:58:55.387380 containerd[1555]: time="2026-03-10T01:58:55.385152899Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:db0989cdb653dfec284dd4f35625e9e7,Namespace:kube-system,Attempt:0,} returns sandbox id \"cf7f0c0955503763fe527a42a9d3b435c706951bfe0958401972d71ab517db59\"" Mar 10 01:58:55.399842 containerd[1555]: time="2026-03-10T01:58:55.399679306Z" level=info msg="CreateContainer within sandbox \"564936535e353ff643864d3414f9936851d3088f2b11c01dac01bb7e838e4525\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Mar 10 01:58:55.404320 containerd[1555]: time="2026-03-10T01:58:55.402836974Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:08e735c55be756ff8bb8189e244e30b0,Namespace:kube-system,Attempt:0,} returns sandbox id 
\"6265293fefd8ec4ede52e937f569c2634da75b6da0e5fe4d79c006d2f1de913b\"" Mar 10 01:58:55.404320 containerd[1555]: time="2026-03-10T01:58:55.404285727Z" level=info msg="CreateContainer within sandbox \"cf7f0c0955503763fe527a42a9d3b435c706951bfe0958401972d71ab517db59\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Mar 10 01:58:55.442099 containerd[1555]: time="2026-03-10T01:58:55.440606553Z" level=info msg="CreateContainer within sandbox \"6265293fefd8ec4ede52e937f569c2634da75b6da0e5fe4d79c006d2f1de913b\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Mar 10 01:58:55.494593 containerd[1555]: time="2026-03-10T01:58:55.491067459Z" level=info msg="Container 2a1199f80d2215f98eef4b67fa3141b339186fb05a99f71c090340200953b2b7: CDI devices from CRI Config.CDIDevices: []" Mar 10 01:58:55.513962 containerd[1555]: time="2026-03-10T01:58:55.513690270Z" level=info msg="Container 82a679beba372a44d5940cce62414106e12d476304ab09f74dfd29de7c7d608d: CDI devices from CRI Config.CDIDevices: []" Mar 10 01:58:55.559290 containerd[1555]: time="2026-03-10T01:58:55.559113649Z" level=info msg="CreateContainer within sandbox \"cf7f0c0955503763fe527a42a9d3b435c706951bfe0958401972d71ab517db59\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"2a1199f80d2215f98eef4b67fa3141b339186fb05a99f71c090340200953b2b7\"" Mar 10 01:58:55.559688 containerd[1555]: time="2026-03-10T01:58:55.559315211Z" level=info msg="Container d8677cf96fc993f059a6de7516d0bb68d2573036f338097d5e50f2ffb88bee1d: CDI devices from CRI Config.CDIDevices: []" Mar 10 01:58:55.577735 containerd[1555]: time="2026-03-10T01:58:55.576058619Z" level=info msg="StartContainer for \"2a1199f80d2215f98eef4b67fa3141b339186fb05a99f71c090340200953b2b7\"" Mar 10 01:58:55.589378 containerd[1555]: time="2026-03-10T01:58:55.589149910Z" level=info msg="connecting to shim 2a1199f80d2215f98eef4b67fa3141b339186fb05a99f71c090340200953b2b7" 
address="unix:///run/containerd/s/2dce3ca54ec9c87229b06762f2a2ef1150160b88b05f5df8f9c40e346d72b329" protocol=ttrpc version=3 Mar 10 01:58:55.603224 containerd[1555]: time="2026-03-10T01:58:55.603021140Z" level=info msg="CreateContainer within sandbox \"564936535e353ff643864d3414f9936851d3088f2b11c01dac01bb7e838e4525\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"82a679beba372a44d5940cce62414106e12d476304ab09f74dfd29de7c7d608d\"" Mar 10 01:58:55.606204 containerd[1555]: time="2026-03-10T01:58:55.606126881Z" level=info msg="StartContainer for \"82a679beba372a44d5940cce62414106e12d476304ab09f74dfd29de7c7d608d\"" Mar 10 01:58:55.609064 containerd[1555]: time="2026-03-10T01:58:55.609022664Z" level=info msg="connecting to shim 82a679beba372a44d5940cce62414106e12d476304ab09f74dfd29de7c7d608d" address="unix:///run/containerd/s/fa2a2b8184b0f587b501165a4fcf3832aa7cfbbbe01b17a5fc0ee1c9998a2528" protocol=ttrpc version=3 Mar 10 01:58:55.609178 containerd[1555]: time="2026-03-10T01:58:55.609083086Z" level=info msg="CreateContainer within sandbox \"6265293fefd8ec4ede52e937f569c2634da75b6da0e5fe4d79c006d2f1de913b\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"d8677cf96fc993f059a6de7516d0bb68d2573036f338097d5e50f2ffb88bee1d\"" Mar 10 01:58:55.610984 containerd[1555]: time="2026-03-10T01:58:55.610789025Z" level=info msg="StartContainer for \"d8677cf96fc993f059a6de7516d0bb68d2573036f338097d5e50f2ffb88bee1d\"" Mar 10 01:58:55.612645 containerd[1555]: time="2026-03-10T01:58:55.612372012Z" level=info msg="connecting to shim d8677cf96fc993f059a6de7516d0bb68d2573036f338097d5e50f2ffb88bee1d" address="unix:///run/containerd/s/43a09ad5e60ab68ad3b3572b261b56851e385c5e545441f6e8ede32eaacddaee" protocol=ttrpc version=3 Mar 10 01:58:55.660587 systemd[1]: Started cri-containerd-2a1199f80d2215f98eef4b67fa3141b339186fb05a99f71c090340200953b2b7.scope - libcontainer container 
2a1199f80d2215f98eef4b67fa3141b339186fb05a99f71c090340200953b2b7. Mar 10 01:58:55.690131 systemd[1]: Started cri-containerd-82a679beba372a44d5940cce62414106e12d476304ab09f74dfd29de7c7d608d.scope - libcontainer container 82a679beba372a44d5940cce62414106e12d476304ab09f74dfd29de7c7d608d. Mar 10 01:58:55.690848 kubelet[2444]: I0310 01:58:55.690764 2444 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Mar 10 01:58:55.692692 kubelet[2444]: E0310 01:58:55.692401 2444 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.0.74:6443/api/v1/nodes\": dial tcp 10.0.0.74:6443: connect: connection refused" node="localhost" Mar 10 01:58:55.693144 systemd[1]: Started cri-containerd-d8677cf96fc993f059a6de7516d0bb68d2573036f338097d5e50f2ffb88bee1d.scope - libcontainer container d8677cf96fc993f059a6de7516d0bb68d2573036f338097d5e50f2ffb88bee1d. Mar 10 01:58:55.903639 containerd[1555]: time="2026-03-10T01:58:55.903538691Z" level=info msg="StartContainer for \"2a1199f80d2215f98eef4b67fa3141b339186fb05a99f71c090340200953b2b7\" returns successfully" Mar 10 01:58:55.908100 kubelet[2444]: E0310 01:58:55.906974 2444 reflector.go:205] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://10.0.0.74:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.0.74:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Mar 10 01:58:55.925695 containerd[1555]: time="2026-03-10T01:58:55.924079690Z" level=info msg="StartContainer for \"d8677cf96fc993f059a6de7516d0bb68d2573036f338097d5e50f2ffb88bee1d\" returns successfully" Mar 10 01:58:55.968599 containerd[1555]: time="2026-03-10T01:58:55.966590837Z" level=info msg="StartContainer for \"82a679beba372a44d5940cce62414106e12d476304ab09f74dfd29de7c7d608d\" returns successfully" Mar 10 01:58:56.722306 kubelet[2444]: E0310 01:58:56.721445 2444 kubelet.go:3216] "No need 
to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Mar 10 01:58:56.759232 kubelet[2444]: E0310 01:58:56.757714 2444 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Mar 10 01:58:56.826646 kubelet[2444]: E0310 01:58:56.826256 2444 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Mar 10 01:58:57.809717 kubelet[2444]: E0310 01:58:57.808216 2444 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Mar 10 01:58:57.809717 kubelet[2444]: E0310 01:58:57.808908 2444 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Mar 10 01:58:57.809717 kubelet[2444]: E0310 01:58:57.809301 2444 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Mar 10 01:58:58.847813 kubelet[2444]: E0310 01:58:58.844201 2444 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Mar 10 01:58:58.847813 kubelet[2444]: E0310 01:58:58.846652 2444 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Mar 10 01:58:58.913415 kubelet[2444]: I0310 01:58:58.913043 2444 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Mar 10 01:58:59.833326 kubelet[2444]: E0310 01:58:59.831685 2444 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Mar 10 
01:59:02.489902 kubelet[2444]: E0310 01:59:02.489611 2444 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"localhost\" not found" Mar 10 01:59:03.243333 kubelet[2444]: E0310 01:59:03.242666 2444 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Mar 10 01:59:03.415612 kubelet[2444]: E0310 01:59:03.415001 2444 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Mar 10 01:59:04.570724 kubelet[2444]: E0310 01:59:04.570177 2444 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"localhost\" not found" node="localhost" Mar 10 01:59:04.820095 kubelet[2444]: E0310 01:59:04.816691 2444 event.go:359] "Server rejected event (will not retry!)" err="namespaces \"default\" not found" event="&Event{ObjectMeta:{localhost.189b58451ecd6ef3 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:localhost,UID:localhost,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:localhost,},FirstTimestamp:2026-03-10 01:58:52.193189619 +0000 UTC m=+3.706851930,LastTimestamp:2026-03-10 01:58:52.193189619 +0000 UTC m=+3.706851930,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:localhost,}" Mar 10 01:59:05.329183 kubelet[2444]: E0310 01:59:05.328933 2444 event.go:359] "Server rejected event (will not retry!)" err="namespaces \"default\" not found" event="&Event{ObjectMeta:{localhost.189b584521118f3b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:localhost,UID:localhost,APIVersion:,ResourceVersion:,FieldPath:,},Reason:InvalidDiskCapacity,Message:invalid capacity 0 on image filesystem,Source:EventSource{Component:kubelet,Host:localhost,},FirstTimestamp:2026-03-10 01:58:52.231208763 +0000 UTC m=+3.744871074,LastTimestamp:2026-03-10 01:58:52.231208763 +0000 UTC m=+3.744871074,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:localhost,}" Mar 10 01:59:05.333596 kubelet[2444]: I0310 01:59:05.333268 2444 kubelet_node_status.go:78] "Successfully registered node" node="localhost" Mar 10 01:59:05.333596 kubelet[2444]: E0310 01:59:05.333349 2444 kubelet_node_status.go:486] "Error updating node status, will retry" err="error getting node \"localhost\": node \"localhost\" not found" Mar 10 01:59:05.335166 kubelet[2444]: I0310 01:59:05.335144 2444 apiserver.go:52] "Watching apiserver" Mar 10 01:59:05.418823 kubelet[2444]: I0310 01:59:05.418742 2444 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 10 01:59:05.420366 kubelet[2444]: I0310 01:59:05.419984 2444 kubelet.go:3220] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost" Mar 10 01:59:05.492286 kubelet[2444]: E0310 01:59:05.491830 2444 kubelet.go:3222] "Failed creating a mirror pod" err="pods \"kube-apiserver-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-localhost" Mar 10 01:59:05.492286 kubelet[2444]: I0310 01:59:05.491870 2444 kubelet.go:3220] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-localhost" Mar 10 01:59:05.497963 kubelet[2444]: E0310 01:59:05.497920 2444 kubelet.go:3222] "Failed creating a mirror pod" err="pods \"kube-controller-manager-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" 
pod="kube-system/kube-controller-manager-localhost" Mar 10 01:59:05.497963 kubelet[2444]: I0310 01:59:05.497952 2444 kubelet.go:3220] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost" Mar 10 01:59:05.503384 kubelet[2444]: E0310 01:59:05.503354 2444 kubelet.go:3222] "Failed creating a mirror pod" err="pods \"kube-scheduler-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-localhost" Mar 10 01:59:07.210008 kubelet[2444]: I0310 01:59:07.209689 2444 kubelet.go:3220] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost" Mar 10 01:59:09.754766 kubelet[2444]: E0310 01:59:09.739417 2444 kubelet.go:2618] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="1.359s" Mar 10 01:59:12.913413 kubelet[2444]: I0310 01:59:12.913141 2444 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-localhost" podStartSLOduration=5.913065666 podStartE2EDuration="5.913065666s" podCreationTimestamp="2026-03-10 01:59:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 01:59:12.912405829 +0000 UTC m=+24.426068159" watchObservedRunningTime="2026-03-10 01:59:12.913065666 +0000 UTC m=+24.426727977" Mar 10 01:59:13.398408 kubelet[2444]: I0310 01:59:13.397619 2444 kubelet.go:3220] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-localhost" Mar 10 01:59:21.148068 systemd[1]: Reload requested from client PID 2732 ('systemctl') (unit session-9.scope)... Mar 10 01:59:21.149789 systemd[1]: Reloading... Mar 10 01:59:21.761565 zram_generator::config[2778]: No configuration found. 
Mar 10 01:59:22.431882 kubelet[2444]: I0310 01:59:22.431393 2444 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-localhost" podStartSLOduration=9.431374264 podStartE2EDuration="9.431374264s" podCreationTimestamp="2026-03-10 01:59:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 01:59:22.431189886 +0000 UTC m=+33.944852198" watchObservedRunningTime="2026-03-10 01:59:22.431374264 +0000 UTC m=+33.945036595" Mar 10 01:59:22.589135 systemd[1]: Reloading finished in 1438 ms. Mar 10 01:59:22.707539 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Mar 10 01:59:22.735953 systemd[1]: kubelet.service: Deactivated successfully. Mar 10 01:59:22.736521 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Mar 10 01:59:22.738192 systemd[1]: kubelet.service: Consumed 12.102s CPU time, 129.5M memory peak. Mar 10 01:59:22.757703 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 10 01:59:23.524882 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 10 01:59:23.548710 (kubelet)[2820]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Mar 10 01:59:23.877615 kubelet[2820]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Mar 10 01:59:23.878822 kubelet[2820]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 10 01:59:23.879028 kubelet[2820]: I0310 01:59:23.878980 2820 server.go:213] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 10 01:59:23.914413 kubelet[2820]: I0310 01:59:23.913795 2820 server.go:529] "Kubelet version" kubeletVersion="v1.34.4" Mar 10 01:59:23.914413 kubelet[2820]: I0310 01:59:23.913827 2820 server.go:531] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 10 01:59:23.914673 kubelet[2820]: I0310 01:59:23.914600 2820 watchdog_linux.go:95] "Systemd watchdog is not enabled" Mar 10 01:59:23.914673 kubelet[2820]: I0310 01:59:23.914636 2820 watchdog_linux.go:137] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Mar 10 01:59:23.916549 kubelet[2820]: I0310 01:59:23.915708 2820 server.go:956] "Client rotation is on, will bootstrap in background" Mar 10 01:59:23.920718 kubelet[2820]: I0310 01:59:23.920565 2820 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-client-current.pem" Mar 10 01:59:23.928530 kubelet[2820]: I0310 01:59:23.927833 2820 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Mar 10 01:59:23.951607 kubelet[2820]: I0310 01:59:23.950099 2820 server.go:1423] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Mar 10 01:59:23.964277 kubelet[2820]: I0310 01:59:23.963566 2820 server.go:781] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
Defaulting to /" Mar 10 01:59:23.964277 kubelet[2820]: I0310 01:59:23.963878 2820 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 10 01:59:23.964277 kubelet[2820]: I0310 01:59:23.963919 2820 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 10 01:59:23.964277 kubelet[2820]: I0310 01:59:23.964146 2820 topology_manager.go:138] "Creating topology manager with none policy" Mar 10 01:59:23.964748 
kubelet[2820]: I0310 01:59:23.964161 2820 container_manager_linux.go:306] "Creating device plugin manager" Mar 10 01:59:23.969320 kubelet[2820]: I0310 01:59:23.964905 2820 container_manager_linux.go:315] "Creating Dynamic Resource Allocation (DRA) manager" Mar 10 01:59:23.969320 kubelet[2820]: I0310 01:59:23.966830 2820 state_mem.go:36] "Initialized new in-memory state store" Mar 10 01:59:23.969320 kubelet[2820]: I0310 01:59:23.968232 2820 kubelet.go:475] "Attempting to sync node with API server" Mar 10 01:59:23.969320 kubelet[2820]: I0310 01:59:23.968255 2820 kubelet.go:376] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 10 01:59:23.969320 kubelet[2820]: I0310 01:59:23.968288 2820 kubelet.go:387] "Adding apiserver pod source" Mar 10 01:59:23.969320 kubelet[2820]: I0310 01:59:23.968397 2820 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 10 01:59:23.985525 kubelet[2820]: I0310 01:59:23.984414 2820 kuberuntime_manager.go:291] "Container runtime initialized" containerRuntime="containerd" version="v2.0.7" apiVersion="v1" Mar 10 01:59:23.985525 kubelet[2820]: I0310 01:59:23.985145 2820 kubelet.go:940] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Mar 10 01:59:23.985525 kubelet[2820]: I0310 01:59:23.985220 2820 kubelet.go:964] "Not starting PodCertificateRequest manager because we are in static kubelet mode or the PodCertificateProjection feature gate is disabled" Mar 10 01:59:24.002073 kubelet[2820]: I0310 01:59:24.001602 2820 server.go:1262] "Started kubelet" Mar 10 01:59:24.004523 kubelet[2820]: I0310 01:59:24.003271 2820 ratelimit.go:56] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 10 01:59:24.008531 kubelet[2820]: I0310 01:59:24.007861 2820 server_v1.go:49] "podresources" method="list" useActivePods=true Mar 10 01:59:24.008531 kubelet[2820]: I0310 01:59:24.008328 2820 server.go:249] 
"Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 10 01:59:24.008814 kubelet[2820]: I0310 01:59:24.008786 2820 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Mar 10 01:59:24.124224 kubelet[2820]: I0310 01:59:24.123971 2820 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Mar 10 01:59:24.126989 kubelet[2820]: I0310 01:59:24.126403 2820 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 10 01:59:24.132646 kubelet[2820]: I0310 01:59:24.128338 2820 volume_manager.go:313] "Starting Kubelet Volume Manager" Mar 10 01:59:24.132646 kubelet[2820]: I0310 01:59:24.130150 2820 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Mar 10 01:59:24.136687 kubelet[2820]: I0310 01:59:24.135940 2820 reconciler.go:29] "Reconciler: start to sync state" Mar 10 01:59:24.144584 kubelet[2820]: I0310 01:59:24.144517 2820 factory.go:223] Registration of the systemd container factory successfully Mar 10 01:59:24.144820 kubelet[2820]: I0310 01:59:24.144750 2820 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Mar 10 01:59:24.145028 kubelet[2820]: I0310 01:59:24.144907 2820 server.go:310] "Adding debug handlers to kubelet server" Mar 10 01:59:24.148161 kubelet[2820]: E0310 01:59:24.148137 2820 kubelet.go:1615] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Mar 10 01:59:24.157393 kubelet[2820]: I0310 01:59:24.155592 2820 factory.go:223] Registration of the containerd container factory successfully Mar 10 01:59:24.187062 kubelet[2820]: I0310 01:59:24.186831 2820 kubelet_network_linux.go:54] "Initialized iptables rules." 
protocol="IPv6" Mar 10 01:59:24.220836 kubelet[2820]: I0310 01:59:24.220774 2820 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv4" Mar 10 01:59:24.220836 kubelet[2820]: I0310 01:59:24.220832 2820 status_manager.go:244] "Starting to sync pod status with apiserver" Mar 10 01:59:24.221017 kubelet[2820]: I0310 01:59:24.220862 2820 kubelet.go:2428] "Starting kubelet main sync loop" Mar 10 01:59:24.221017 kubelet[2820]: E0310 01:59:24.220907 2820 kubelet.go:2452] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 10 01:59:24.279815 kubelet[2820]: I0310 01:59:24.279776 2820 cpu_manager.go:221] "Starting CPU manager" policy="none" Mar 10 01:59:24.280240 kubelet[2820]: I0310 01:59:24.280123 2820 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Mar 10 01:59:24.280391 kubelet[2820]: I0310 01:59:24.280373 2820 state_mem.go:36] "Initialized new in-memory state store" Mar 10 01:59:24.281158 kubelet[2820]: I0310 01:59:24.281006 2820 state_mem.go:88] "Updated default CPUSet" cpuSet="" Mar 10 01:59:24.281158 kubelet[2820]: I0310 01:59:24.281060 2820 state_mem.go:96] "Updated CPUSet assignments" assignments={} Mar 10 01:59:24.281158 kubelet[2820]: I0310 01:59:24.281089 2820 policy_none.go:49] "None policy: Start" Mar 10 01:59:24.281158 kubelet[2820]: I0310 01:59:24.281102 2820 memory_manager.go:187] "Starting memorymanager" policy="None" Mar 10 01:59:24.281158 kubelet[2820]: I0310 01:59:24.281117 2820 state_mem.go:36] "Initializing new in-memory state store" logger="Memory Manager state checkpoint" Mar 10 01:59:24.281741 kubelet[2820]: I0310 01:59:24.281288 2820 state_mem.go:77] "Updated machine memory state" logger="Memory Manager state checkpoint" Mar 10 01:59:24.281741 kubelet[2820]: I0310 01:59:24.281303 2820 policy_none.go:47] "Start" Mar 10 01:59:24.296405 kubelet[2820]: E0310 01:59:24.294589 2820 manager.go:513] "Failed to read data from 
checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Mar 10 01:59:24.296405 kubelet[2820]: I0310 01:59:24.294844 2820 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 10 01:59:24.296405 kubelet[2820]: I0310 01:59:24.294883 2820 container_log_manager.go:146] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 10 01:59:24.296405 kubelet[2820]: I0310 01:59:24.295647 2820 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 10 01:59:24.300018 kubelet[2820]: E0310 01:59:24.299736 2820 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Mar 10 01:59:24.321935 kubelet[2820]: I0310 01:59:24.321812 2820 kubelet.go:3220] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost" Mar 10 01:59:24.322427 kubelet[2820]: I0310 01:59:24.322290 2820 kubelet.go:3220] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost" Mar 10 01:59:24.323975 kubelet[2820]: I0310 01:59:24.323885 2820 kubelet.go:3220] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-localhost" Mar 10 01:59:24.340656 kubelet[2820]: I0310 01:59:24.340282 2820 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/08e735c55be756ff8bb8189e244e30b0-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"08e735c55be756ff8bb8189e244e30b0\") " pod="kube-system/kube-apiserver-localhost" Mar 10 01:59:24.342523 kubelet[2820]: I0310 01:59:24.342404 2820 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/db0989cdb653dfec284dd4f35625e9e7-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"db0989cdb653dfec284dd4f35625e9e7\") 
" pod="kube-system/kube-controller-manager-localhost" Mar 10 01:59:24.342783 kubelet[2820]: I0310 01:59:24.342676 2820 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/db0989cdb653dfec284dd4f35625e9e7-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"db0989cdb653dfec284dd4f35625e9e7\") " pod="kube-system/kube-controller-manager-localhost" Mar 10 01:59:24.343080 kubelet[2820]: I0310 01:59:24.342926 2820 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/89efda49e166906783d8d868d41ebb86-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"89efda49e166906783d8d868d41ebb86\") " pod="kube-system/kube-scheduler-localhost" Mar 10 01:59:24.343439 kubelet[2820]: I0310 01:59:24.343405 2820 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/08e735c55be756ff8bb8189e244e30b0-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"08e735c55be756ff8bb8189e244e30b0\") " pod="kube-system/kube-apiserver-localhost" Mar 10 01:59:24.343608 kubelet[2820]: I0310 01:59:24.343592 2820 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/db0989cdb653dfec284dd4f35625e9e7-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"db0989cdb653dfec284dd4f35625e9e7\") " pod="kube-system/kube-controller-manager-localhost" Mar 10 01:59:24.343847 kubelet[2820]: I0310 01:59:24.343785 2820 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/db0989cdb653dfec284dd4f35625e9e7-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"db0989cdb653dfec284dd4f35625e9e7\") " 
pod="kube-system/kube-controller-manager-localhost" Mar 10 01:59:24.343985 kubelet[2820]: I0310 01:59:24.343969 2820 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/db0989cdb653dfec284dd4f35625e9e7-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"db0989cdb653dfec284dd4f35625e9e7\") " pod="kube-system/kube-controller-manager-localhost" Mar 10 01:59:24.344295 kubelet[2820]: I0310 01:59:24.344267 2820 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/08e735c55be756ff8bb8189e244e30b0-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"08e735c55be756ff8bb8189e244e30b0\") " pod="kube-system/kube-apiserver-localhost" Mar 10 01:59:24.347962 kubelet[2820]: E0310 01:59:24.347819 2820 kubelet.go:3222] "Failed creating a mirror pod" err="pods \"kube-scheduler-localhost\" already exists" pod="kube-system/kube-scheduler-localhost" Mar 10 01:59:24.353317 kubelet[2820]: E0310 01:59:24.352986 2820 kubelet.go:3222] "Failed creating a mirror pod" err="pods \"kube-controller-manager-localhost\" already exists" pod="kube-system/kube-controller-manager-localhost" Mar 10 01:59:24.532113 kubelet[2820]: I0310 01:59:24.529059 2820 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Mar 10 01:59:24.638540 kubelet[2820]: I0310 01:59:24.638091 2820 kubelet_node_status.go:124] "Node was previously registered" node="localhost" Mar 10 01:59:24.639586 kubelet[2820]: I0310 01:59:24.639565 2820 kubelet_node_status.go:78] "Successfully registered node" node="localhost" Mar 10 01:59:25.058534 kubelet[2820]: I0310 01:59:25.056740 2820 apiserver.go:52] "Watching apiserver" Mar 10 01:59:25.134172 kubelet[2820]: I0310 01:59:25.132569 2820 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 10 
01:59:25.255531 kubelet[2820]: I0310 01:59:25.251842 2820 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-localhost" podStartSLOduration=1.251800137 podStartE2EDuration="1.251800137s" podCreationTimestamp="2026-03-10 01:59:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 01:59:25.246543735 +0000 UTC m=+1.685157636" watchObservedRunningTime="2026-03-10 01:59:25.251800137 +0000 UTC m=+1.690414039" Mar 10 01:59:25.264961 kubelet[2820]: I0310 01:59:25.262980 2820 kubelet.go:3220] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost" Mar 10 01:59:25.764996 kubelet[2820]: E0310 01:59:25.764731 2820 kubelet.go:3222] "Failed creating a mirror pod" err="pods \"kube-apiserver-localhost\" already exists" pod="kube-system/kube-apiserver-localhost" Mar 10 01:59:26.436722 kubelet[2820]: I0310 01:59:26.436337 2820 kuberuntime_manager.go:1828] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Mar 10 01:59:26.438028 containerd[1555]: time="2026-03-10T01:59:26.437871027Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Mar 10 01:59:26.439099 kubelet[2820]: I0310 01:59:26.439048 2820 kubelet_network.go:47] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Mar 10 01:59:26.791551 systemd[1]: Created slice kubepods-besteffort-pod33bf4ad6_30bc_4aeb_80e5_7f4cc65b0001.slice - libcontainer container kubepods-besteffort-pod33bf4ad6_30bc_4aeb_80e5_7f4cc65b0001.slice. 
Mar 10 01:59:27.100065 kubelet[2820]: I0310 01:59:27.096871 2820 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/33bf4ad6-30bc-4aeb-80e5-7f4cc65b0001-xtables-lock\") pod \"kube-proxy-b2gtl\" (UID: \"33bf4ad6-30bc-4aeb-80e5-7f4cc65b0001\") " pod="kube-system/kube-proxy-b2gtl" Mar 10 01:59:27.101512 kubelet[2820]: I0310 01:59:27.100877 2820 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2bfw\" (UniqueName: \"kubernetes.io/projected/33bf4ad6-30bc-4aeb-80e5-7f4cc65b0001-kube-api-access-f2bfw\") pod \"kube-proxy-b2gtl\" (UID: \"33bf4ad6-30bc-4aeb-80e5-7f4cc65b0001\") " pod="kube-system/kube-proxy-b2gtl" Mar 10 01:59:27.105941 kubelet[2820]: I0310 01:59:27.101803 2820 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/33bf4ad6-30bc-4aeb-80e5-7f4cc65b0001-lib-modules\") pod \"kube-proxy-b2gtl\" (UID: \"33bf4ad6-30bc-4aeb-80e5-7f4cc65b0001\") " pod="kube-system/kube-proxy-b2gtl" Mar 10 01:59:27.105941 kubelet[2820]: I0310 01:59:27.101856 2820 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/33bf4ad6-30bc-4aeb-80e5-7f4cc65b0001-kube-proxy\") pod \"kube-proxy-b2gtl\" (UID: \"33bf4ad6-30bc-4aeb-80e5-7f4cc65b0001\") " pod="kube-system/kube-proxy-b2gtl" Mar 10 01:59:28.026599 containerd[1555]: time="2026-03-10T01:59:28.021845848Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-b2gtl,Uid:33bf4ad6-30bc-4aeb-80e5-7f4cc65b0001,Namespace:kube-system,Attempt:0,}" Mar 10 01:59:28.431954 containerd[1555]: time="2026-03-10T01:59:28.424186570Z" level=info msg="connecting to shim b5351794f7922dad00c59a201d42fb99c9cc7c588d1e54490f74f20233332f0d" 
address="unix:///run/containerd/s/2b1a5fe98b0c1f7c0c22dd074878b4cae45d9a1606900b7d16c85a9a4b22712a" namespace=k8s.io protocol=ttrpc version=3 Mar 10 01:59:29.518835 systemd[1]: Started cri-containerd-b5351794f7922dad00c59a201d42fb99c9cc7c588d1e54490f74f20233332f0d.scope - libcontainer container b5351794f7922dad00c59a201d42fb99c9cc7c588d1e54490f74f20233332f0d. Mar 10 01:59:30.196292 containerd[1555]: time="2026-03-10T01:59:30.195361088Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-b2gtl,Uid:33bf4ad6-30bc-4aeb-80e5-7f4cc65b0001,Namespace:kube-system,Attempt:0,} returns sandbox id \"b5351794f7922dad00c59a201d42fb99c9cc7c588d1e54490f74f20233332f0d\"" Mar 10 01:59:30.357226 containerd[1555]: time="2026-03-10T01:59:30.357121992Z" level=info msg="CreateContainer within sandbox \"b5351794f7922dad00c59a201d42fb99c9cc7c588d1e54490f74f20233332f0d\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Mar 10 01:59:30.838661 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2300205791.mount: Deactivated successfully. 
Mar 10 01:59:30.903483 containerd[1555]: time="2026-03-10T01:59:30.900921360Z" level=info msg="Container b9bf8657e4302dca28021ef5dcf078a54db24ffe8555590cab115730cc07f763: CDI devices from CRI Config.CDIDevices: []" Mar 10 01:59:31.115133 containerd[1555]: time="2026-03-10T01:59:31.111348413Z" level=info msg="CreateContainer within sandbox \"b5351794f7922dad00c59a201d42fb99c9cc7c588d1e54490f74f20233332f0d\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"b9bf8657e4302dca28021ef5dcf078a54db24ffe8555590cab115730cc07f763\"" Mar 10 01:59:31.115133 containerd[1555]: time="2026-03-10T01:59:31.113189881Z" level=info msg="StartContainer for \"b9bf8657e4302dca28021ef5dcf078a54db24ffe8555590cab115730cc07f763\"" Mar 10 01:59:31.133484 containerd[1555]: time="2026-03-10T01:59:31.130988157Z" level=info msg="connecting to shim b9bf8657e4302dca28021ef5dcf078a54db24ffe8555590cab115730cc07f763" address="unix:///run/containerd/s/2b1a5fe98b0c1f7c0c22dd074878b4cae45d9a1606900b7d16c85a9a4b22712a" protocol=ttrpc version=3 Mar 10 01:59:31.343290 systemd[1]: Started cri-containerd-b9bf8657e4302dca28021ef5dcf078a54db24ffe8555590cab115730cc07f763.scope - libcontainer container b9bf8657e4302dca28021ef5dcf078a54db24ffe8555590cab115730cc07f763. Mar 10 01:59:31.783577 systemd[1]: Created slice kubepods-besteffort-poda584eb4f_a1f1_4dae_9783_dc9ed628de81.slice - libcontainer container kubepods-besteffort-poda584eb4f_a1f1_4dae_9783_dc9ed628de81.slice. 
Mar 10 01:59:31.843966 kubelet[2820]: I0310 01:59:31.840434 2820 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/a584eb4f-a1f1-4dae-9783-dc9ed628de81-var-lib-calico\") pod \"tigera-operator-5588576f44-v2b89\" (UID: \"a584eb4f-a1f1-4dae-9783-dc9ed628de81\") " pod="tigera-operator/tigera-operator-5588576f44-v2b89" Mar 10 01:59:31.845280 kubelet[2820]: I0310 01:59:31.844844 2820 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxgrt\" (UniqueName: \"kubernetes.io/projected/a584eb4f-a1f1-4dae-9783-dc9ed628de81-kube-api-access-mxgrt\") pod \"tigera-operator-5588576f44-v2b89\" (UID: \"a584eb4f-a1f1-4dae-9783-dc9ed628de81\") " pod="tigera-operator/tigera-operator-5588576f44-v2b89" Mar 10 01:59:32.278612 containerd[1555]: time="2026-03-10T01:59:32.278376373Z" level=info msg="StartContainer for \"b9bf8657e4302dca28021ef5dcf078a54db24ffe8555590cab115730cc07f763\" returns successfully" Mar 10 01:59:32.416544 containerd[1555]: time="2026-03-10T01:59:32.413932434Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-5588576f44-v2b89,Uid:a584eb4f-a1f1-4dae-9783-dc9ed628de81,Namespace:tigera-operator,Attempt:0,}" Mar 10 01:59:32.793623 containerd[1555]: time="2026-03-10T01:59:32.792184576Z" level=info msg="connecting to shim 19f257c85cd2e21b18347a0434c9360cc1edbecba0fa88ebb5cf7a0ac3e4352a" address="unix:///run/containerd/s/11cd17bd80179e8f4e99f842c3ddcdb9b322bb659ed9493cafcbd490dc9cba33" namespace=k8s.io protocol=ttrpc version=3 Mar 10 01:59:32.896101 systemd[1]: Started cri-containerd-19f257c85cd2e21b18347a0434c9360cc1edbecba0fa88ebb5cf7a0ac3e4352a.scope - libcontainer container 19f257c85cd2e21b18347a0434c9360cc1edbecba0fa88ebb5cf7a0ac3e4352a. 
Mar 10 01:59:33.449694 containerd[1555]: time="2026-03-10T01:59:33.448205249Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-5588576f44-v2b89,Uid:a584eb4f-a1f1-4dae-9783-dc9ed628de81,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"19f257c85cd2e21b18347a0434c9360cc1edbecba0fa88ebb5cf7a0ac3e4352a\"" Mar 10 01:59:33.456406 containerd[1555]: time="2026-03-10T01:59:33.456092062Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.40.7\"" Mar 10 01:59:33.977116 kubelet[2820]: I0310 01:59:33.976632 2820 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-b2gtl" podStartSLOduration=7.976407052 podStartE2EDuration="7.976407052s" podCreationTimestamp="2026-03-10 01:59:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 01:59:32.94495916 +0000 UTC m=+9.383573060" watchObservedRunningTime="2026-03-10 01:59:33.976407052 +0000 UTC m=+10.415020973" Mar 10 01:59:35.431173 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1356073521.mount: Deactivated successfully. 
Mar 10 01:59:49.213599 containerd[1555]: time="2026-03-10T01:59:49.210828514Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.40.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 10 01:59:49.213599 containerd[1555]: time="2026-03-10T01:59:49.211704054Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.40.7: active requests=0, bytes read=40846156" Mar 10 01:59:49.218712 containerd[1555]: time="2026-03-10T01:59:49.214380347Z" level=info msg="ImageCreate event name:\"sha256:de04da31b5feb10fd313c39b7ac72d47ce9b5b8eb06161142e2e2283059a52c2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 10 01:59:49.225936 containerd[1555]: time="2026-03-10T01:59:49.224305428Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:53260704fc6e638633b243729411222e01e1898647352a6e1a09cc046887973a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 10 01:59:49.225936 containerd[1555]: time="2026-03-10T01:59:49.224921239Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.40.7\" with image id \"sha256:de04da31b5feb10fd313c39b7ac72d47ce9b5b8eb06161142e2e2283059a52c2\", repo tag \"quay.io/tigera/operator:v1.40.7\", repo digest \"quay.io/tigera/operator@sha256:53260704fc6e638633b243729411222e01e1898647352a6e1a09cc046887973a\", size \"40842151\" in 15.768571037s" Mar 10 01:59:49.225936 containerd[1555]: time="2026-03-10T01:59:49.224954661Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.40.7\" returns image reference \"sha256:de04da31b5feb10fd313c39b7ac72d47ce9b5b8eb06161142e2e2283059a52c2\"" Mar 10 01:59:49.247727 containerd[1555]: time="2026-03-10T01:59:49.246779105Z" level=info msg="CreateContainer within sandbox \"19f257c85cd2e21b18347a0434c9360cc1edbecba0fa88ebb5cf7a0ac3e4352a\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Mar 10 01:59:49.307121 containerd[1555]: time="2026-03-10T01:59:49.304747667Z" level=info msg="Container 
85e8362e42561029c9609bad251c667d40a02182eeabc5fbe736273649598a58: CDI devices from CRI Config.CDIDevices: []" Mar 10 01:59:49.332712 containerd[1555]: time="2026-03-10T01:59:49.332654648Z" level=info msg="CreateContainer within sandbox \"19f257c85cd2e21b18347a0434c9360cc1edbecba0fa88ebb5cf7a0ac3e4352a\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"85e8362e42561029c9609bad251c667d40a02182eeabc5fbe736273649598a58\"" Mar 10 01:59:49.338265 containerd[1555]: time="2026-03-10T01:59:49.333600410Z" level=info msg="StartContainer for \"85e8362e42561029c9609bad251c667d40a02182eeabc5fbe736273649598a58\"" Mar 10 01:59:49.338265 containerd[1555]: time="2026-03-10T01:59:49.334922881Z" level=info msg="connecting to shim 85e8362e42561029c9609bad251c667d40a02182eeabc5fbe736273649598a58" address="unix:///run/containerd/s/11cd17bd80179e8f4e99f842c3ddcdb9b322bb659ed9493cafcbd490dc9cba33" protocol=ttrpc version=3 Mar 10 01:59:49.617343 systemd[1]: Started cri-containerd-85e8362e42561029c9609bad251c667d40a02182eeabc5fbe736273649598a58.scope - libcontainer container 85e8362e42561029c9609bad251c667d40a02182eeabc5fbe736273649598a58. 
Mar 10 01:59:49.931137 containerd[1555]: time="2026-03-10T01:59:49.929316950Z" level=info msg="StartContainer for \"85e8362e42561029c9609bad251c667d40a02182eeabc5fbe736273649598a58\" returns successfully" Mar 10 01:59:50.406329 kubelet[2820]: I0310 01:59:50.404919 2820 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-5588576f44-v2b89" podStartSLOduration=3.627958822 podStartE2EDuration="19.403076783s" podCreationTimestamp="2026-03-10 01:59:31 +0000 UTC" firstStartedPulling="2026-03-10 01:59:33.453250891 +0000 UTC m=+9.891864793" lastFinishedPulling="2026-03-10 01:59:49.228368843 +0000 UTC m=+25.666982754" observedRunningTime="2026-03-10 01:59:50.39876916 +0000 UTC m=+26.837383061" watchObservedRunningTime="2026-03-10 01:59:50.403076783 +0000 UTC m=+26.841690683" Mar 10 02:00:04.342010 sudo[1777]: pam_unix(sudo:session): session closed for user root Mar 10 02:00:04.349663 sshd[1776]: Connection closed by 10.0.0.1 port 39652 Mar 10 02:00:04.351360 sshd-session[1773]: pam_unix(sshd:session): session closed for user core Mar 10 02:00:04.363942 systemd[1]: sshd@8-10.0.0.74:22-10.0.0.1:39652.service: Deactivated successfully. Mar 10 02:00:04.378899 systemd[1]: session-9.scope: Deactivated successfully. Mar 10 02:00:04.379400 systemd[1]: session-9.scope: Consumed 14.499s CPU time, 235.7M memory peak. Mar 10 02:00:04.394511 systemd-logind[1537]: Session 9 logged out. Waiting for processes to exit. Mar 10 02:00:04.406195 systemd-logind[1537]: Removed session 9. 
Mar 10 02:00:16.922768 kubelet[2820]: I0310 02:00:16.922672 2820 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7k89\" (UniqueName: \"kubernetes.io/projected/36a6f105-a54d-4d22-b56b-b9af41f8a6c3-kube-api-access-w7k89\") pod \"calico-typha-86c7595d64-twpmx\" (UID: \"36a6f105-a54d-4d22-b56b-b9af41f8a6c3\") " pod="calico-system/calico-typha-86c7595d64-twpmx" Mar 10 02:00:16.922768 kubelet[2820]: I0310 02:00:16.922734 2820 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/36a6f105-a54d-4d22-b56b-b9af41f8a6c3-tigera-ca-bundle\") pod \"calico-typha-86c7595d64-twpmx\" (UID: \"36a6f105-a54d-4d22-b56b-b9af41f8a6c3\") " pod="calico-system/calico-typha-86c7595d64-twpmx" Mar 10 02:00:16.922768 kubelet[2820]: I0310 02:00:16.922762 2820 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/36a6f105-a54d-4d22-b56b-b9af41f8a6c3-typha-certs\") pod \"calico-typha-86c7595d64-twpmx\" (UID: \"36a6f105-a54d-4d22-b56b-b9af41f8a6c3\") " pod="calico-system/calico-typha-86c7595d64-twpmx" Mar 10 02:00:16.925153 systemd[1]: Created slice kubepods-besteffort-pod36a6f105_a54d_4d22_b56b_b9af41f8a6c3.slice - libcontainer container kubepods-besteffort-pod36a6f105_a54d_4d22_b56b_b9af41f8a6c3.slice. Mar 10 02:00:17.264306 containerd[1555]: time="2026-03-10T02:00:17.263636270Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-86c7595d64-twpmx,Uid:36a6f105-a54d-4d22-b56b-b9af41f8a6c3,Namespace:calico-system,Attempt:0,}" Mar 10 02:00:17.371611 systemd[1]: Created slice kubepods-besteffort-pod5e68d9a1_96e4_41fa_997f_ef2ba4f3ece8.slice - libcontainer container kubepods-besteffort-pod5e68d9a1_96e4_41fa_997f_ef2ba4f3ece8.slice. 
Mar 10 02:00:17.428357 kubelet[2820]: I0310 02:00:17.427836 2820 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bpffs\" (UniqueName: \"kubernetes.io/host-path/5e68d9a1-96e4-41fa-997f-ef2ba4f3ece8-bpffs\") pod \"calico-node-7kz5z\" (UID: \"5e68d9a1-96e4-41fa-997f-ef2ba4f3ece8\") " pod="calico-system/calico-node-7kz5z" Mar 10 02:00:17.429814 kubelet[2820]: I0310 02:00:17.429670 2820 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5e68d9a1-96e4-41fa-997f-ef2ba4f3ece8-lib-modules\") pod \"calico-node-7kz5z\" (UID: \"5e68d9a1-96e4-41fa-997f-ef2ba4f3ece8\") " pod="calico-system/calico-node-7kz5z" Mar 10 02:00:17.432521 kubelet[2820]: I0310 02:00:17.432074 2820 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/5e68d9a1-96e4-41fa-997f-ef2ba4f3ece8-cni-bin-dir\") pod \"calico-node-7kz5z\" (UID: \"5e68d9a1-96e4-41fa-997f-ef2ba4f3ece8\") " pod="calico-system/calico-node-7kz5z" Mar 10 02:00:17.432521 kubelet[2820]: I0310 02:00:17.432169 2820 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/5e68d9a1-96e4-41fa-997f-ef2ba4f3ece8-cni-net-dir\") pod \"calico-node-7kz5z\" (UID: \"5e68d9a1-96e4-41fa-997f-ef2ba4f3ece8\") " pod="calico-system/calico-node-7kz5z" Mar 10 02:00:17.432521 kubelet[2820]: I0310 02:00:17.432192 2820 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/5e68d9a1-96e4-41fa-997f-ef2ba4f3ece8-node-certs\") pod \"calico-node-7kz5z\" (UID: \"5e68d9a1-96e4-41fa-997f-ef2ba4f3ece8\") " pod="calico-system/calico-node-7kz5z" Mar 10 02:00:17.432521 kubelet[2820]: I0310 02:00:17.432214 2820 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/5e68d9a1-96e4-41fa-997f-ef2ba4f3ece8-sys-fs\") pod \"calico-node-7kz5z\" (UID: \"5e68d9a1-96e4-41fa-997f-ef2ba4f3ece8\") " pod="calico-system/calico-node-7kz5z" Mar 10 02:00:17.432521 kubelet[2820]: I0310 02:00:17.432232 2820 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/5e68d9a1-96e4-41fa-997f-ef2ba4f3ece8-var-lib-calico\") pod \"calico-node-7kz5z\" (UID: \"5e68d9a1-96e4-41fa-997f-ef2ba4f3ece8\") " pod="calico-system/calico-node-7kz5z" Mar 10 02:00:17.434040 kubelet[2820]: I0310 02:00:17.433736 2820 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nodeproc\" (UniqueName: \"kubernetes.io/host-path/5e68d9a1-96e4-41fa-997f-ef2ba4f3ece8-nodeproc\") pod \"calico-node-7kz5z\" (UID: \"5e68d9a1-96e4-41fa-997f-ef2ba4f3ece8\") " pod="calico-system/calico-node-7kz5z" Mar 10 02:00:17.434040 kubelet[2820]: I0310 02:00:17.433772 2820 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/5e68d9a1-96e4-41fa-997f-ef2ba4f3ece8-policysync\") pod \"calico-node-7kz5z\" (UID: \"5e68d9a1-96e4-41fa-997f-ef2ba4f3ece8\") " pod="calico-system/calico-node-7kz5z" Mar 10 02:00:17.434040 kubelet[2820]: I0310 02:00:17.433794 2820 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/5e68d9a1-96e4-41fa-997f-ef2ba4f3ece8-var-run-calico\") pod \"calico-node-7kz5z\" (UID: \"5e68d9a1-96e4-41fa-997f-ef2ba4f3ece8\") " pod="calico-system/calico-node-7kz5z" Mar 10 02:00:17.434040 kubelet[2820]: I0310 02:00:17.433815 2820 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: 
\"kubernetes.io/host-path/5e68d9a1-96e4-41fa-997f-ef2ba4f3ece8-xtables-lock\") pod \"calico-node-7kz5z\" (UID: \"5e68d9a1-96e4-41fa-997f-ef2ba4f3ece8\") " pod="calico-system/calico-node-7kz5z" Mar 10 02:00:17.434040 kubelet[2820]: I0310 02:00:17.433837 2820 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmksw\" (UniqueName: \"kubernetes.io/projected/5e68d9a1-96e4-41fa-997f-ef2ba4f3ece8-kube-api-access-kmksw\") pod \"calico-node-7kz5z\" (UID: \"5e68d9a1-96e4-41fa-997f-ef2ba4f3ece8\") " pod="calico-system/calico-node-7kz5z" Mar 10 02:00:17.434310 kubelet[2820]: I0310 02:00:17.433864 2820 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/5e68d9a1-96e4-41fa-997f-ef2ba4f3ece8-cni-log-dir\") pod \"calico-node-7kz5z\" (UID: \"5e68d9a1-96e4-41fa-997f-ef2ba4f3ece8\") " pod="calico-system/calico-node-7kz5z" Mar 10 02:00:17.434310 kubelet[2820]: I0310 02:00:17.433886 2820 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/5e68d9a1-96e4-41fa-997f-ef2ba4f3ece8-flexvol-driver-host\") pod \"calico-node-7kz5z\" (UID: \"5e68d9a1-96e4-41fa-997f-ef2ba4f3ece8\") " pod="calico-system/calico-node-7kz5z" Mar 10 02:00:17.434310 kubelet[2820]: I0310 02:00:17.433957 2820 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5e68d9a1-96e4-41fa-997f-ef2ba4f3ece8-tigera-ca-bundle\") pod \"calico-node-7kz5z\" (UID: \"5e68d9a1-96e4-41fa-997f-ef2ba4f3ece8\") " pod="calico-system/calico-node-7kz5z" Mar 10 02:00:17.448422 containerd[1555]: time="2026-03-10T02:00:17.448190677Z" level=info msg="connecting to shim 7474e12bc0975db289b41c972036c596a721b395105355f873c63b1cd199806d" 
address="unix:///run/containerd/s/72abe40d8657ac6e5e1436ea28a23f2a9335c11d22ce4baf0e4c9c2e9b404997" namespace=k8s.io protocol=ttrpc version=3 Mar 10 02:00:17.667195 kubelet[2820]: E0310 02:00:17.667160 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 10 02:00:17.670161 kubelet[2820]: W0310 02:00:17.667285 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 10 02:00:17.670161 kubelet[2820]: E0310 02:00:17.667362 2820 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 10 02:00:17.679003 kubelet[2820]: E0310 02:00:17.678868 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 10 02:00:17.679003 kubelet[2820]: W0310 02:00:17.678921 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 10 02:00:17.679003 kubelet[2820]: E0310 02:00:17.678946 2820 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 10 02:00:17.679242 kubelet[2820]: E0310 02:00:17.679046 2820 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-9jkrd" podUID="17109937-e605-4060-a474-8701d718389b" Mar 10 02:00:17.747944 containerd[1555]: time="2026-03-10T02:00:17.746587870Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-7kz5z,Uid:5e68d9a1-96e4-41fa-997f-ef2ba4f3ece8,Namespace:calico-system,Attempt:0,}" Mar 10 02:00:17.763905 kubelet[2820]: E0310 02:00:17.763139 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 10 02:00:17.763905 kubelet[2820]: W0310 02:00:17.763171 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 10 02:00:17.763905 kubelet[2820]: E0310 02:00:17.763317 2820 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 10 02:00:17.769272 kubelet[2820]: E0310 02:00:17.766256 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 10 02:00:17.769272 kubelet[2820]: W0310 02:00:17.766276 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 10 02:00:17.769272 kubelet[2820]: E0310 02:00:17.766300 2820 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 10 02:00:17.769272 kubelet[2820]: E0310 02:00:17.767565 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 10 02:00:17.769272 kubelet[2820]: W0310 02:00:17.767578 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 10 02:00:17.769272 kubelet[2820]: E0310 02:00:17.767596 2820 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 10 02:00:17.769272 kubelet[2820]: E0310 02:00:17.768538 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 10 02:00:17.769272 kubelet[2820]: W0310 02:00:17.768550 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 10 02:00:17.769272 kubelet[2820]: E0310 02:00:17.768564 2820 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 10 02:00:17.772229 kubelet[2820]: E0310 02:00:17.770804 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 10 02:00:17.772229 kubelet[2820]: W0310 02:00:17.771779 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 10 02:00:17.776209 kubelet[2820]: E0310 02:00:17.772958 2820 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 10 02:00:17.776209 kubelet[2820]: E0310 02:00:17.775074 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 10 02:00:17.776209 kubelet[2820]: W0310 02:00:17.775087 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 10 02:00:17.776209 kubelet[2820]: E0310 02:00:17.775213 2820 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 10 02:00:17.777908 kubelet[2820]: E0310 02:00:17.777286 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 10 02:00:17.777908 kubelet[2820]: W0310 02:00:17.777424 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 10 02:00:17.777908 kubelet[2820]: E0310 02:00:17.777444 2820 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 10 02:00:17.780750 kubelet[2820]: E0310 02:00:17.779748 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 10 02:00:17.780750 kubelet[2820]: W0310 02:00:17.779869 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 10 02:00:17.780750 kubelet[2820]: E0310 02:00:17.779887 2820 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 10 02:00:17.783039 kubelet[2820]: E0310 02:00:17.782408 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 10 02:00:17.783039 kubelet[2820]: W0310 02:00:17.782552 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 10 02:00:17.783039 kubelet[2820]: E0310 02:00:17.782567 2820 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 10 02:00:17.785518 kubelet[2820]: E0310 02:00:17.784876 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 10 02:00:17.785518 kubelet[2820]: W0310 02:00:17.785255 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 10 02:00:17.787434 kubelet[2820]: E0310 02:00:17.786527 2820 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 10 02:00:17.790020 kubelet[2820]: E0310 02:00:17.788953 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 10 02:00:17.790020 kubelet[2820]: W0310 02:00:17.789203 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 10 02:00:17.791144 kubelet[2820]: E0310 02:00:17.790643 2820 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 10 02:00:17.791970 kubelet[2820]: E0310 02:00:17.791726 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 10 02:00:17.792375 kubelet[2820]: W0310 02:00:17.792252 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 10 02:00:17.792824 kubelet[2820]: E0310 02:00:17.792536 2820 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 10 02:00:17.794085 kubelet[2820]: E0310 02:00:17.793410 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 10 02:00:17.794085 kubelet[2820]: W0310 02:00:17.793428 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 10 02:00:17.794527 kubelet[2820]: E0310 02:00:17.793442 2820 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 10 02:00:17.796528 kubelet[2820]: E0310 02:00:17.796509 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 10 02:00:17.796613 kubelet[2820]: W0310 02:00:17.796599 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 10 02:00:17.796674 kubelet[2820]: E0310 02:00:17.796662 2820 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 10 02:00:17.798332 kubelet[2820]: E0310 02:00:17.798315 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 10 02:00:17.798427 kubelet[2820]: W0310 02:00:17.798410 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 10 02:00:17.798579 kubelet[2820]: E0310 02:00:17.798563 2820 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 10 02:00:17.800908 kubelet[2820]: E0310 02:00:17.800891 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 10 02:00:17.801081 kubelet[2820]: W0310 02:00:17.801064 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 10 02:00:17.801212 kubelet[2820]: E0310 02:00:17.801197 2820 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 10 02:00:17.802247 kubelet[2820]: E0310 02:00:17.802231 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 10 02:00:17.802334 kubelet[2820]: W0310 02:00:17.802318 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 10 02:00:17.802404 kubelet[2820]: E0310 02:00:17.802390 2820 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 10 02:00:17.803577 kubelet[2820]: E0310 02:00:17.803332 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 10 02:00:17.803577 kubelet[2820]: W0310 02:00:17.803347 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 10 02:00:17.803577 kubelet[2820]: E0310 02:00:17.803360 2820 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 10 02:00:17.805169 kubelet[2820]: E0310 02:00:17.805149 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 10 02:00:17.805265 kubelet[2820]: W0310 02:00:17.805248 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 10 02:00:17.805345 kubelet[2820]: E0310 02:00:17.805329 2820 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 10 02:00:17.806685 kubelet[2820]: E0310 02:00:17.806244 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 10 02:00:17.806685 kubelet[2820]: W0310 02:00:17.806428 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 10 02:00:17.809274 kubelet[2820]: E0310 02:00:17.807085 2820 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 10 02:00:17.823332 kubelet[2820]: E0310 02:00:17.821073 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 10 02:00:17.823332 kubelet[2820]: W0310 02:00:17.821311 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 10 02:00:17.823332 kubelet[2820]: E0310 02:00:17.821339 2820 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 10 02:00:17.823332 kubelet[2820]: I0310 02:00:17.821376 2820 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/17109937-e605-4060-a474-8701d718389b-registration-dir\") pod \"csi-node-driver-9jkrd\" (UID: \"17109937-e605-4060-a474-8701d718389b\") " pod="calico-system/csi-node-driver-9jkrd" Mar 10 02:00:17.823332 kubelet[2820]: E0310 02:00:17.822528 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 10 02:00:17.823332 kubelet[2820]: W0310 02:00:17.822544 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 10 02:00:17.823332 kubelet[2820]: E0310 02:00:17.822561 2820 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 10 02:00:17.821224 systemd[1]: Started cri-containerd-7474e12bc0975db289b41c972036c596a721b395105355f873c63b1cd199806d.scope - libcontainer container 7474e12bc0975db289b41c972036c596a721b395105355f873c63b1cd199806d. Mar 10 02:00:17.824794 kubelet[2820]: I0310 02:00:17.822628 2820 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vllt\" (UniqueName: \"kubernetes.io/projected/17109937-e605-4060-a474-8701d718389b-kube-api-access-6vllt\") pod \"csi-node-driver-9jkrd\" (UID: \"17109937-e605-4060-a474-8701d718389b\") " pod="calico-system/csi-node-driver-9jkrd" Mar 10 02:00:17.835228 kubelet[2820]: E0310 02:00:17.834614 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 10 02:00:17.835228 kubelet[2820]: W0310 02:00:17.834742 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 10 02:00:17.835228 kubelet[2820]: E0310 02:00:17.834772 2820 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 10 02:00:17.836597 kubelet[2820]: E0310 02:00:17.836272 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 10 02:00:17.836597 kubelet[2820]: W0310 02:00:17.836412 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 10 02:00:17.836597 kubelet[2820]: E0310 02:00:17.836434 2820 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 10 02:00:17.838832 kubelet[2820]: E0310 02:00:17.838649 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 10 02:00:17.840803 kubelet[2820]: W0310 02:00:17.838685 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 10 02:00:17.840803 kubelet[2820]: E0310 02:00:17.839423 2820 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 10 02:00:17.846858 kubelet[2820]: E0310 02:00:17.846389 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 10 02:00:17.848529 kubelet[2820]: W0310 02:00:17.847710 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 10 02:00:17.848529 kubelet[2820]: E0310 02:00:17.847742 2820 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 10 02:00:17.848529 kubelet[2820]: I0310 02:00:17.847778 2820 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/17109937-e605-4060-a474-8701d718389b-socket-dir\") pod \"csi-node-driver-9jkrd\" (UID: \"17109937-e605-4060-a474-8701d718389b\") " pod="calico-system/csi-node-driver-9jkrd" Mar 10 02:00:17.849805 kubelet[2820]: E0310 02:00:17.849677 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 10 02:00:17.850892 kubelet[2820]: W0310 02:00:17.850575 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 10 02:00:17.850892 kubelet[2820]: E0310 02:00:17.850608 2820 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 10 02:00:17.853008 kubelet[2820]: E0310 02:00:17.852860 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 10 02:00:17.853554 kubelet[2820]: W0310 02:00:17.853533 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 10 02:00:17.854518 kubelet[2820]: E0310 02:00:17.853794 2820 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 10 02:00:17.855000 kubelet[2820]: E0310 02:00:17.854986 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 10 02:00:17.855058 kubelet[2820]: W0310 02:00:17.855047 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 10 02:00:17.855552 kubelet[2820]: E0310 02:00:17.855529 2820 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 10 02:00:17.859659 kubelet[2820]: E0310 02:00:17.859611 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 10 02:00:17.859724 kubelet[2820]: W0310 02:00:17.859660 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 10 02:00:17.859724 kubelet[2820]: E0310 02:00:17.859682 2820 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 10 02:00:17.859818 kubelet[2820]: I0310 02:00:17.859723 2820 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/17109937-e605-4060-a474-8701d718389b-varrun\") pod \"csi-node-driver-9jkrd\" (UID: \"17109937-e605-4060-a474-8701d718389b\") " pod="calico-system/csi-node-driver-9jkrd" Mar 10 02:00:17.862302 kubelet[2820]: E0310 02:00:17.861414 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 10 02:00:17.862302 kubelet[2820]: W0310 02:00:17.861591 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 10 02:00:17.862302 kubelet[2820]: E0310 02:00:17.861608 2820 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 10 02:00:17.862302 kubelet[2820]: I0310 02:00:17.861635 2820 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/17109937-e605-4060-a474-8701d718389b-kubelet-dir\") pod \"csi-node-driver-9jkrd\" (UID: \"17109937-e605-4060-a474-8701d718389b\") " pod="calico-system/csi-node-driver-9jkrd" Mar 10 02:00:17.865594 kubelet[2820]: E0310 02:00:17.865561 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 10 02:00:17.865850 kubelet[2820]: W0310 02:00:17.865829 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 10 02:00:17.865979 kubelet[2820]: E0310 02:00:17.865959 2820 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 10 02:00:17.867425 kubelet[2820]: E0310 02:00:17.867409 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 10 02:00:17.867706 kubelet[2820]: W0310 02:00:17.867589 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 10 02:00:17.867706 kubelet[2820]: E0310 02:00:17.867611 2820 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Mar 10 02:00:17.871785 kubelet[2820]: E0310 02:00:17.871568 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 10 02:00:17.871785 kubelet[2820]: W0310 02:00:17.871584 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 10 02:00:17.871785 kubelet[2820]: E0310 02:00:17.871600 2820 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 10 02:00:17.872516 kubelet[2820]: E0310 02:00:17.872419 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 10 02:00:17.872516 kubelet[2820]: W0310 02:00:17.872433 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 10 02:00:17.872637 kubelet[2820]: E0310 02:00:17.872617 2820 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 10 02:00:17.879398 containerd[1555]: time="2026-03-10T02:00:17.879201584Z" level=info msg="connecting to shim 6a9e78f22163f0e7eae3dafc40ae931736c47abe5b2185e30333f9eac360b9f5" address="unix:///run/containerd/s/c50a6d0640db43244fb30d17e478f945a7722d1644a1e93a0435271b137642bf" namespace=k8s.io protocol=ttrpc version=3
Mar 10 02:00:17.970967 kubelet[2820]: E0310 02:00:17.966406 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 10 02:00:17.970967 kubelet[2820]: W0310 02:00:17.966430 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 10 02:00:17.970967 kubelet[2820]: E0310 02:00:17.966516 2820 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 10 02:00:17.970967 kubelet[2820]: E0310 02:00:17.969556 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 10 02:00:17.970967 kubelet[2820]: W0310 02:00:17.969571 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 10 02:00:17.970967 kubelet[2820]: E0310 02:00:17.969591 2820 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 10 02:00:18.018885 kubelet[2820]: E0310 02:00:18.016407 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 10 02:00:18.018885 kubelet[2820]: W0310 02:00:18.016584 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 10 02:00:18.018885 kubelet[2820]: E0310 02:00:18.016702 2820 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 10 02:00:18.020707 kubelet[2820]: E0310 02:00:18.020063 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 10 02:00:18.020707 kubelet[2820]: W0310 02:00:18.020083 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 10 02:00:18.020707 kubelet[2820]: E0310 02:00:18.020147 2820 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 10 02:00:18.020959 kubelet[2820]: E0310 02:00:18.020860 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 10 02:00:18.020959 kubelet[2820]: W0310 02:00:18.020877 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 10 02:00:18.020959 kubelet[2820]: E0310 02:00:18.020894 2820 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 10 02:00:18.025559 kubelet[2820]: E0310 02:00:18.025539 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 10 02:00:18.026069 kubelet[2820]: W0310 02:00:18.025816 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 10 02:00:18.026201 kubelet[2820]: E0310 02:00:18.026183 2820 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 10 02:00:18.028366 kubelet[2820]: E0310 02:00:18.028339 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 10 02:00:18.028631 kubelet[2820]: W0310 02:00:18.028611 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 10 02:00:18.028724 kubelet[2820]: E0310 02:00:18.028707 2820 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 10 02:00:18.031092 kubelet[2820]: E0310 02:00:18.031074 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 10 02:00:18.031250 kubelet[2820]: W0310 02:00:18.031231 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 10 02:00:18.031333 kubelet[2820]: E0310 02:00:18.031316 2820 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 10 02:00:18.034403 kubelet[2820]: E0310 02:00:18.034385 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 10 02:00:18.034715 kubelet[2820]: W0310 02:00:18.034696 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 10 02:00:18.034833 kubelet[2820]: E0310 02:00:18.034817 2820 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 10 02:00:18.037261 kubelet[2820]: E0310 02:00:18.037098 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 10 02:00:18.038981 kubelet[2820]: W0310 02:00:18.037933 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 10 02:00:18.038981 kubelet[2820]: E0310 02:00:18.037960 2820 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 10 02:00:18.041373 kubelet[2820]: E0310 02:00:18.040244 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 10 02:00:18.041373 kubelet[2820]: W0310 02:00:18.040346 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 10 02:00:18.041373 kubelet[2820]: E0310 02:00:18.040365 2820 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 10 02:00:18.041740 systemd[1]: Started cri-containerd-6a9e78f22163f0e7eae3dafc40ae931736c47abe5b2185e30333f9eac360b9f5.scope - libcontainer container 6a9e78f22163f0e7eae3dafc40ae931736c47abe5b2185e30333f9eac360b9f5.
Mar 10 02:00:18.058679 kubelet[2820]: E0310 02:00:18.054677 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 10 02:00:18.058679 kubelet[2820]: W0310 02:00:18.054715 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 10 02:00:18.058679 kubelet[2820]: E0310 02:00:18.054746 2820 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 10 02:00:18.061296 kubelet[2820]: E0310 02:00:18.060645 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 10 02:00:18.061296 kubelet[2820]: W0310 02:00:18.060770 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 10 02:00:18.061296 kubelet[2820]: E0310 02:00:18.060895 2820 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 10 02:00:18.073223 containerd[1555]: time="2026-03-10T02:00:18.073175520Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-86c7595d64-twpmx,Uid:36a6f105-a54d-4d22-b56b-b9af41f8a6c3,Namespace:calico-system,Attempt:0,} returns sandbox id \"7474e12bc0975db289b41c972036c596a721b395105355f873c63b1cd199806d\""
Mar 10 02:00:18.073731 kubelet[2820]: E0310 02:00:18.073652 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 10 02:00:18.073731 kubelet[2820]: W0310 02:00:18.073714 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 10 02:00:18.073925 kubelet[2820]: E0310 02:00:18.073745 2820 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 10 02:00:18.079698 kubelet[2820]: E0310 02:00:18.079671 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 10 02:00:18.079837 kubelet[2820]: W0310 02:00:18.079815 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 10 02:00:18.079952 kubelet[2820]: E0310 02:00:18.079932 2820 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 10 02:00:18.084862 kubelet[2820]: E0310 02:00:18.084836 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 10 02:00:18.084980 kubelet[2820]: W0310 02:00:18.084960 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 10 02:00:18.085095 kubelet[2820]: E0310 02:00:18.085075 2820 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 10 02:00:18.089076 kubelet[2820]: E0310 02:00:18.089058 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 10 02:00:18.089229 kubelet[2820]: W0310 02:00:18.089204 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 10 02:00:18.089337 kubelet[2820]: E0310 02:00:18.089320 2820 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 10 02:00:18.090566 kubelet[2820]: E0310 02:00:18.090548 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 10 02:00:18.090769 kubelet[2820]: W0310 02:00:18.090681 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 10 02:00:18.090886 kubelet[2820]: E0310 02:00:18.090867 2820 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 10 02:00:18.097671 containerd[1555]: time="2026-03-10T02:00:18.097583462Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.31.4\""
Mar 10 02:00:18.097832 kubelet[2820]: E0310 02:00:18.097814 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 10 02:00:18.097938 kubelet[2820]: W0310 02:00:18.097919 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 10 02:00:18.098226 kubelet[2820]: E0310 02:00:18.098208 2820 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 10 02:00:18.102840 kubelet[2820]: E0310 02:00:18.102818 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 10 02:00:18.103075 kubelet[2820]: W0310 02:00:18.103058 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 10 02:00:18.103391 kubelet[2820]: E0310 02:00:18.103332 2820 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 10 02:00:18.107719 kubelet[2820]: E0310 02:00:18.107418 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 10 02:00:18.107719 kubelet[2820]: W0310 02:00:18.107439 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 10 02:00:18.107719 kubelet[2820]: E0310 02:00:18.107528 2820 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 10 02:00:18.109006 kubelet[2820]: E0310 02:00:18.108987 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 10 02:00:18.109553 kubelet[2820]: W0310 02:00:18.109249 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 10 02:00:18.109553 kubelet[2820]: E0310 02:00:18.109272 2820 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 10 02:00:18.113943 kubelet[2820]: E0310 02:00:18.113919 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 10 02:00:18.114793 kubelet[2820]: W0310 02:00:18.114767 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 10 02:00:18.116052 kubelet[2820]: E0310 02:00:18.115055 2820 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 10 02:00:18.121065 kubelet[2820]: E0310 02:00:18.120348 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 10 02:00:18.121065 kubelet[2820]: W0310 02:00:18.120397 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 10 02:00:18.121065 kubelet[2820]: E0310 02:00:18.120424 2820 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 10 02:00:18.123247 kubelet[2820]: E0310 02:00:18.122907 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 10 02:00:18.123247 kubelet[2820]: W0310 02:00:18.122925 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 10 02:00:18.123247 kubelet[2820]: E0310 02:00:18.122942 2820 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 10 02:00:18.180047 kubelet[2820]: E0310 02:00:18.179874 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 10 02:00:18.180047 kubelet[2820]: W0310 02:00:18.179907 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 10 02:00:18.180047 kubelet[2820]: E0310 02:00:18.179937 2820 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 10 02:00:18.283440 containerd[1555]: time="2026-03-10T02:00:18.276585996Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-7kz5z,Uid:5e68d9a1-96e4-41fa-997f-ef2ba4f3ece8,Namespace:calico-system,Attempt:0,} returns sandbox id \"6a9e78f22163f0e7eae3dafc40ae931736c47abe5b2185e30333f9eac360b9f5\""
Mar 10 02:00:19.222588 kubelet[2820]: E0310 02:00:19.221876 2820 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-9jkrd" podUID="17109937-e605-4060-a474-8701d718389b"
Mar 10 02:00:20.983833 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2464545481.mount: Deactivated successfully.
Mar 10 02:00:21.222193 kubelet[2820]: E0310 02:00:21.222138 2820 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-9jkrd" podUID="17109937-e605-4060-a474-8701d718389b"
Mar 10 02:00:23.222906 kubelet[2820]: E0310 02:00:23.222282 2820 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-9jkrd" podUID="17109937-e605-4060-a474-8701d718389b"
Mar 10 02:00:25.223710 kubelet[2820]: E0310 02:00:25.223411 2820 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-9jkrd" podUID="17109937-e605-4060-a474-8701d718389b"
Mar 10 02:00:25.271885 containerd[1555]: time="2026-03-10T02:00:25.269046592Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 10 02:00:25.288186 containerd[1555]: time="2026-03-10T02:00:25.288114760Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.31.4: active requests=0, bytes read=36107596"
Mar 10 02:00:25.299827 containerd[1555]: time="2026-03-10T02:00:25.296291981Z" level=info msg="ImageCreate event name:\"sha256:46766605472b59b9c16342b2cc74da11f598baa9ba6d1e8b07b3f8ab4f29c55b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 10 02:00:25.315778 containerd[1555]: time="2026-03-10T02:00:25.313002293Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:d9396cfcd63dfcf72a65903042e473bb0bafc0cceb56bd71cd84078498a87130\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 10 02:00:25.323801 containerd[1555]: time="2026-03-10T02:00:25.323038512Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.31.4\" with image id \"sha256:46766605472b59b9c16342b2cc74da11f598baa9ba6d1e8b07b3f8ab4f29c55b\", repo tag \"ghcr.io/flatcar/calico/typha:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:d9396cfcd63dfcf72a65903042e473bb0bafc0cceb56bd71cd84078498a87130\", size \"36107450\" in 7.225400707s"
Mar 10 02:00:25.323801 containerd[1555]: time="2026-03-10T02:00:25.323087476Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.31.4\" returns image reference \"sha256:46766605472b59b9c16342b2cc74da11f598baa9ba6d1e8b07b3f8ab4f29c55b\""
Mar 10 02:00:25.340629 containerd[1555]: time="2026-03-10T02:00:25.335803535Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\""
Mar 10 02:00:25.417951 containerd[1555]: time="2026-03-10T02:00:25.417899991Z" level=info msg="CreateContainer within sandbox \"7474e12bc0975db289b41c972036c596a721b395105355f873c63b1cd199806d\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}"
Mar 10 02:00:25.491925 containerd[1555]: time="2026-03-10T02:00:25.491249056Z" level=info msg="Container a6314a9c6fb55079ed424dfec491c6a54afb2f61ab1885b6d2e7d93b0f971192: CDI devices from CRI Config.CDIDevices: []"
Mar 10 02:00:25.554137 containerd[1555]: time="2026-03-10T02:00:25.553920998Z" level=info msg="CreateContainer within sandbox \"7474e12bc0975db289b41c972036c596a721b395105355f873c63b1cd199806d\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"a6314a9c6fb55079ed424dfec491c6a54afb2f61ab1885b6d2e7d93b0f971192\""
Mar 10 02:00:25.556716 containerd[1555]: time="2026-03-10T02:00:25.556671192Z" level=info msg="StartContainer for \"a6314a9c6fb55079ed424dfec491c6a54afb2f61ab1885b6d2e7d93b0f971192\""
Mar 10 02:00:25.566413 containerd[1555]: time="2026-03-10T02:00:25.566334527Z" level=info msg="connecting to shim a6314a9c6fb55079ed424dfec491c6a54afb2f61ab1885b6d2e7d93b0f971192" address="unix:///run/containerd/s/72abe40d8657ac6e5e1436ea28a23f2a9335c11d22ce4baf0e4c9c2e9b404997" protocol=ttrpc version=3
Mar 10 02:00:25.695916 systemd[1]: Started cri-containerd-a6314a9c6fb55079ed424dfec491c6a54afb2f61ab1885b6d2e7d93b0f971192.scope - libcontainer container a6314a9c6fb55079ed424dfec491c6a54afb2f61ab1885b6d2e7d93b0f971192.
Mar 10 02:00:26.112365 containerd[1555]: time="2026-03-10T02:00:26.111993886Z" level=info msg="StartContainer for \"a6314a9c6fb55079ed424dfec491c6a54afb2f61ab1885b6d2e7d93b0f971192\" returns successfully"
Mar 10 02:00:26.986082 kubelet[2820]: E0310 02:00:26.985958 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 10 02:00:26.986082 kubelet[2820]: W0310 02:00:26.986027 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 10 02:00:26.986082 kubelet[2820]: E0310 02:00:26.986056 2820 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 10 02:00:26.994814 kubelet[2820]: E0310 02:00:26.989163 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 10 02:00:26.994814 kubelet[2820]: W0310 02:00:26.989184 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 10 02:00:26.994814 kubelet[2820]: E0310 02:00:26.989203 2820 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 10 02:00:26.994814 kubelet[2820]: E0310 02:00:26.992731 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 10 02:00:26.994814 kubelet[2820]: W0310 02:00:26.992743 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 10 02:00:26.994814 kubelet[2820]: E0310 02:00:26.992755 2820 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 10 02:00:26.994814 kubelet[2820]: E0310 02:00:26.993263 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 10 02:00:26.994814 kubelet[2820]: W0310 02:00:26.993276 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 10 02:00:26.994814 kubelet[2820]: E0310 02:00:26.993290 2820 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 10 02:00:26.996518 kubelet[2820]: E0310 02:00:26.996186 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 10 02:00:26.996518 kubelet[2820]: W0310 02:00:26.996222 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 10 02:00:26.996518 kubelet[2820]: E0310 02:00:26.996236 2820 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 10 02:00:27.014244 kubelet[2820]: E0310 02:00:27.004216 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 10 02:00:27.014244 kubelet[2820]: W0310 02:00:27.009682 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 10 02:00:27.014244 kubelet[2820]: E0310 02:00:27.009730 2820 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 10 02:00:27.032780 kubelet[2820]: E0310 02:00:27.030234 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 10 02:00:27.032780 kubelet[2820]: W0310 02:00:27.030259 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 10 02:00:27.032780 kubelet[2820]: E0310 02:00:27.030283 2820 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 10 02:00:27.048813 kubelet[2820]: E0310 02:00:27.046374 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 10 02:00:27.048813 kubelet[2820]: W0310 02:00:27.046419 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 10 02:00:27.048813 kubelet[2820]: E0310 02:00:27.046445 2820 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 10 02:00:27.054959 kubelet[2820]: E0310 02:00:27.052510 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 10 02:00:27.054959 kubelet[2820]: W0310 02:00:27.053900 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 10 02:00:27.054959 kubelet[2820]: E0310 02:00:27.053944 2820 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 10 02:00:27.054959 kubelet[2820]: E0310 02:00:27.054657 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 10 02:00:27.068658 kubelet[2820]: W0310 02:00:27.054730 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 10 02:00:27.068658 kubelet[2820]: E0310 02:00:27.060037 2820 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 10 02:00:27.068658 kubelet[2820]: E0310 02:00:27.065885 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 10 02:00:27.068658 kubelet[2820]: W0310 02:00:27.065909 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 10 02:00:27.068658 kubelet[2820]: E0310 02:00:27.065937 2820 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 10 02:00:27.069163 kubelet[2820]: E0310 02:00:27.069030 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 10 02:00:27.069163 kubelet[2820]: W0310 02:00:27.069100 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 10 02:00:27.069163 kubelet[2820]: E0310 02:00:27.069125 2820 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 10 02:00:27.082434 kubelet[2820]: E0310 02:00:27.080378 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 10 02:00:27.089799 kubelet[2820]: W0310 02:00:27.089707 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 10 02:00:27.089799 kubelet[2820]: E0310 02:00:27.089782 2820 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 10 02:00:27.090519 kubelet[2820]: E0310 02:00:27.090317 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 10 02:00:27.090519 kubelet[2820]: W0310 02:00:27.090366 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 10 02:00:27.090519 kubelet[2820]: E0310 02:00:27.090386 2820 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 10 02:00:27.091089 kubelet[2820]: E0310 02:00:27.090934 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 10 02:00:27.091089 kubelet[2820]: W0310 02:00:27.090981 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 10 02:00:27.091089 kubelet[2820]: E0310 02:00:27.090999 2820 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 10 02:00:27.099137 kubelet[2820]: E0310 02:00:27.096232 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 10 02:00:27.100121 kubelet[2820]: W0310 02:00:27.099951 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 10 02:00:27.100121 kubelet[2820]: E0310 02:00:27.100013 2820 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 10 02:00:27.128701 kubelet[2820]: E0310 02:00:27.128000 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 10 02:00:27.128701 kubelet[2820]: W0310 02:00:27.128067 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 10 02:00:27.128701 kubelet[2820]: E0310 02:00:27.128104 2820 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 10 02:00:27.160229 kubelet[2820]: E0310 02:00:27.159531 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 10 02:00:27.160229 kubelet[2820]: W0310 02:00:27.159792 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 10 02:00:27.160229 kubelet[2820]: E0310 02:00:27.159830 2820 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 10 02:00:27.164882 kubelet[2820]: E0310 02:00:27.164784 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 10 02:00:27.164882 kubelet[2820]: W0310 02:00:27.164841 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 10 02:00:27.164882 kubelet[2820]: E0310 02:00:27.164871 2820 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 10 02:00:27.170837 kubelet[2820]: E0310 02:00:27.168927 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 10 02:00:27.170837 kubelet[2820]: W0310 02:00:27.168990 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 10 02:00:27.170837 kubelet[2820]: E0310 02:00:27.169014 2820 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 10 02:00:27.204841 kubelet[2820]: E0310 02:00:27.204761 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 10 02:00:27.204841 kubelet[2820]: W0310 02:00:27.204823 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 10 02:00:27.205146 kubelet[2820]: E0310 02:00:27.204850 2820 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 10 02:00:27.218587 kubelet[2820]: E0310 02:00:27.218367 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 10 02:00:27.218587 kubelet[2820]: W0310 02:00:27.218393 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 10 02:00:27.218587 kubelet[2820]: E0310 02:00:27.218419 2820 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 10 02:00:27.219258 kubelet[2820]: E0310 02:00:27.219189 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 10 02:00:27.219258 kubelet[2820]: W0310 02:00:27.219242 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 10 02:00:27.219375 kubelet[2820]: E0310 02:00:27.219263 2820 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 10 02:00:27.222085 kubelet[2820]: E0310 02:00:27.221867 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 10 02:00:27.222163 kubelet[2820]: W0310 02:00:27.222089 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 10 02:00:27.222163 kubelet[2820]: E0310 02:00:27.222111 2820 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 10 02:00:27.222865 kubelet[2820]: E0310 02:00:27.222790 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 10 02:00:27.222865 kubelet[2820]: W0310 02:00:27.222842 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 10 02:00:27.222865 kubelet[2820]: E0310 02:00:27.222861 2820 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 10 02:00:27.230003 kubelet[2820]: E0310 02:00:27.229110 2820 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-9jkrd" podUID="17109937-e605-4060-a474-8701d718389b" Mar 10 02:00:27.230003 kubelet[2820]: E0310 02:00:27.229044 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 10 02:00:27.230003 kubelet[2820]: W0310 02:00:27.229168 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 10 02:00:27.230003 kubelet[2820]: E0310 02:00:27.229185 2820 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 10 02:00:27.233311 kubelet[2820]: E0310 02:00:27.231214 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 10 02:00:27.233311 kubelet[2820]: W0310 02:00:27.231264 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 10 02:00:27.233311 kubelet[2820]: E0310 02:00:27.231281 2820 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 10 02:00:27.233311 kubelet[2820]: E0310 02:00:27.231628 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 10 02:00:27.233311 kubelet[2820]: W0310 02:00:27.231641 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 10 02:00:27.233311 kubelet[2820]: E0310 02:00:27.231656 2820 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 10 02:00:27.233311 kubelet[2820]: E0310 02:00:27.231876 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 10 02:00:27.233311 kubelet[2820]: W0310 02:00:27.231889 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 10 02:00:27.233311 kubelet[2820]: E0310 02:00:27.231900 2820 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 10 02:00:27.236602 kubelet[2820]: E0310 02:00:27.236399 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 10 02:00:27.237079 kubelet[2820]: W0310 02:00:27.236671 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 10 02:00:27.237079 kubelet[2820]: E0310 02:00:27.236692 2820 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 10 02:00:27.242434 kubelet[2820]: E0310 02:00:27.240442 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 10 02:00:27.242434 kubelet[2820]: W0310 02:00:27.240534 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 10 02:00:27.242434 kubelet[2820]: E0310 02:00:27.240587 2820 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 10 02:00:27.242434 kubelet[2820]: E0310 02:00:27.244622 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 10 02:00:27.242434 kubelet[2820]: W0310 02:00:27.244745 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 10 02:00:27.242434 kubelet[2820]: E0310 02:00:27.244765 2820 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 10 02:00:27.260858 kubelet[2820]: E0310 02:00:27.253608 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 10 02:00:27.260858 kubelet[2820]: W0310 02:00:27.253742 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 10 02:00:27.260858 kubelet[2820]: E0310 02:00:27.253842 2820 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 10 02:00:27.283523 kubelet[2820]: I0310 02:00:27.280998 2820 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-86c7595d64-twpmx" podStartSLOduration=4.050222773 podStartE2EDuration="11.28097382s" podCreationTimestamp="2026-03-10 02:00:16 +0000 UTC" firstStartedPulling="2026-03-10 02:00:18.094419056 +0000 UTC m=+54.533032967" lastFinishedPulling="2026-03-10 02:00:25.325170113 +0000 UTC m=+61.763784014" observedRunningTime="2026-03-10 02:00:27.135297005 +0000 UTC m=+63.573910906" watchObservedRunningTime="2026-03-10 02:00:27.28097382 +0000 UTC m=+63.719587721" Mar 10 02:00:27.613836 containerd[1555]: time="2026-03-10T02:00:27.610158228Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 10 02:00:27.620301 containerd[1555]: time="2026-03-10T02:00:27.619815889Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4: active requests=0, bytes read=4630250" Mar 10 02:00:27.626114 containerd[1555]: time="2026-03-10T02:00:27.622248606Z" level=info msg="ImageCreate event name:\"sha256:a6ea0cf732d820506ae9f1d7e7433a14009026b894fbbb8f346b9a5f5335c47e\" 
labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 10 02:00:27.632775 containerd[1555]: time="2026-03-10T02:00:27.631364154Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:5fa3492ac4dfef9cc34fe70a51289118e1f715a89133ea730eef81ad789dadbc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 10 02:00:27.634347 containerd[1555]: time="2026-03-10T02:00:27.634160494Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\" with image id \"sha256:a6ea0cf732d820506ae9f1d7e7433a14009026b894fbbb8f346b9a5f5335c47e\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:5fa3492ac4dfef9cc34fe70a51289118e1f715a89133ea730eef81ad789dadbc\", size \"6186255\" in 2.298209687s" Mar 10 02:00:27.634347 containerd[1555]: time="2026-03-10T02:00:27.634236609Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\" returns image reference \"sha256:a6ea0cf732d820506ae9f1d7e7433a14009026b894fbbb8f346b9a5f5335c47e\"" Mar 10 02:00:27.662990 containerd[1555]: time="2026-03-10T02:00:27.662383285Z" level=info msg="CreateContainer within sandbox \"6a9e78f22163f0e7eae3dafc40ae931736c47abe5b2185e30333f9eac360b9f5\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Mar 10 02:00:27.722571 containerd[1555]: time="2026-03-10T02:00:27.722521469Z" level=info msg="Container 975ed76ed1e360e33571761890c5917195c19adbdec38cdf654b7b86ebe4b711: CDI devices from CRI Config.CDIDevices: []" Mar 10 02:00:27.751369 containerd[1555]: time="2026-03-10T02:00:27.751220936Z" level=info msg="CreateContainer within sandbox \"6a9e78f22163f0e7eae3dafc40ae931736c47abe5b2185e30333f9eac360b9f5\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"975ed76ed1e360e33571761890c5917195c19adbdec38cdf654b7b86ebe4b711\"" Mar 10 02:00:27.753259 containerd[1555]: time="2026-03-10T02:00:27.752127155Z" 
level=info msg="StartContainer for \"975ed76ed1e360e33571761890c5917195c19adbdec38cdf654b7b86ebe4b711\"" Mar 10 02:00:27.755092 containerd[1555]: time="2026-03-10T02:00:27.755043931Z" level=info msg="connecting to shim 975ed76ed1e360e33571761890c5917195c19adbdec38cdf654b7b86ebe4b711" address="unix:///run/containerd/s/c50a6d0640db43244fb30d17e478f945a7722d1644a1e93a0435271b137642bf" protocol=ttrpc version=3 Mar 10 02:00:27.818057 systemd[1]: Started cri-containerd-975ed76ed1e360e33571761890c5917195c19adbdec38cdf654b7b86ebe4b711.scope - libcontainer container 975ed76ed1e360e33571761890c5917195c19adbdec38cdf654b7b86ebe4b711. Mar 10 02:00:28.021999 kubelet[2820]: E0310 02:00:28.020442 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 10 02:00:28.021999 kubelet[2820]: W0310 02:00:28.020550 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 10 02:00:28.021999 kubelet[2820]: E0310 02:00:28.020581 2820 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 10 02:00:28.027016 kubelet[2820]: E0310 02:00:28.022276 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 10 02:00:28.027016 kubelet[2820]: W0310 02:00:28.022401 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 10 02:00:28.027016 kubelet[2820]: E0310 02:00:28.022423 2820 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 10 02:00:28.027174 containerd[1555]: time="2026-03-10T02:00:28.024164402Z" level=info msg="StartContainer for \"975ed76ed1e360e33571761890c5917195c19adbdec38cdf654b7b86ebe4b711\" returns successfully" Mar 10 02:00:28.027221 kubelet[2820]: E0310 02:00:28.027114 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 10 02:00:28.027221 kubelet[2820]: W0310 02:00:28.027130 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 10 02:00:28.027221 kubelet[2820]: E0310 02:00:28.027149 2820 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 10 02:00:28.028161 kubelet[2820]: E0310 02:00:28.027996 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 10 02:00:28.028161 kubelet[2820]: W0310 02:00:28.028046 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 10 02:00:28.028161 kubelet[2820]: E0310 02:00:28.028068 2820 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 10 02:00:28.028560 kubelet[2820]: E0310 02:00:28.028336 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 10 02:00:28.028560 kubelet[2820]: W0310 02:00:28.028397 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 10 02:00:28.034773 kubelet[2820]: E0310 02:00:28.028417 2820 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 10 02:00:28.035928 kubelet[2820]: E0310 02:00:28.035506 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 10 02:00:28.035928 kubelet[2820]: W0310 02:00:28.035548 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 10 02:00:28.035928 kubelet[2820]: E0310 02:00:28.035570 2820 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 10 02:00:28.037210 kubelet[2820]: E0310 02:00:28.037111 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 10 02:00:28.037210 kubelet[2820]: W0310 02:00:28.037162 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 10 02:00:28.037210 kubelet[2820]: E0310 02:00:28.037181 2820 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 10 02:00:28.039013 kubelet[2820]: E0310 02:00:28.038125 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 10 02:00:28.039013 kubelet[2820]: W0310 02:00:28.038138 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 10 02:00:28.039013 kubelet[2820]: E0310 02:00:28.038151 2820 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 10 02:00:28.040256 kubelet[2820]: E0310 02:00:28.039700 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 10 02:00:28.040256 kubelet[2820]: W0310 02:00:28.039714 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 10 02:00:28.040256 kubelet[2820]: E0310 02:00:28.039726 2820 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 10 02:00:28.047778 kubelet[2820]: E0310 02:00:28.045890 2820 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 10 02:00:28.047778 kubelet[2820]: W0310 02:00:28.045940 2820 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 10 02:00:28.047778 kubelet[2820]: E0310 02:00:28.045958 2820 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 10 02:00:28.053510 systemd[1]: cri-containerd-975ed76ed1e360e33571761890c5917195c19adbdec38cdf654b7b86ebe4b711.scope: Deactivated successfully. 
Mar 10 02:00:28.061238 containerd[1555]: time="2026-03-10T02:00:28.060952277Z" level=info msg="received container exit event container_id:\"975ed76ed1e360e33571761890c5917195c19adbdec38cdf654b7b86ebe4b711\" id:\"975ed76ed1e360e33571761890c5917195c19adbdec38cdf654b7b86ebe4b711\" pid:3527 exited_at:{seconds:1773108028 nanos:58118798}" Mar 10 02:00:28.202861 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-975ed76ed1e360e33571761890c5917195c19adbdec38cdf654b7b86ebe4b711-rootfs.mount: Deactivated successfully. Mar 10 02:00:28.994264 containerd[1555]: time="2026-03-10T02:00:28.994100853Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.31.4\"" Mar 10 02:00:29.222406 kubelet[2820]: E0310 02:00:29.222167 2820 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-9jkrd" podUID="17109937-e605-4060-a474-8701d718389b" Mar 10 02:00:31.224926 kubelet[2820]: E0310 02:00:31.223179 2820 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-9jkrd" podUID="17109937-e605-4060-a474-8701d718389b" Mar 10 02:00:33.225341 kubelet[2820]: E0310 02:00:33.221793 2820 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-9jkrd" podUID="17109937-e605-4060-a474-8701d718389b" Mar 10 02:00:35.223943 kubelet[2820]: E0310 02:00:35.222413 2820 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-9jkrd" podUID="17109937-e605-4060-a474-8701d718389b" Mar 10 02:00:37.234193 kubelet[2820]: E0310 02:00:37.233433 2820 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-9jkrd" podUID="17109937-e605-4060-a474-8701d718389b" Mar 10 02:00:39.228188 kubelet[2820]: E0310 02:00:39.227889 2820 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-9jkrd" podUID="17109937-e605-4060-a474-8701d718389b" Mar 10 02:00:41.223112 kubelet[2820]: E0310 02:00:41.222841 2820 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-9jkrd" podUID="17109937-e605-4060-a474-8701d718389b" Mar 10 02:00:43.226169 kubelet[2820]: E0310 02:00:43.224158 2820 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-9jkrd" podUID="17109937-e605-4060-a474-8701d718389b" Mar 10 02:00:45.221967 kubelet[2820]: E0310 02:00:45.221841 2820 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni 
plugin not initialized" pod="calico-system/csi-node-driver-9jkrd" podUID="17109937-e605-4060-a474-8701d718389b" Mar 10 02:00:47.226933 kubelet[2820]: E0310 02:00:47.223855 2820 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-9jkrd" podUID="17109937-e605-4060-a474-8701d718389b" Mar 10 02:00:49.229840 kubelet[2820]: E0310 02:00:49.223659 2820 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-9jkrd" podUID="17109937-e605-4060-a474-8701d718389b" Mar 10 02:00:51.222173 kubelet[2820]: E0310 02:00:51.221858 2820 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-9jkrd" podUID="17109937-e605-4060-a474-8701d718389b" Mar 10 02:00:53.222727 kubelet[2820]: E0310 02:00:53.222663 2820 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-9jkrd" podUID="17109937-e605-4060-a474-8701d718389b" Mar 10 02:00:53.395566 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1366352413.mount: Deactivated successfully. 
Mar 10 02:00:53.495387 containerd[1555]: time="2026-03-10T02:00:53.495197218Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 10 02:00:53.497363 containerd[1555]: time="2026-03-10T02:00:53.497309957Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.31.4: active requests=0, bytes read=159838564" Mar 10 02:00:53.501576 containerd[1555]: time="2026-03-10T02:00:53.501398005Z" level=info msg="ImageCreate event name:\"sha256:e6536b93706eda782f82ebadcac3559cb61801d09f982cc0533a134e6a8e1acf\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 10 02:00:53.507759 containerd[1555]: time="2026-03-10T02:00:53.507632544Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:22b9d32dc7480c96272121d5682d53424c6e58653c60fa869b61a1758a11d77f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 10 02:00:53.508942 containerd[1555]: time="2026-03-10T02:00:53.508663100Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.31.4\" with image id \"sha256:e6536b93706eda782f82ebadcac3559cb61801d09f982cc0533a134e6a8e1acf\", repo tag \"ghcr.io/flatcar/calico/node:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/node@sha256:22b9d32dc7480c96272121d5682d53424c6e58653c60fa869b61a1758a11d77f\", size \"159838426\" in 24.514514215s" Mar 10 02:00:53.508942 containerd[1555]: time="2026-03-10T02:00:53.508733666Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.31.4\" returns image reference \"sha256:e6536b93706eda782f82ebadcac3559cb61801d09f982cc0533a134e6a8e1acf\"" Mar 10 02:00:53.551984 containerd[1555]: time="2026-03-10T02:00:53.551629354Z" level=info msg="CreateContainer within sandbox \"6a9e78f22163f0e7eae3dafc40ae931736c47abe5b2185e30333f9eac360b9f5\" for container &ContainerMetadata{Name:ebpf-bootstrap,Attempt:0,}" Mar 10 02:00:53.639347 containerd[1555]: time="2026-03-10T02:00:53.638594127Z" level=info 
msg="Container 02371d463f0e325cb63b0658a3b560573507d6bf43d0f4a291eac7ce7ca5511c: CDI devices from CRI Config.CDIDevices: []" Mar 10 02:00:53.723561 containerd[1555]: time="2026-03-10T02:00:53.723405764Z" level=info msg="CreateContainer within sandbox \"6a9e78f22163f0e7eae3dafc40ae931736c47abe5b2185e30333f9eac360b9f5\" for &ContainerMetadata{Name:ebpf-bootstrap,Attempt:0,} returns container id \"02371d463f0e325cb63b0658a3b560573507d6bf43d0f4a291eac7ce7ca5511c\"" Mar 10 02:00:53.725432 containerd[1555]: time="2026-03-10T02:00:53.724939595Z" level=info msg="StartContainer for \"02371d463f0e325cb63b0658a3b560573507d6bf43d0f4a291eac7ce7ca5511c\"" Mar 10 02:00:53.730144 containerd[1555]: time="2026-03-10T02:00:53.728627974Z" level=info msg="connecting to shim 02371d463f0e325cb63b0658a3b560573507d6bf43d0f4a291eac7ce7ca5511c" address="unix:///run/containerd/s/c50a6d0640db43244fb30d17e478f945a7722d1644a1e93a0435271b137642bf" protocol=ttrpc version=3 Mar 10 02:00:53.816177 systemd[1]: Started cri-containerd-02371d463f0e325cb63b0658a3b560573507d6bf43d0f4a291eac7ce7ca5511c.scope - libcontainer container 02371d463f0e325cb63b0658a3b560573507d6bf43d0f4a291eac7ce7ca5511c. Mar 10 02:00:54.021937 containerd[1555]: time="2026-03-10T02:00:54.021874552Z" level=info msg="StartContainer for \"02371d463f0e325cb63b0658a3b560573507d6bf43d0f4a291eac7ce7ca5511c\" returns successfully" Mar 10 02:00:54.119512 systemd[1]: cri-containerd-02371d463f0e325cb63b0658a3b560573507d6bf43d0f4a291eac7ce7ca5511c.scope: Deactivated successfully. 
Mar 10 02:00:54.135119 containerd[1555]: time="2026-03-10T02:00:54.134940320Z" level=info msg="received container exit event container_id:\"02371d463f0e325cb63b0658a3b560573507d6bf43d0f4a291eac7ce7ca5511c\" id:\"02371d463f0e325cb63b0658a3b560573507d6bf43d0f4a291eac7ce7ca5511c\" pid:3604 exited_at:{seconds:1773108054 nanos:134402697}" Mar 10 02:00:54.243642 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-02371d463f0e325cb63b0658a3b560573507d6bf43d0f4a291eac7ce7ca5511c-rootfs.mount: Deactivated successfully. Mar 10 02:00:55.222202 kubelet[2820]: E0310 02:00:55.221577 2820 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-9jkrd" podUID="17109937-e605-4060-a474-8701d718389b" Mar 10 02:00:55.291152 containerd[1555]: time="2026-03-10T02:00:55.290687098Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.31.4\"" Mar 10 02:00:57.227599 kubelet[2820]: E0310 02:00:57.222347 2820 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-9jkrd" podUID="17109937-e605-4060-a474-8701d718389b" Mar 10 02:00:59.223299 kubelet[2820]: E0310 02:00:59.223074 2820 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-9jkrd" podUID="17109937-e605-4060-a474-8701d718389b" Mar 10 02:01:01.222241 kubelet[2820]: E0310 02:01:01.222036 2820 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-9jkrd" podUID="17109937-e605-4060-a474-8701d718389b" Mar 10 02:01:03.226256 kubelet[2820]: E0310 02:01:03.225223 2820 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-9jkrd" podUID="17109937-e605-4060-a474-8701d718389b" Mar 10 02:01:05.224037 kubelet[2820]: E0310 02:01:05.223784 2820 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-9jkrd" podUID="17109937-e605-4060-a474-8701d718389b" Mar 10 02:01:05.411538 containerd[1555]: time="2026-03-10T02:01:05.411271457Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 10 02:01:05.424210 containerd[1555]: time="2026-03-10T02:01:05.424161779Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.31.4: active requests=0, bytes read=70611671" Mar 10 02:01:05.427126 containerd[1555]: time="2026-03-10T02:01:05.427082509Z" level=info msg="ImageCreate event name:\"sha256:c433a27dd94ce9242338eece49f11629412dd42552fed314746fcf16ea958b2b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 10 02:01:05.435517 containerd[1555]: time="2026-03-10T02:01:05.435347491Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:f1c5d9a6df01061c5faec4c4b59fb9ba69f8f5164b51e01ea8daa8e373111a04\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 10 02:01:05.440119 containerd[1555]: time="2026-03-10T02:01:05.439188210Z" level=info 
msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.31.4\" with image id \"sha256:c433a27dd94ce9242338eece49f11629412dd42552fed314746fcf16ea958b2b\", repo tag \"ghcr.io/flatcar/calico/cni:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:f1c5d9a6df01061c5faec4c4b59fb9ba69f8f5164b51e01ea8daa8e373111a04\", size \"72167716\" in 10.148451197s" Mar 10 02:01:05.440119 containerd[1555]: time="2026-03-10T02:01:05.439251041Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.31.4\" returns image reference \"sha256:c433a27dd94ce9242338eece49f11629412dd42552fed314746fcf16ea958b2b\"" Mar 10 02:01:05.473432 containerd[1555]: time="2026-03-10T02:01:05.472578275Z" level=info msg="CreateContainer within sandbox \"6a9e78f22163f0e7eae3dafc40ae931736c47abe5b2185e30333f9eac360b9f5\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Mar 10 02:01:05.540060 containerd[1555]: time="2026-03-10T02:01:05.534329222Z" level=info msg="Container 84a97f148dfba73b5dc7614b6683cc7c1b2ccbb8faf561cd6711c4ee7de78e1f: CDI devices from CRI Config.CDIDevices: []" Mar 10 02:01:05.585568 containerd[1555]: time="2026-03-10T02:01:05.580788140Z" level=info msg="CreateContainer within sandbox \"6a9e78f22163f0e7eae3dafc40ae931736c47abe5b2185e30333f9eac360b9f5\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"84a97f148dfba73b5dc7614b6683cc7c1b2ccbb8faf561cd6711c4ee7de78e1f\"" Mar 10 02:01:05.585568 containerd[1555]: time="2026-03-10T02:01:05.581721359Z" level=info msg="StartContainer for \"84a97f148dfba73b5dc7614b6683cc7c1b2ccbb8faf561cd6711c4ee7de78e1f\"" Mar 10 02:01:05.590946 containerd[1555]: time="2026-03-10T02:01:05.590863994Z" level=info msg="connecting to shim 84a97f148dfba73b5dc7614b6683cc7c1b2ccbb8faf561cd6711c4ee7de78e1f" address="unix:///run/containerd/s/c50a6d0640db43244fb30d17e478f945a7722d1644a1e93a0435271b137642bf" protocol=ttrpc version=3 Mar 10 02:01:05.664118 systemd[1]: Started 
cri-containerd-84a97f148dfba73b5dc7614b6683cc7c1b2ccbb8faf561cd6711c4ee7de78e1f.scope - libcontainer container 84a97f148dfba73b5dc7614b6683cc7c1b2ccbb8faf561cd6711c4ee7de78e1f. Mar 10 02:01:06.083317 containerd[1555]: time="2026-03-10T02:01:06.083261226Z" level=info msg="StartContainer for \"84a97f148dfba73b5dc7614b6683cc7c1b2ccbb8faf561cd6711c4ee7de78e1f\" returns successfully" Mar 10 02:01:07.223753 kubelet[2820]: E0310 02:01:07.223393 2820 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-9jkrd" podUID="17109937-e605-4060-a474-8701d718389b" Mar 10 02:01:07.950247 systemd[1]: cri-containerd-84a97f148dfba73b5dc7614b6683cc7c1b2ccbb8faf561cd6711c4ee7de78e1f.scope: Deactivated successfully. Mar 10 02:01:07.951121 systemd[1]: cri-containerd-84a97f148dfba73b5dc7614b6683cc7c1b2ccbb8faf561cd6711c4ee7de78e1f.scope: Consumed 1.138s CPU time, 186.6M memory peak, 852K read from disk, 177M written to disk. Mar 10 02:01:07.957114 containerd[1555]: time="2026-03-10T02:01:07.954300374Z" level=info msg="received container exit event container_id:\"84a97f148dfba73b5dc7614b6683cc7c1b2ccbb8faf561cd6711c4ee7de78e1f\" id:\"84a97f148dfba73b5dc7614b6683cc7c1b2ccbb8faf561cd6711c4ee7de78e1f\" pid:3667 exited_at:{seconds:1773108067 nanos:953881847}" Mar 10 02:01:07.989171 kubelet[2820]: I0310 02:01:07.989138 2820 kubelet_node_status.go:439] "Fast updating node status as it just became ready" Mar 10 02:01:08.064104 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-84a97f148dfba73b5dc7614b6683cc7c1b2ccbb8faf561cd6711c4ee7de78e1f-rootfs.mount: Deactivated successfully. 
Mar 10 02:01:08.443049 systemd[1]: Created slice kubepods-besteffort-pod74613752_983e_41dc_b1b0_4ff0b2a9333f.slice - libcontainer container kubepods-besteffort-pod74613752_983e_41dc_b1b0_4ff0b2a9333f.slice. Mar 10 02:01:08.489245 systemd[1]: Created slice kubepods-burstable-podff236307_737f_4769_9a73_e0e60d4f60ef.slice - libcontainer container kubepods-burstable-podff236307_737f_4769_9a73_e0e60d4f60ef.slice. Mar 10 02:01:08.514933 systemd[1]: Created slice kubepods-besteffort-podc169f044_feb9_4347_930b_a968995715e2.slice - libcontainer container kubepods-besteffort-podc169f044_feb9_4347_930b_a968995715e2.slice. Mar 10 02:01:08.529677 systemd[1]: Created slice kubepods-burstable-podf5a2970b_ee52_4628_9153_e736ecce6760.slice - libcontainer container kubepods-burstable-podf5a2970b_ee52_4628_9153_e736ecce6760.slice. Mar 10 02:01:08.568650 containerd[1555]: time="2026-03-10T02:01:08.566916241Z" level=info msg="CreateContainer within sandbox \"6a9e78f22163f0e7eae3dafc40ae931736c47abe5b2185e30333f9eac360b9f5\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Mar 10 02:01:08.572099 systemd[1]: Created slice kubepods-besteffort-poda0d4a605_efd0_4424_86bb_3dd6bae830b7.slice - libcontainer container kubepods-besteffort-poda0d4a605_efd0_4424_86bb_3dd6bae830b7.slice. 
Mar 10 02:01:08.601774 kubelet[2820]: I0310 02:01:08.601599 2820 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/072ce4e7-1021-481e-a47a-99c585261923-calico-apiserver-certs\") pod \"calico-apiserver-76dfd768bc-pv7bx\" (UID: \"072ce4e7-1021-481e-a47a-99c585261923\") " pod="calico-system/calico-apiserver-76dfd768bc-pv7bx" Mar 10 02:01:08.601774 kubelet[2820]: I0310 02:01:08.601696 2820 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mf58l\" (UniqueName: \"kubernetes.io/projected/072ce4e7-1021-481e-a47a-99c585261923-kube-api-access-mf58l\") pod \"calico-apiserver-76dfd768bc-pv7bx\" (UID: \"072ce4e7-1021-481e-a47a-99c585261923\") " pod="calico-system/calico-apiserver-76dfd768bc-pv7bx" Mar 10 02:01:08.601774 kubelet[2820]: I0310 02:01:08.601726 2820 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfcwc\" (UniqueName: \"kubernetes.io/projected/74613752-983e-41dc-b1b0-4ff0b2a9333f-kube-api-access-qfcwc\") pod \"calico-kube-controllers-d6c6bb9bb-bm4js\" (UID: \"74613752-983e-41dc-b1b0-4ff0b2a9333f\") " pod="calico-system/calico-kube-controllers-d6c6bb9bb-bm4js" Mar 10 02:01:08.601774 kubelet[2820]: I0310 02:01:08.601755 2820 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c169f044-feb9-4347-930b-a968995715e2-config\") pod \"goldmane-cccfbd5cf-kt2sm\" (UID: \"c169f044-feb9-4347-930b-a968995715e2\") " pod="calico-system/goldmane-cccfbd5cf-kt2sm" Mar 10 02:01:08.602425 kubelet[2820]: I0310 02:01:08.601790 2820 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c169f044-feb9-4347-930b-a968995715e2-goldmane-ca-bundle\") pod 
\"goldmane-cccfbd5cf-kt2sm\" (UID: \"c169f044-feb9-4347-930b-a968995715e2\") " pod="calico-system/goldmane-cccfbd5cf-kt2sm" Mar 10 02:01:08.602425 kubelet[2820]: I0310 02:01:08.601824 2820 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9glbc\" (UniqueName: \"kubernetes.io/projected/a0d4a605-efd0-4424-86bb-3dd6bae830b7-kube-api-access-9glbc\") pod \"calico-apiserver-76dfd768bc-9l6p8\" (UID: \"a0d4a605-efd0-4424-86bb-3dd6bae830b7\") " pod="calico-system/calico-apiserver-76dfd768bc-9l6p8" Mar 10 02:01:08.602425 kubelet[2820]: I0310 02:01:08.601852 2820 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrvvg\" (UniqueName: \"kubernetes.io/projected/f5a2970b-ee52-4628-9153-e736ecce6760-kube-api-access-nrvvg\") pod \"coredns-66bc5c9577-9wb7f\" (UID: \"f5a2970b-ee52-4628-9153-e736ecce6760\") " pod="kube-system/coredns-66bc5c9577-9wb7f" Mar 10 02:01:08.602425 kubelet[2820]: I0310 02:01:08.602015 2820 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/37a20b0e-e8bd-4481-9ea0-d4c45cd1e1ed-whisker-backend-key-pair\") pod \"whisker-6854c5d7df-97zrk\" (UID: \"37a20b0e-e8bd-4481-9ea0-d4c45cd1e1ed\") " pod="calico-system/whisker-6854c5d7df-97zrk" Mar 10 02:01:08.602425 kubelet[2820]: I0310 02:01:08.602042 2820 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvvkj\" (UniqueName: \"kubernetes.io/projected/ff236307-737f-4769-9a73-e0e60d4f60ef-kube-api-access-qvvkj\") pod \"coredns-66bc5c9577-w7tcg\" (UID: \"ff236307-737f-4769-9a73-e0e60d4f60ef\") " pod="kube-system/coredns-66bc5c9577-w7tcg" Mar 10 02:01:08.602749 kubelet[2820]: I0310 02:01:08.602088 2820 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xngx5\" (UniqueName: 
\"kubernetes.io/projected/c169f044-feb9-4347-930b-a968995715e2-kube-api-access-xngx5\") pod \"goldmane-cccfbd5cf-kt2sm\" (UID: \"c169f044-feb9-4347-930b-a968995715e2\") " pod="calico-system/goldmane-cccfbd5cf-kt2sm" Mar 10 02:01:08.602749 kubelet[2820]: I0310 02:01:08.602128 2820 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/a0d4a605-efd0-4424-86bb-3dd6bae830b7-calico-apiserver-certs\") pod \"calico-apiserver-76dfd768bc-9l6p8\" (UID: \"a0d4a605-efd0-4424-86bb-3dd6bae830b7\") " pod="calico-system/calico-apiserver-76dfd768bc-9l6p8" Mar 10 02:01:08.602749 kubelet[2820]: I0310 02:01:08.602160 2820 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/37a20b0e-e8bd-4481-9ea0-d4c45cd1e1ed-whisker-ca-bundle\") pod \"whisker-6854c5d7df-97zrk\" (UID: \"37a20b0e-e8bd-4481-9ea0-d4c45cd1e1ed\") " pod="calico-system/whisker-6854c5d7df-97zrk" Mar 10 02:01:08.606683 kubelet[2820]: I0310 02:01:08.606623 2820 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/c169f044-feb9-4347-930b-a968995715e2-goldmane-key-pair\") pod \"goldmane-cccfbd5cf-kt2sm\" (UID: \"c169f044-feb9-4347-930b-a968995715e2\") " pod="calico-system/goldmane-cccfbd5cf-kt2sm" Mar 10 02:01:08.606752 kubelet[2820]: I0310 02:01:08.606709 2820 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f5a2970b-ee52-4628-9153-e736ecce6760-config-volume\") pod \"coredns-66bc5c9577-9wb7f\" (UID: \"f5a2970b-ee52-4628-9153-e736ecce6760\") " pod="kube-system/coredns-66bc5c9577-9wb7f" Mar 10 02:01:08.606787 kubelet[2820]: I0310 02:01:08.606755 2820 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/74613752-983e-41dc-b1b0-4ff0b2a9333f-tigera-ca-bundle\") pod \"calico-kube-controllers-d6c6bb9bb-bm4js\" (UID: \"74613752-983e-41dc-b1b0-4ff0b2a9333f\") " pod="calico-system/calico-kube-controllers-d6c6bb9bb-bm4js" Mar 10 02:01:08.606813 kubelet[2820]: I0310 02:01:08.606786 2820 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/37a20b0e-e8bd-4481-9ea0-d4c45cd1e1ed-nginx-config\") pod \"whisker-6854c5d7df-97zrk\" (UID: \"37a20b0e-e8bd-4481-9ea0-d4c45cd1e1ed\") " pod="calico-system/whisker-6854c5d7df-97zrk" Mar 10 02:01:08.606835 kubelet[2820]: I0310 02:01:08.606818 2820 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zzcd\" (UniqueName: \"kubernetes.io/projected/37a20b0e-e8bd-4481-9ea0-d4c45cd1e1ed-kube-api-access-4zzcd\") pod \"whisker-6854c5d7df-97zrk\" (UID: \"37a20b0e-e8bd-4481-9ea0-d4c45cd1e1ed\") " pod="calico-system/whisker-6854c5d7df-97zrk" Mar 10 02:01:08.606863 kubelet[2820]: I0310 02:01:08.606840 2820 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ff236307-737f-4769-9a73-e0e60d4f60ef-config-volume\") pod \"coredns-66bc5c9577-w7tcg\" (UID: \"ff236307-737f-4769-9a73-e0e60d4f60ef\") " pod="kube-system/coredns-66bc5c9577-w7tcg" Mar 10 02:01:08.618859 systemd[1]: Created slice kubepods-besteffort-pod37a20b0e_e8bd_4481_9ea0_d4c45cd1e1ed.slice - libcontainer container kubepods-besteffort-pod37a20b0e_e8bd_4481_9ea0_d4c45cd1e1ed.slice. Mar 10 02:01:08.640880 systemd[1]: Created slice kubepods-besteffort-pod072ce4e7_1021_481e_a47a_99c585261923.slice - libcontainer container kubepods-besteffort-pod072ce4e7_1021_481e_a47a_99c585261923.slice. 
Mar 10 02:01:08.656681 containerd[1555]: time="2026-03-10T02:01:08.656438625Z" level=info msg="Container a0180ea5349f8eb1c4fa0cc3ace94881aa244a4761202918337f0aa48c8bd28e: CDI devices from CRI Config.CDIDevices: []" Mar 10 02:01:08.662935 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount20967798.mount: Deactivated successfully. Mar 10 02:01:08.706606 containerd[1555]: time="2026-03-10T02:01:08.700302233Z" level=info msg="CreateContainer within sandbox \"6a9e78f22163f0e7eae3dafc40ae931736c47abe5b2185e30333f9eac360b9f5\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"a0180ea5349f8eb1c4fa0cc3ace94881aa244a4761202918337f0aa48c8bd28e\"" Mar 10 02:01:08.706606 containerd[1555]: time="2026-03-10T02:01:08.705085535Z" level=info msg="StartContainer for \"a0180ea5349f8eb1c4fa0cc3ace94881aa244a4761202918337f0aa48c8bd28e\"" Mar 10 02:01:08.721960 containerd[1555]: time="2026-03-10T02:01:08.721914876Z" level=info msg="connecting to shim a0180ea5349f8eb1c4fa0cc3ace94881aa244a4761202918337f0aa48c8bd28e" address="unix:///run/containerd/s/c50a6d0640db43244fb30d17e478f945a7722d1644a1e93a0435271b137642bf" protocol=ttrpc version=3 Mar 10 02:01:08.793752 systemd[1]: Started cri-containerd-a0180ea5349f8eb1c4fa0cc3ace94881aa244a4761202918337f0aa48c8bd28e.scope - libcontainer container a0180ea5349f8eb1c4fa0cc3ace94881aa244a4761202918337f0aa48c8bd28e. 
Mar 10 02:01:08.830619 containerd[1555]: time="2026-03-10T02:01:08.830311239Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-cccfbd5cf-kt2sm,Uid:c169f044-feb9-4347-930b-a968995715e2,Namespace:calico-system,Attempt:0,}" Mar 10 02:01:08.852223 containerd[1555]: time="2026-03-10T02:01:08.851899582Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-9wb7f,Uid:f5a2970b-ee52-4628-9153-e736ecce6760,Namespace:kube-system,Attempt:0,}" Mar 10 02:01:08.923419 containerd[1555]: time="2026-03-10T02:01:08.923212403Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-76dfd768bc-9l6p8,Uid:a0d4a605-efd0-4424-86bb-3dd6bae830b7,Namespace:calico-system,Attempt:0,}" Mar 10 02:01:08.939406 containerd[1555]: time="2026-03-10T02:01:08.939138067Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-6854c5d7df-97zrk,Uid:37a20b0e-e8bd-4481-9ea0-d4c45cd1e1ed,Namespace:calico-system,Attempt:0,}" Mar 10 02:01:09.009322 containerd[1555]: time="2026-03-10T02:01:09.009052924Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-76dfd768bc-pv7bx,Uid:072ce4e7-1021-481e-a47a-99c585261923,Namespace:calico-system,Attempt:0,}" Mar 10 02:01:09.076032 containerd[1555]: time="2026-03-10T02:01:09.075600103Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-d6c6bb9bb-bm4js,Uid:74613752-983e-41dc-b1b0-4ff0b2a9333f,Namespace:calico-system,Attempt:0,}" Mar 10 02:01:09.107600 containerd[1555]: time="2026-03-10T02:01:09.107551763Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-w7tcg,Uid:ff236307-737f-4769-9a73-e0e60d4f60ef,Namespace:kube-system,Attempt:0,}" Mar 10 02:01:09.235098 systemd[1]: Created slice kubepods-besteffort-pod17109937_e605_4060_a474_8701d718389b.slice - libcontainer container kubepods-besteffort-pod17109937_e605_4060_a474_8701d718389b.slice. 
Mar 10 02:01:09.272062 containerd[1555]: time="2026-03-10T02:01:09.270652401Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-9jkrd,Uid:17109937-e605-4060-a474-8701d718389b,Namespace:calico-system,Attempt:0,}" Mar 10 02:01:09.379082 containerd[1555]: time="2026-03-10T02:01:09.378958526Z" level=info msg="StartContainer for \"a0180ea5349f8eb1c4fa0cc3ace94881aa244a4761202918337f0aa48c8bd28e\" returns successfully" Mar 10 02:01:09.405634 containerd[1555]: time="2026-03-10T02:01:09.405576834Z" level=error msg="Failed to destroy network for sandbox \"f903e55d3b4a3f5717c0db7849bd63f9255e557b31ba9b20c187c90db1802a1b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 10 02:01:09.417370 systemd[1]: run-netns-cni\x2d74d866a8\x2d4bce\x2d6443\x2d2c51\x2da98f99f7dd02.mount: Deactivated successfully. Mar 10 02:01:09.419692 containerd[1555]: time="2026-03-10T02:01:09.419550271Z" level=error msg="Failed to destroy network for sandbox \"415fc9ff472d3ffbbc3dff03ac79eeff00603594799ec8d0e1f8fe21b3a776f1\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 10 02:01:09.424556 containerd[1555]: time="2026-03-10T02:01:09.424361752Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-6854c5d7df-97zrk,Uid:37a20b0e-e8bd-4481-9ea0-d4c45cd1e1ed,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"f903e55d3b4a3f5717c0db7849bd63f9255e557b31ba9b20c187c90db1802a1b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 10 02:01:09.441993 containerd[1555]: 
time="2026-03-10T02:01:09.441934205Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-76dfd768bc-9l6p8,Uid:a0d4a605-efd0-4424-86bb-3dd6bae830b7,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"415fc9ff472d3ffbbc3dff03ac79eeff00603594799ec8d0e1f8fe21b3a776f1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 10 02:01:09.487030 containerd[1555]: time="2026-03-10T02:01:09.483537777Z" level=error msg="Failed to destroy network for sandbox \"4da13af900a1cf1e006e407af33cba8debe083fb4ef3f56faab7dfc6b8db64e7\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 10 02:01:09.499628 containerd[1555]: time="2026-03-10T02:01:09.496350950Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-d6c6bb9bb-bm4js,Uid:74613752-983e-41dc-b1b0-4ff0b2a9333f,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"4da13af900a1cf1e006e407af33cba8debe083fb4ef3f56faab7dfc6b8db64e7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 10 02:01:09.519692 containerd[1555]: time="2026-03-10T02:01:09.516953750Z" level=error msg="Failed to destroy network for sandbox \"f05f922e3e04b9a82961177cbb43426e31c15adbb7d86206c942b9a0e3a0ec79\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 10 02:01:09.526504 containerd[1555]: 
time="2026-03-10T02:01:09.523817491Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-76dfd768bc-pv7bx,Uid:072ce4e7-1021-481e-a47a-99c585261923,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"f05f922e3e04b9a82961177cbb43426e31c15adbb7d86206c942b9a0e3a0ec79\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 10 02:01:09.528694 kubelet[2820]: E0310 02:01:09.527880 2820 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f05f922e3e04b9a82961177cbb43426e31c15adbb7d86206c942b9a0e3a0ec79\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 10 02:01:09.528694 kubelet[2820]: E0310 02:01:09.528018 2820 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f05f922e3e04b9a82961177cbb43426e31c15adbb7d86206c942b9a0e3a0ec79\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-76dfd768bc-pv7bx" Mar 10 02:01:09.528694 kubelet[2820]: E0310 02:01:09.528047 2820 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f05f922e3e04b9a82961177cbb43426e31c15adbb7d86206c942b9a0e3a0ec79\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-76dfd768bc-pv7bx" Mar 10 
02:01:09.529372 kubelet[2820]: E0310 02:01:09.528115 2820 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-76dfd768bc-pv7bx_calico-system(072ce4e7-1021-481e-a47a-99c585261923)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-76dfd768bc-pv7bx_calico-system(072ce4e7-1021-481e-a47a-99c585261923)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"f05f922e3e04b9a82961177cbb43426e31c15adbb7d86206c942b9a0e3a0ec79\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-apiserver-76dfd768bc-pv7bx" podUID="072ce4e7-1021-481e-a47a-99c585261923" Mar 10 02:01:09.533894 kubelet[2820]: E0310 02:01:09.533675 2820 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f903e55d3b4a3f5717c0db7849bd63f9255e557b31ba9b20c187c90db1802a1b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 10 02:01:09.533894 kubelet[2820]: E0310 02:01:09.533769 2820 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f903e55d3b4a3f5717c0db7849bd63f9255e557b31ba9b20c187c90db1802a1b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-6854c5d7df-97zrk" Mar 10 02:01:09.533894 kubelet[2820]: E0310 02:01:09.533793 2820 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"f903e55d3b4a3f5717c0db7849bd63f9255e557b31ba9b20c187c90db1802a1b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-6854c5d7df-97zrk" Mar 10 02:01:09.534564 kubelet[2820]: E0310 02:01:09.533853 2820 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-6854c5d7df-97zrk_calico-system(37a20b0e-e8bd-4481-9ea0-d4c45cd1e1ed)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-6854c5d7df-97zrk_calico-system(37a20b0e-e8bd-4481-9ea0-d4c45cd1e1ed)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"f903e55d3b4a3f5717c0db7849bd63f9255e557b31ba9b20c187c90db1802a1b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-6854c5d7df-97zrk" podUID="37a20b0e-e8bd-4481-9ea0-d4c45cd1e1ed" Mar 10 02:01:09.534564 kubelet[2820]: E0310 02:01:09.533922 2820 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"415fc9ff472d3ffbbc3dff03ac79eeff00603594799ec8d0e1f8fe21b3a776f1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 10 02:01:09.534564 kubelet[2820]: E0310 02:01:09.533948 2820 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"415fc9ff472d3ffbbc3dff03ac79eeff00603594799ec8d0e1f8fe21b3a776f1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="calico-system/calico-apiserver-76dfd768bc-9l6p8" Mar 10 02:01:09.537667 kubelet[2820]: E0310 02:01:09.533967 2820 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"415fc9ff472d3ffbbc3dff03ac79eeff00603594799ec8d0e1f8fe21b3a776f1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-76dfd768bc-9l6p8" Mar 10 02:01:09.537667 kubelet[2820]: E0310 02:01:09.534009 2820 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-76dfd768bc-9l6p8_calico-system(a0d4a605-efd0-4424-86bb-3dd6bae830b7)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-76dfd768bc-9l6p8_calico-system(a0d4a605-efd0-4424-86bb-3dd6bae830b7)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"415fc9ff472d3ffbbc3dff03ac79eeff00603594799ec8d0e1f8fe21b3a776f1\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-apiserver-76dfd768bc-9l6p8" podUID="a0d4a605-efd0-4424-86bb-3dd6bae830b7" Mar 10 02:01:09.537667 kubelet[2820]: E0310 02:01:09.534083 2820 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4da13af900a1cf1e006e407af33cba8debe083fb4ef3f56faab7dfc6b8db64e7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 10 02:01:09.537861 kubelet[2820]: E0310 02:01:09.534113 2820 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed 
to setup network for sandbox \"4da13af900a1cf1e006e407af33cba8debe083fb4ef3f56faab7dfc6b8db64e7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-d6c6bb9bb-bm4js" Mar 10 02:01:09.537861 kubelet[2820]: E0310 02:01:09.534130 2820 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4da13af900a1cf1e006e407af33cba8debe083fb4ef3f56faab7dfc6b8db64e7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-d6c6bb9bb-bm4js" Mar 10 02:01:09.537861 kubelet[2820]: E0310 02:01:09.534164 2820 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-d6c6bb9bb-bm4js_calico-system(74613752-983e-41dc-b1b0-4ff0b2a9333f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-d6c6bb9bb-bm4js_calico-system(74613752-983e-41dc-b1b0-4ff0b2a9333f)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"4da13af900a1cf1e006e407af33cba8debe083fb4ef3f56faab7dfc6b8db64e7\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-d6c6bb9bb-bm4js" podUID="74613752-983e-41dc-b1b0-4ff0b2a9333f" Mar 10 02:01:09.574629 containerd[1555]: time="2026-03-10T02:01:09.572855598Z" level=error msg="Failed to destroy network for sandbox \"b59afb6a0a2af5dafb87b6949f0f04a754ce44d0b8c4546bf5497e97b3c0336a\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or 
directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 10 02:01:09.597271 containerd[1555]: time="2026-03-10T02:01:09.595433634Z" level=error msg="Failed to destroy network for sandbox \"7fdebc4ebe88799f3d03c81209e991e8b501beca96123c48ef9e09b8839e8bce\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 10 02:01:09.623561 containerd[1555]: time="2026-03-10T02:01:09.622763543Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-cccfbd5cf-kt2sm,Uid:c169f044-feb9-4347-930b-a968995715e2,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"b59afb6a0a2af5dafb87b6949f0f04a754ce44d0b8c4546bf5497e97b3c0336a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 10 02:01:09.625928 kubelet[2820]: E0310 02:01:09.625688 2820 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b59afb6a0a2af5dafb87b6949f0f04a754ce44d0b8c4546bf5497e97b3c0336a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 10 02:01:09.625928 kubelet[2820]: E0310 02:01:09.625758 2820 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b59afb6a0a2af5dafb87b6949f0f04a754ce44d0b8c4546bf5497e97b3c0336a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-cccfbd5cf-kt2sm" 
Mar 10 02:01:09.625928 kubelet[2820]: E0310 02:01:09.625783 2820 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b59afb6a0a2af5dafb87b6949f0f04a754ce44d0b8c4546bf5497e97b3c0336a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-cccfbd5cf-kt2sm" Mar 10 02:01:09.626645 kubelet[2820]: E0310 02:01:09.625839 2820 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-cccfbd5cf-kt2sm_calico-system(c169f044-feb9-4347-930b-a968995715e2)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-cccfbd5cf-kt2sm_calico-system(c169f044-feb9-4347-930b-a968995715e2)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"b59afb6a0a2af5dafb87b6949f0f04a754ce44d0b8c4546bf5497e97b3c0336a\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-cccfbd5cf-kt2sm" podUID="c169f044-feb9-4347-930b-a968995715e2" Mar 10 02:01:09.635168 containerd[1555]: time="2026-03-10T02:01:09.634908498Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-w7tcg,Uid:ff236307-737f-4769-9a73-e0e60d4f60ef,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"7fdebc4ebe88799f3d03c81209e991e8b501beca96123c48ef9e09b8839e8bce\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 10 02:01:09.635571 kubelet[2820]: E0310 02:01:09.635173 2820 log.go:32] "RunPodSandbox from runtime service failed" 
err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7fdebc4ebe88799f3d03c81209e991e8b501beca96123c48ef9e09b8839e8bce\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 10 02:01:09.635571 kubelet[2820]: E0310 02:01:09.635230 2820 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7fdebc4ebe88799f3d03c81209e991e8b501beca96123c48ef9e09b8839e8bce\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-w7tcg" Mar 10 02:01:09.635571 kubelet[2820]: E0310 02:01:09.635261 2820 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7fdebc4ebe88799f3d03c81209e991e8b501beca96123c48ef9e09b8839e8bce\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-w7tcg" Mar 10 02:01:09.635737 kubelet[2820]: E0310 02:01:09.635414 2820 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-66bc5c9577-w7tcg_kube-system(ff236307-737f-4769-9a73-e0e60d4f60ef)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-66bc5c9577-w7tcg_kube-system(ff236307-737f-4769-9a73-e0e60d4f60ef)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"7fdebc4ebe88799f3d03c81209e991e8b501beca96123c48ef9e09b8839e8bce\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/\"" pod="kube-system/coredns-66bc5c9577-w7tcg" podUID="ff236307-737f-4769-9a73-e0e60d4f60ef" Mar 10 02:01:09.638036 containerd[1555]: time="2026-03-10T02:01:09.637695504Z" level=error msg="Failed to destroy network for sandbox \"3cb14bd8c8f422d343e7d25a1cbda329e90e1ee1d48ea2d4822396589abded15\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 10 02:01:09.658675 containerd[1555]: time="2026-03-10T02:01:09.658587384Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-9wb7f,Uid:f5a2970b-ee52-4628-9153-e736ecce6760,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"3cb14bd8c8f422d343e7d25a1cbda329e90e1ee1d48ea2d4822396589abded15\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 10 02:01:09.660004 kubelet[2820]: E0310 02:01:09.659704 2820 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3cb14bd8c8f422d343e7d25a1cbda329e90e1ee1d48ea2d4822396589abded15\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 10 02:01:09.660004 kubelet[2820]: E0310 02:01:09.659788 2820 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3cb14bd8c8f422d343e7d25a1cbda329e90e1ee1d48ea2d4822396589abded15\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="kube-system/coredns-66bc5c9577-9wb7f" Mar 10 02:01:09.660004 kubelet[2820]: E0310 02:01:09.659820 2820 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3cb14bd8c8f422d343e7d25a1cbda329e90e1ee1d48ea2d4822396589abded15\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-9wb7f" Mar 10 02:01:09.661001 kubelet[2820]: E0310 02:01:09.659876 2820 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-66bc5c9577-9wb7f_kube-system(f5a2970b-ee52-4628-9153-e736ecce6760)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-66bc5c9577-9wb7f_kube-system(f5a2970b-ee52-4628-9153-e736ecce6760)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"3cb14bd8c8f422d343e7d25a1cbda329e90e1ee1d48ea2d4822396589abded15\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-66bc5c9577-9wb7f" podUID="f5a2970b-ee52-4628-9153-e736ecce6760" Mar 10 02:01:09.785568 kubelet[2820]: I0310 02:01:09.783980 2820 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-7kz5z" podStartSLOduration=5.63184915 podStartE2EDuration="52.783962089s" podCreationTimestamp="2026-03-10 02:00:17 +0000 UTC" firstStartedPulling="2026-03-10 02:00:18.290679784 +0000 UTC m=+54.729293685" lastFinishedPulling="2026-03-10 02:01:05.442792723 +0000 UTC m=+101.881406624" observedRunningTime="2026-03-10 02:01:09.782980728 +0000 UTC m=+106.221594649" watchObservedRunningTime="2026-03-10 02:01:09.783962089 +0000 UTC m=+106.222576010" Mar 10 02:01:09.815556 containerd[1555]: 
time="2026-03-10T02:01:09.815337774Z" level=error msg="Failed to destroy network for sandbox \"f8a5878a397a2f1116dad2512da081a0ad02d29a31bd4560c7cc132453e90181\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 10 02:01:09.822667 containerd[1555]: time="2026-03-10T02:01:09.822634709Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-9jkrd,Uid:17109937-e605-4060-a474-8701d718389b,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"f8a5878a397a2f1116dad2512da081a0ad02d29a31bd4560c7cc132453e90181\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 10 02:01:09.823825 kubelet[2820]: E0310 02:01:09.823231 2820 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f8a5878a397a2f1116dad2512da081a0ad02d29a31bd4560c7cc132453e90181\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 10 02:01:09.823825 kubelet[2820]: E0310 02:01:09.823297 2820 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f8a5878a397a2f1116dad2512da081a0ad02d29a31bd4560c7cc132453e90181\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-9jkrd" Mar 10 02:01:09.823825 kubelet[2820]: E0310 02:01:09.823324 2820 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc 
error: code = Unknown desc = failed to setup network for sandbox \"f8a5878a397a2f1116dad2512da081a0ad02d29a31bd4560c7cc132453e90181\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-9jkrd" Mar 10 02:01:09.823973 kubelet[2820]: E0310 02:01:09.823775 2820 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-9jkrd_calico-system(17109937-e605-4060-a474-8701d718389b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-9jkrd_calico-system(17109937-e605-4060-a474-8701d718389b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"f8a5878a397a2f1116dad2512da081a0ad02d29a31bd4560c7cc132453e90181\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-9jkrd" podUID="17109937-e605-4060-a474-8701d718389b" Mar 10 02:01:10.062566 systemd[1]: run-netns-cni\x2d94338b2e\x2d70d9\x2dfb87\x2d4d4a\x2d63541d990abc.mount: Deactivated successfully. Mar 10 02:01:10.062731 systemd[1]: run-netns-cni\x2dff746420\x2d3b24\x2d05a8\x2d07b3\x2dcab519ff88fc.mount: Deactivated successfully. Mar 10 02:01:10.062849 systemd[1]: run-netns-cni\x2db648acd7\x2d89db\x2d1707\x2dc644\x2d7c52b660b65a.mount: Deactivated successfully. Mar 10 02:01:10.062959 systemd[1]: run-netns-cni\x2d76de2ef0\x2d561c\x2d0b85\x2d0393\x2d1fcc61d739d0.mount: Deactivated successfully. Mar 10 02:01:10.063049 systemd[1]: run-netns-cni\x2d7782361c\x2d9f8c\x2d49ed\x2dcfeb\x2d76ccb3f37ba6.mount: Deactivated successfully. Mar 10 02:01:10.063119 systemd[1]: run-netns-cni\x2d5e09d18c\x2d5599\x2d9f15\x2d44b1\x2d71cac2133bfd.mount: Deactivated successfully. 
Mar 10 02:01:10.063180 systemd[1]: run-netns-cni\x2de6c21979\x2de332\x2d40c5\x2d5b6d\x2d84730bb8965a.mount: Deactivated successfully. Mar 10 02:01:11.301932 kubelet[2820]: I0310 02:01:11.298768 2820 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/37a20b0e-e8bd-4481-9ea0-d4c45cd1e1ed-nginx-config\") pod \"37a20b0e-e8bd-4481-9ea0-d4c45cd1e1ed\" (UID: \"37a20b0e-e8bd-4481-9ea0-d4c45cd1e1ed\") " Mar 10 02:01:11.301932 kubelet[2820]: I0310 02:01:11.298870 2820 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4zzcd\" (UniqueName: \"kubernetes.io/projected/37a20b0e-e8bd-4481-9ea0-d4c45cd1e1ed-kube-api-access-4zzcd\") pod \"37a20b0e-e8bd-4481-9ea0-d4c45cd1e1ed\" (UID: \"37a20b0e-e8bd-4481-9ea0-d4c45cd1e1ed\") " Mar 10 02:01:11.301932 kubelet[2820]: I0310 02:01:11.298918 2820 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/37a20b0e-e8bd-4481-9ea0-d4c45cd1e1ed-whisker-backend-key-pair\") pod \"37a20b0e-e8bd-4481-9ea0-d4c45cd1e1ed\" (UID: \"37a20b0e-e8bd-4481-9ea0-d4c45cd1e1ed\") " Mar 10 02:01:11.301932 kubelet[2820]: I0310 02:01:11.298947 2820 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/37a20b0e-e8bd-4481-9ea0-d4c45cd1e1ed-whisker-ca-bundle\") pod \"37a20b0e-e8bd-4481-9ea0-d4c45cd1e1ed\" (UID: \"37a20b0e-e8bd-4481-9ea0-d4c45cd1e1ed\") " Mar 10 02:01:11.301932 kubelet[2820]: I0310 02:01:11.299785 2820 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37a20b0e-e8bd-4481-9ea0-d4c45cd1e1ed-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "37a20b0e-e8bd-4481-9ea0-d4c45cd1e1ed" (UID: "37a20b0e-e8bd-4481-9ea0-d4c45cd1e1ed"). InnerVolumeSpecName "whisker-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 10 02:01:11.302877 kubelet[2820]: I0310 02:01:11.300164 2820 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37a20b0e-e8bd-4481-9ea0-d4c45cd1e1ed-nginx-config" (OuterVolumeSpecName: "nginx-config") pod "37a20b0e-e8bd-4481-9ea0-d4c45cd1e1ed" (UID: "37a20b0e-e8bd-4481-9ea0-d4c45cd1e1ed"). InnerVolumeSpecName "nginx-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 10 02:01:11.326632 systemd[1]: var-lib-kubelet-pods-37a20b0e\x2de8bd\x2d4481\x2d9ea0\x2dd4c45cd1e1ed-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2d4zzcd.mount: Deactivated successfully. Mar 10 02:01:11.338721 kubelet[2820]: I0310 02:01:11.332854 2820 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37a20b0e-e8bd-4481-9ea0-d4c45cd1e1ed-kube-api-access-4zzcd" (OuterVolumeSpecName: "kube-api-access-4zzcd") pod "37a20b0e-e8bd-4481-9ea0-d4c45cd1e1ed" (UID: "37a20b0e-e8bd-4481-9ea0-d4c45cd1e1ed"). InnerVolumeSpecName "kube-api-access-4zzcd". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 10 02:01:11.342335 kubelet[2820]: I0310 02:01:11.342266 2820 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37a20b0e-e8bd-4481-9ea0-d4c45cd1e1ed-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "37a20b0e-e8bd-4481-9ea0-d4c45cd1e1ed" (UID: "37a20b0e-e8bd-4481-9ea0-d4c45cd1e1ed"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 10 02:01:11.351137 systemd[1]: var-lib-kubelet-pods-37a20b0e\x2de8bd\x2d4481\x2d9ea0\x2dd4c45cd1e1ed-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. 
Mar 10 02:01:11.400425 kubelet[2820]: I0310 02:01:11.400336 2820 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/37a20b0e-e8bd-4481-9ea0-d4c45cd1e1ed-whisker-backend-key-pair\") on node \"localhost\" DevicePath \"\"" Mar 10 02:01:11.400425 kubelet[2820]: I0310 02:01:11.400373 2820 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/37a20b0e-e8bd-4481-9ea0-d4c45cd1e1ed-whisker-ca-bundle\") on node \"localhost\" DevicePath \"\"" Mar 10 02:01:11.400425 kubelet[2820]: I0310 02:01:11.400384 2820 reconciler_common.go:299] "Volume detached for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/37a20b0e-e8bd-4481-9ea0-d4c45cd1e1ed-nginx-config\") on node \"localhost\" DevicePath \"\"" Mar 10 02:01:11.400425 kubelet[2820]: I0310 02:01:11.400395 2820 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4zzcd\" (UniqueName: \"kubernetes.io/projected/37a20b0e-e8bd-4481-9ea0-d4c45cd1e1ed-kube-api-access-4zzcd\") on node \"localhost\" DevicePath \"\"" Mar 10 02:01:11.698106 systemd[1]: Removed slice kubepods-besteffort-pod37a20b0e_e8bd_4481_9ea0_d4c45cd1e1ed.slice - libcontainer container kubepods-besteffort-pod37a20b0e_e8bd_4481_9ea0_d4c45cd1e1ed.slice. Mar 10 02:01:12.234965 kubelet[2820]: I0310 02:01:12.234866 2820 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37a20b0e-e8bd-4481-9ea0-d4c45cd1e1ed" path="/var/lib/kubelet/pods/37a20b0e-e8bd-4481-9ea0-d4c45cd1e1ed/volumes" Mar 10 02:01:12.294856 systemd[1]: Created slice kubepods-besteffort-podf7ca28b5_d80c_43f9_a4d7_5c371e643c14.slice - libcontainer container kubepods-besteffort-podf7ca28b5_d80c_43f9_a4d7_5c371e643c14.slice. 
Mar 10 02:01:12.321524 kubelet[2820]: I0310 02:01:12.320555 2820 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42tw6\" (UniqueName: \"kubernetes.io/projected/f7ca28b5-d80c-43f9-a4d7-5c371e643c14-kube-api-access-42tw6\") pod \"whisker-78646d5dd6-ng95t\" (UID: \"f7ca28b5-d80c-43f9-a4d7-5c371e643c14\") " pod="calico-system/whisker-78646d5dd6-ng95t" Mar 10 02:01:12.321524 kubelet[2820]: I0310 02:01:12.320952 2820 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/f7ca28b5-d80c-43f9-a4d7-5c371e643c14-whisker-backend-key-pair\") pod \"whisker-78646d5dd6-ng95t\" (UID: \"f7ca28b5-d80c-43f9-a4d7-5c371e643c14\") " pod="calico-system/whisker-78646d5dd6-ng95t" Mar 10 02:01:12.321524 kubelet[2820]: I0310 02:01:12.321196 2820 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/f7ca28b5-d80c-43f9-a4d7-5c371e643c14-nginx-config\") pod \"whisker-78646d5dd6-ng95t\" (UID: \"f7ca28b5-d80c-43f9-a4d7-5c371e643c14\") " pod="calico-system/whisker-78646d5dd6-ng95t" Mar 10 02:01:12.321524 kubelet[2820]: I0310 02:01:12.321235 2820 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f7ca28b5-d80c-43f9-a4d7-5c371e643c14-whisker-ca-bundle\") pod \"whisker-78646d5dd6-ng95t\" (UID: \"f7ca28b5-d80c-43f9-a4d7-5c371e643c14\") " pod="calico-system/whisker-78646d5dd6-ng95t" Mar 10 02:01:12.635215 containerd[1555]: time="2026-03-10T02:01:12.633813452Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-78646d5dd6-ng95t,Uid:f7ca28b5-d80c-43f9-a4d7-5c371e643c14,Namespace:calico-system,Attempt:0,}" Mar 10 02:01:13.445141 systemd-networkd[1476]: cali06f08604064: Link UP Mar 10 02:01:13.446188 systemd-networkd[1476]: 
cali06f08604064: Gained carrier Mar 10 02:01:13.552111 containerd[1555]: 2026-03-10 02:01:12.762 [ERROR][4085] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Mar 10 02:01:13.552111 containerd[1555]: 2026-03-10 02:01:12.904 [INFO][4085] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-whisker--78646d5dd6--ng95t-eth0 whisker-78646d5dd6- calico-system f7ca28b5-d80c-43f9-a4d7-5c371e643c14 1080 0 2026-03-10 02:01:12 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:78646d5dd6 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s localhost whisker-78646d5dd6-ng95t eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali06f08604064 [] [] }} ContainerID="e0ec50a13c37e8ab73656a4de14ba0a7c3e2e9657f8fe80df9db9534fc2ad92b" Namespace="calico-system" Pod="whisker-78646d5dd6-ng95t" WorkloadEndpoint="localhost-k8s-whisker--78646d5dd6--ng95t-" Mar 10 02:01:13.552111 containerd[1555]: 2026-03-10 02:01:12.905 [INFO][4085] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="e0ec50a13c37e8ab73656a4de14ba0a7c3e2e9657f8fe80df9db9534fc2ad92b" Namespace="calico-system" Pod="whisker-78646d5dd6-ng95t" WorkloadEndpoint="localhost-k8s-whisker--78646d5dd6--ng95t-eth0" Mar 10 02:01:13.552111 containerd[1555]: 2026-03-10 02:01:13.077 [INFO][4095] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="e0ec50a13c37e8ab73656a4de14ba0a7c3e2e9657f8fe80df9db9534fc2ad92b" HandleID="k8s-pod-network.e0ec50a13c37e8ab73656a4de14ba0a7c3e2e9657f8fe80df9db9534fc2ad92b" Workload="localhost-k8s-whisker--78646d5dd6--ng95t-eth0" Mar 10 02:01:13.552543 containerd[1555]: 2026-03-10 02:01:13.127 [INFO][4095] ipam/ipam_plugin.go 301: Auto 
assigning IP ContainerID="e0ec50a13c37e8ab73656a4de14ba0a7c3e2e9657f8fe80df9db9534fc2ad92b" HandleID="k8s-pod-network.e0ec50a13c37e8ab73656a4de14ba0a7c3e2e9657f8fe80df9db9534fc2ad92b" Workload="localhost-k8s-whisker--78646d5dd6--ng95t-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002fdda0), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"whisker-78646d5dd6-ng95t", "timestamp":"2026-03-10 02:01:13.077108394 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc0005e6160)} Mar 10 02:01:13.552543 containerd[1555]: 2026-03-10 02:01:13.127 [INFO][4095] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 10 02:01:13.552543 containerd[1555]: 2026-03-10 02:01:13.127 [INFO][4095] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 10 02:01:13.552543 containerd[1555]: 2026-03-10 02:01:13.127 [INFO][4095] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Mar 10 02:01:13.552543 containerd[1555]: 2026-03-10 02:01:13.160 [INFO][4095] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.e0ec50a13c37e8ab73656a4de14ba0a7c3e2e9657f8fe80df9db9534fc2ad92b" host="localhost" Mar 10 02:01:13.552543 containerd[1555]: 2026-03-10 02:01:13.205 [INFO][4095] ipam/ipam.go 409: Looking up existing affinities for host host="localhost" Mar 10 02:01:13.552543 containerd[1555]: 2026-03-10 02:01:13.236 [INFO][4095] ipam/ipam.go 526: Trying affinity for 192.168.88.128/26 host="localhost" Mar 10 02:01:13.552543 containerd[1555]: 2026-03-10 02:01:13.250 [INFO][4095] ipam/ipam.go 160: Attempting to load block cidr=192.168.88.128/26 host="localhost" Mar 10 02:01:13.552543 containerd[1555]: 2026-03-10 02:01:13.268 [INFO][4095] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Mar 10 02:01:13.552543 containerd[1555]: 2026-03-10 02:01:13.268 [INFO][4095] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.e0ec50a13c37e8ab73656a4de14ba0a7c3e2e9657f8fe80df9db9534fc2ad92b" host="localhost" Mar 10 02:01:13.552907 containerd[1555]: 2026-03-10 02:01:13.283 [INFO][4095] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.e0ec50a13c37e8ab73656a4de14ba0a7c3e2e9657f8fe80df9db9534fc2ad92b Mar 10 02:01:13.552907 containerd[1555]: 2026-03-10 02:01:13.307 [INFO][4095] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.e0ec50a13c37e8ab73656a4de14ba0a7c3e2e9657f8fe80df9db9534fc2ad92b" host="localhost" Mar 10 02:01:13.552907 containerd[1555]: 2026-03-10 02:01:13.344 [INFO][4095] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.88.129/26] block=192.168.88.128/26 
handle="k8s-pod-network.e0ec50a13c37e8ab73656a4de14ba0a7c3e2e9657f8fe80df9db9534fc2ad92b" host="localhost" Mar 10 02:01:13.552907 containerd[1555]: 2026-03-10 02:01:13.345 [INFO][4095] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.88.129/26] handle="k8s-pod-network.e0ec50a13c37e8ab73656a4de14ba0a7c3e2e9657f8fe80df9db9534fc2ad92b" host="localhost" Mar 10 02:01:13.552907 containerd[1555]: 2026-03-10 02:01:13.345 [INFO][4095] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 10 02:01:13.552907 containerd[1555]: 2026-03-10 02:01:13.345 [INFO][4095] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.88.129/26] IPv6=[] ContainerID="e0ec50a13c37e8ab73656a4de14ba0a7c3e2e9657f8fe80df9db9534fc2ad92b" HandleID="k8s-pod-network.e0ec50a13c37e8ab73656a4de14ba0a7c3e2e9657f8fe80df9db9534fc2ad92b" Workload="localhost-k8s-whisker--78646d5dd6--ng95t-eth0" Mar 10 02:01:13.553144 containerd[1555]: 2026-03-10 02:01:13.381 [INFO][4085] cni-plugin/k8s.go 418: Populated endpoint ContainerID="e0ec50a13c37e8ab73656a4de14ba0a7c3e2e9657f8fe80df9db9534fc2ad92b" Namespace="calico-system" Pod="whisker-78646d5dd6-ng95t" WorkloadEndpoint="localhost-k8s-whisker--78646d5dd6--ng95t-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-whisker--78646d5dd6--ng95t-eth0", GenerateName:"whisker-78646d5dd6-", Namespace:"calico-system", SelfLink:"", UID:"f7ca28b5-d80c-43f9-a4d7-5c371e643c14", ResourceVersion:"1080", Generation:0, CreationTimestamp:time.Date(2026, time.March, 10, 2, 1, 12, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"78646d5dd6", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), 
OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"whisker-78646d5dd6-ng95t", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali06f08604064", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 10 02:01:13.553144 containerd[1555]: 2026-03-10 02:01:13.381 [INFO][4085] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.129/32] ContainerID="e0ec50a13c37e8ab73656a4de14ba0a7c3e2e9657f8fe80df9db9534fc2ad92b" Namespace="calico-system" Pod="whisker-78646d5dd6-ng95t" WorkloadEndpoint="localhost-k8s-whisker--78646d5dd6--ng95t-eth0" Mar 10 02:01:13.553308 containerd[1555]: 2026-03-10 02:01:13.381 [INFO][4085] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali06f08604064 ContainerID="e0ec50a13c37e8ab73656a4de14ba0a7c3e2e9657f8fe80df9db9534fc2ad92b" Namespace="calico-system" Pod="whisker-78646d5dd6-ng95t" WorkloadEndpoint="localhost-k8s-whisker--78646d5dd6--ng95t-eth0" Mar 10 02:01:13.553308 containerd[1555]: 2026-03-10 02:01:13.447 [INFO][4085] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="e0ec50a13c37e8ab73656a4de14ba0a7c3e2e9657f8fe80df9db9534fc2ad92b" Namespace="calico-system" Pod="whisker-78646d5dd6-ng95t" WorkloadEndpoint="localhost-k8s-whisker--78646d5dd6--ng95t-eth0" Mar 10 02:01:13.553370 containerd[1555]: 2026-03-10 02:01:13.449 [INFO][4085] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="e0ec50a13c37e8ab73656a4de14ba0a7c3e2e9657f8fe80df9db9534fc2ad92b" Namespace="calico-system" Pod="whisker-78646d5dd6-ng95t" 
WorkloadEndpoint="localhost-k8s-whisker--78646d5dd6--ng95t-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-whisker--78646d5dd6--ng95t-eth0", GenerateName:"whisker-78646d5dd6-", Namespace:"calico-system", SelfLink:"", UID:"f7ca28b5-d80c-43f9-a4d7-5c371e643c14", ResourceVersion:"1080", Generation:0, CreationTimestamp:time.Date(2026, time.March, 10, 2, 1, 12, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"78646d5dd6", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"e0ec50a13c37e8ab73656a4de14ba0a7c3e2e9657f8fe80df9db9534fc2ad92b", Pod:"whisker-78646d5dd6-ng95t", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali06f08604064", MAC:"0e:e9:ed:01:d9:2a", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 10 02:01:13.563352 containerd[1555]: 2026-03-10 02:01:13.501 [INFO][4085] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="e0ec50a13c37e8ab73656a4de14ba0a7c3e2e9657f8fe80df9db9534fc2ad92b" Namespace="calico-system" Pod="whisker-78646d5dd6-ng95t" WorkloadEndpoint="localhost-k8s-whisker--78646d5dd6--ng95t-eth0" Mar 10 02:01:13.839053 containerd[1555]: time="2026-03-10T02:01:13.838658723Z" level=info msg="connecting to shim 
e0ec50a13c37e8ab73656a4de14ba0a7c3e2e9657f8fe80df9db9534fc2ad92b" address="unix:///run/containerd/s/8a32510237678b720b07d89be2ce1b5ea68099be95752a4fc3bd32e471782888" namespace=k8s.io protocol=ttrpc version=3 Mar 10 02:01:13.962701 systemd[1]: Started cri-containerd-e0ec50a13c37e8ab73656a4de14ba0a7c3e2e9657f8fe80df9db9534fc2ad92b.scope - libcontainer container e0ec50a13c37e8ab73656a4de14ba0a7c3e2e9657f8fe80df9db9534fc2ad92b. Mar 10 02:01:14.046844 systemd-resolved[1481]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Mar 10 02:01:14.259093 containerd[1555]: time="2026-03-10T02:01:14.258189386Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-78646d5dd6-ng95t,Uid:f7ca28b5-d80c-43f9-a4d7-5c371e643c14,Namespace:calico-system,Attempt:0,} returns sandbox id \"e0ec50a13c37e8ab73656a4de14ba0a7c3e2e9657f8fe80df9db9534fc2ad92b\"" Mar 10 02:01:14.264189 containerd[1555]: time="2026-03-10T02:01:14.261420599Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.31.4\"" Mar 10 02:01:15.417979 systemd-networkd[1476]: cali06f08604064: Gained IPv6LL Mar 10 02:01:15.853959 containerd[1555]: time="2026-03-10T02:01:15.852948287Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 10 02:01:15.854532 containerd[1555]: time="2026-03-10T02:01:15.854400918Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.31.4: active requests=0, bytes read=6039889" Mar 10 02:01:15.859557 containerd[1555]: time="2026-03-10T02:01:15.857036199Z" level=info msg="ImageCreate event name:\"sha256:c02b0051502f3aa7f0815d838ea93b53dfb6bd13f185d229260e08200daf7cf7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 10 02:01:15.879100 containerd[1555]: time="2026-03-10T02:01:15.876150759Z" level=info msg="ImageCreate event 
name:\"ghcr.io/flatcar/calico/whisker@sha256:9690cd395efad501f2e0c40ce4969d87b736ae2e5ed454644e7b0fd8f756bfbc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 10 02:01:15.879100 containerd[1555]: time="2026-03-10T02:01:15.878651140Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.31.4\" with image id \"sha256:c02b0051502f3aa7f0815d838ea93b53dfb6bd13f185d229260e08200daf7cf7\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:9690cd395efad501f2e0c40ce4969d87b736ae2e5ed454644e7b0fd8f756bfbc\", size \"7595926\" in 1.617050021s" Mar 10 02:01:15.879100 containerd[1555]: time="2026-03-10T02:01:15.878689739Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.31.4\" returns image reference \"sha256:c02b0051502f3aa7f0815d838ea93b53dfb6bd13f185d229260e08200daf7cf7\"" Mar 10 02:01:15.901710 containerd[1555]: time="2026-03-10T02:01:15.901123674Z" level=info msg="CreateContainer within sandbox \"e0ec50a13c37e8ab73656a4de14ba0a7c3e2e9657f8fe80df9db9534fc2ad92b\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Mar 10 02:01:15.956435 containerd[1555]: time="2026-03-10T02:01:15.955016345Z" level=info msg="Container 34ea81a0b37e9751128eeffd1911590cd7c402f920fb311b68425a32c7bb1657: CDI devices from CRI Config.CDIDevices: []" Mar 10 02:01:16.009532 containerd[1555]: time="2026-03-10T02:01:16.009245588Z" level=info msg="CreateContainer within sandbox \"e0ec50a13c37e8ab73656a4de14ba0a7c3e2e9657f8fe80df9db9534fc2ad92b\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"34ea81a0b37e9751128eeffd1911590cd7c402f920fb311b68425a32c7bb1657\"" Mar 10 02:01:16.032896 containerd[1555]: time="2026-03-10T02:01:16.013366056Z" level=info msg="StartContainer for \"34ea81a0b37e9751128eeffd1911590cd7c402f920fb311b68425a32c7bb1657\"" Mar 10 02:01:16.032896 containerd[1555]: time="2026-03-10T02:01:16.028265129Z" level=info msg="connecting to shim 
34ea81a0b37e9751128eeffd1911590cd7c402f920fb311b68425a32c7bb1657" address="unix:///run/containerd/s/8a32510237678b720b07d89be2ce1b5ea68099be95752a4fc3bd32e471782888" protocol=ttrpc version=3 Mar 10 02:01:16.213915 systemd[1]: Started cri-containerd-34ea81a0b37e9751128eeffd1911590cd7c402f920fb311b68425a32c7bb1657.scope - libcontainer container 34ea81a0b37e9751128eeffd1911590cd7c402f920fb311b68425a32c7bb1657. Mar 10 02:01:16.731694 systemd-networkd[1476]: vxlan.calico: Link UP Mar 10 02:01:16.731708 systemd-networkd[1476]: vxlan.calico: Gained carrier Mar 10 02:01:16.765575 containerd[1555]: time="2026-03-10T02:01:16.765405210Z" level=info msg="StartContainer for \"34ea81a0b37e9751128eeffd1911590cd7c402f920fb311b68425a32c7bb1657\" returns successfully" Mar 10 02:01:16.770133 containerd[1555]: time="2026-03-10T02:01:16.770021659Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\"" Mar 10 02:01:18.183889 systemd-networkd[1476]: vxlan.calico: Gained IPv6LL Mar 10 02:01:20.041869 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount177875034.mount: Deactivated successfully. 
Mar 10 02:01:20.121428 containerd[1555]: time="2026-03-10T02:01:20.121095829Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 10 02:01:20.124666 containerd[1555]: time="2026-03-10T02:01:20.124326098Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.31.4: active requests=0, bytes read=17609475" Mar 10 02:01:20.137162 containerd[1555]: time="2026-03-10T02:01:20.136798206Z" level=info msg="ImageCreate event name:\"sha256:0749e3da0398e8402eb119f09acf145e5dd9759adb6eb3802ad6dc1b9bbedf1c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 10 02:01:20.145531 containerd[1555]: time="2026-03-10T02:01:20.145162887Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:d252061aa298c4b17cf092517b5126af97cf95e0f56b21281b95a5f8702f15fc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 10 02:01:20.151748 containerd[1555]: time="2026-03-10T02:01:20.151704828Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\" with image id \"sha256:0749e3da0398e8402eb119f09acf145e5dd9759adb6eb3802ad6dc1b9bbedf1c\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:d252061aa298c4b17cf092517b5126af97cf95e0f56b21281b95a5f8702f15fc\", size \"17609305\" in 3.381581317s" Mar 10 02:01:20.152730 containerd[1555]: time="2026-03-10T02:01:20.152600497Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\" returns image reference \"sha256:0749e3da0398e8402eb119f09acf145e5dd9759adb6eb3802ad6dc1b9bbedf1c\"" Mar 10 02:01:20.165031 containerd[1555]: time="2026-03-10T02:01:20.164068710Z" level=info msg="CreateContainer within sandbox \"e0ec50a13c37e8ab73656a4de14ba0a7c3e2e9657f8fe80df9db9534fc2ad92b\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Mar 10 02:01:20.194137 
containerd[1555]: time="2026-03-10T02:01:20.194042478Z" level=info msg="Container 1bb04caf7c7f970064abeb4c77dc3799ba52099f1b06802cc8c8c0dd9e3fde4e: CDI devices from CRI Config.CDIDevices: []" Mar 10 02:01:20.220098 containerd[1555]: time="2026-03-10T02:01:20.219917707Z" level=info msg="CreateContainer within sandbox \"e0ec50a13c37e8ab73656a4de14ba0a7c3e2e9657f8fe80df9db9534fc2ad92b\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"1bb04caf7c7f970064abeb4c77dc3799ba52099f1b06802cc8c8c0dd9e3fde4e\"" Mar 10 02:01:20.221531 containerd[1555]: time="2026-03-10T02:01:20.221394778Z" level=info msg="StartContainer for \"1bb04caf7c7f970064abeb4c77dc3799ba52099f1b06802cc8c8c0dd9e3fde4e\"" Mar 10 02:01:20.224333 containerd[1555]: time="2026-03-10T02:01:20.224247684Z" level=info msg="connecting to shim 1bb04caf7c7f970064abeb4c77dc3799ba52099f1b06802cc8c8c0dd9e3fde4e" address="unix:///run/containerd/s/8a32510237678b720b07d89be2ce1b5ea68099be95752a4fc3bd32e471782888" protocol=ttrpc version=3 Mar 10 02:01:20.286325 systemd[1]: Started cri-containerd-1bb04caf7c7f970064abeb4c77dc3799ba52099f1b06802cc8c8c0dd9e3fde4e.scope - libcontainer container 1bb04caf7c7f970064abeb4c77dc3799ba52099f1b06802cc8c8c0dd9e3fde4e. 
Mar 10 02:01:20.410498 containerd[1555]: time="2026-03-10T02:01:20.410369006Z" level=info msg="StartContainer for \"1bb04caf7c7f970064abeb4c77dc3799ba52099f1b06802cc8c8c0dd9e3fde4e\" returns successfully" Mar 10 02:01:20.937674 kubelet[2820]: I0310 02:01:20.934733 2820 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-78646d5dd6-ng95t" podStartSLOduration=3.041017829 podStartE2EDuration="8.934707188s" podCreationTimestamp="2026-03-10 02:01:12 +0000 UTC" firstStartedPulling="2026-03-10 02:01:14.260640825 +0000 UTC m=+110.699254736" lastFinishedPulling="2026-03-10 02:01:20.154330194 +0000 UTC m=+116.592944095" observedRunningTime="2026-03-10 02:01:20.925775374 +0000 UTC m=+117.364389296" watchObservedRunningTime="2026-03-10 02:01:20.934707188 +0000 UTC m=+117.373321099" Mar 10 02:01:21.254295 containerd[1555]: time="2026-03-10T02:01:21.254119655Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-76dfd768bc-pv7bx,Uid:072ce4e7-1021-481e-a47a-99c585261923,Namespace:calico-system,Attempt:0,}" Mar 10 02:01:22.049231 systemd-networkd[1476]: cali1a53162d29c: Link UP Mar 10 02:01:22.051115 systemd-networkd[1476]: cali1a53162d29c: Gained carrier Mar 10 02:01:22.101542 containerd[1555]: 2026-03-10 02:01:21.500 [INFO][4463] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--76dfd768bc--pv7bx-eth0 calico-apiserver-76dfd768bc- calico-system 072ce4e7-1021-481e-a47a-99c585261923 1025 0 2026-03-10 02:00:15 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:76dfd768bc projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-76dfd768bc-pv7bx eth0 calico-apiserver [] [] [kns.calico-system ksa.calico-system.calico-apiserver] cali1a53162d29c [] [] }} 
ContainerID="835fa23733b6277648529a87797635ab7befb5030b1001b1d1fc15d595416850" Namespace="calico-system" Pod="calico-apiserver-76dfd768bc-pv7bx" WorkloadEndpoint="localhost-k8s-calico--apiserver--76dfd768bc--pv7bx-" Mar 10 02:01:22.101542 containerd[1555]: 2026-03-10 02:01:21.501 [INFO][4463] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="835fa23733b6277648529a87797635ab7befb5030b1001b1d1fc15d595416850" Namespace="calico-system" Pod="calico-apiserver-76dfd768bc-pv7bx" WorkloadEndpoint="localhost-k8s-calico--apiserver--76dfd768bc--pv7bx-eth0" Mar 10 02:01:22.101542 containerd[1555]: 2026-03-10 02:01:21.798 [INFO][4477] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="835fa23733b6277648529a87797635ab7befb5030b1001b1d1fc15d595416850" HandleID="k8s-pod-network.835fa23733b6277648529a87797635ab7befb5030b1001b1d1fc15d595416850" Workload="localhost-k8s-calico--apiserver--76dfd768bc--pv7bx-eth0" Mar 10 02:01:22.101959 containerd[1555]: 2026-03-10 02:01:21.845 [INFO][4477] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="835fa23733b6277648529a87797635ab7befb5030b1001b1d1fc15d595416850" HandleID="k8s-pod-network.835fa23733b6277648529a87797635ab7befb5030b1001b1d1fc15d595416850" Workload="localhost-k8s-calico--apiserver--76dfd768bc--pv7bx-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003dc430), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"calico-apiserver-76dfd768bc-pv7bx", "timestamp":"2026-03-10 02:01:21.798628432 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc000198dc0)} Mar 10 02:01:22.101959 containerd[1555]: 2026-03-10 02:01:21.845 [INFO][4477] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. 
Mar 10 02:01:22.101959 containerd[1555]: 2026-03-10 02:01:21.846 [INFO][4477] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 10 02:01:22.101959 containerd[1555]: 2026-03-10 02:01:21.846 [INFO][4477] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Mar 10 02:01:22.101959 containerd[1555]: 2026-03-10 02:01:21.865 [INFO][4477] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.835fa23733b6277648529a87797635ab7befb5030b1001b1d1fc15d595416850" host="localhost" Mar 10 02:01:22.101959 containerd[1555]: 2026-03-10 02:01:21.902 [INFO][4477] ipam/ipam.go 409: Looking up existing affinities for host host="localhost" Mar 10 02:01:22.101959 containerd[1555]: 2026-03-10 02:01:21.930 [INFO][4477] ipam/ipam.go 526: Trying affinity for 192.168.88.128/26 host="localhost" Mar 10 02:01:22.101959 containerd[1555]: 2026-03-10 02:01:21.937 [INFO][4477] ipam/ipam.go 160: Attempting to load block cidr=192.168.88.128/26 host="localhost" Mar 10 02:01:22.101959 containerd[1555]: 2026-03-10 02:01:21.947 [INFO][4477] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Mar 10 02:01:22.101959 containerd[1555]: 2026-03-10 02:01:21.948 [INFO][4477] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.835fa23733b6277648529a87797635ab7befb5030b1001b1d1fc15d595416850" host="localhost" Mar 10 02:01:22.102541 containerd[1555]: 2026-03-10 02:01:21.955 [INFO][4477] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.835fa23733b6277648529a87797635ab7befb5030b1001b1d1fc15d595416850 Mar 10 02:01:22.102541 containerd[1555]: 2026-03-10 02:01:21.990 [INFO][4477] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.835fa23733b6277648529a87797635ab7befb5030b1001b1d1fc15d595416850" host="localhost" Mar 10 02:01:22.102541 containerd[1555]: 2026-03-10 02:01:22.028 [INFO][4477] 
ipam/ipam.go 1288: Successfully claimed IPs: [192.168.88.130/26] block=192.168.88.128/26 handle="k8s-pod-network.835fa23733b6277648529a87797635ab7befb5030b1001b1d1fc15d595416850" host="localhost" Mar 10 02:01:22.102541 containerd[1555]: 2026-03-10 02:01:22.028 [INFO][4477] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.88.130/26] handle="k8s-pod-network.835fa23733b6277648529a87797635ab7befb5030b1001b1d1fc15d595416850" host="localhost" Mar 10 02:01:22.102541 containerd[1555]: 2026-03-10 02:01:22.028 [INFO][4477] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 10 02:01:22.102541 containerd[1555]: 2026-03-10 02:01:22.028 [INFO][4477] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.88.130/26] IPv6=[] ContainerID="835fa23733b6277648529a87797635ab7befb5030b1001b1d1fc15d595416850" HandleID="k8s-pod-network.835fa23733b6277648529a87797635ab7befb5030b1001b1d1fc15d595416850" Workload="localhost-k8s-calico--apiserver--76dfd768bc--pv7bx-eth0" Mar 10 02:01:22.107593 containerd[1555]: 2026-03-10 02:01:22.036 [INFO][4463] cni-plugin/k8s.go 418: Populated endpoint ContainerID="835fa23733b6277648529a87797635ab7befb5030b1001b1d1fc15d595416850" Namespace="calico-system" Pod="calico-apiserver-76dfd768bc-pv7bx" WorkloadEndpoint="localhost-k8s-calico--apiserver--76dfd768bc--pv7bx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--76dfd768bc--pv7bx-eth0", GenerateName:"calico-apiserver-76dfd768bc-", Namespace:"calico-system", SelfLink:"", UID:"072ce4e7-1021-481e-a47a-99c585261923", ResourceVersion:"1025", Generation:0, CreationTimestamp:time.Date(2026, time.March, 10, 2, 0, 15, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"76dfd768bc", 
"projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-76dfd768bc-pv7bx", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali1a53162d29c", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 10 02:01:22.107738 containerd[1555]: 2026-03-10 02:01:22.036 [INFO][4463] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.130/32] ContainerID="835fa23733b6277648529a87797635ab7befb5030b1001b1d1fc15d595416850" Namespace="calico-system" Pod="calico-apiserver-76dfd768bc-pv7bx" WorkloadEndpoint="localhost-k8s-calico--apiserver--76dfd768bc--pv7bx-eth0" Mar 10 02:01:22.107738 containerd[1555]: 2026-03-10 02:01:22.036 [INFO][4463] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali1a53162d29c ContainerID="835fa23733b6277648529a87797635ab7befb5030b1001b1d1fc15d595416850" Namespace="calico-system" Pod="calico-apiserver-76dfd768bc-pv7bx" WorkloadEndpoint="localhost-k8s-calico--apiserver--76dfd768bc--pv7bx-eth0" Mar 10 02:01:22.107738 containerd[1555]: 2026-03-10 02:01:22.048 [INFO][4463] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="835fa23733b6277648529a87797635ab7befb5030b1001b1d1fc15d595416850" Namespace="calico-system" Pod="calico-apiserver-76dfd768bc-pv7bx" WorkloadEndpoint="localhost-k8s-calico--apiserver--76dfd768bc--pv7bx-eth0" Mar 10 02:01:22.107909 containerd[1555]: 2026-03-10 02:01:22.052 
[INFO][4463] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="835fa23733b6277648529a87797635ab7befb5030b1001b1d1fc15d595416850" Namespace="calico-system" Pod="calico-apiserver-76dfd768bc-pv7bx" WorkloadEndpoint="localhost-k8s-calico--apiserver--76dfd768bc--pv7bx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--76dfd768bc--pv7bx-eth0", GenerateName:"calico-apiserver-76dfd768bc-", Namespace:"calico-system", SelfLink:"", UID:"072ce4e7-1021-481e-a47a-99c585261923", ResourceVersion:"1025", Generation:0, CreationTimestamp:time.Date(2026, time.March, 10, 2, 0, 15, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"76dfd768bc", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"835fa23733b6277648529a87797635ab7befb5030b1001b1d1fc15d595416850", Pod:"calico-apiserver-76dfd768bc-pv7bx", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali1a53162d29c", MAC:"d2:b7:92:99:6b:e7", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 10 02:01:22.108024 containerd[1555]: 2026-03-10 02:01:22.093 [INFO][4463] cni-plugin/k8s.go 532: Wrote updated 
endpoint to datastore ContainerID="835fa23733b6277648529a87797635ab7befb5030b1001b1d1fc15d595416850" Namespace="calico-system" Pod="calico-apiserver-76dfd768bc-pv7bx" WorkloadEndpoint="localhost-k8s-calico--apiserver--76dfd768bc--pv7bx-eth0" Mar 10 02:01:22.234025 containerd[1555]: time="2026-03-10T02:01:22.233980203Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-w7tcg,Uid:ff236307-737f-4769-9a73-e0e60d4f60ef,Namespace:kube-system,Attempt:0,}" Mar 10 02:01:22.255198 containerd[1555]: time="2026-03-10T02:01:22.255101101Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-d6c6bb9bb-bm4js,Uid:74613752-983e-41dc-b1b0-4ff0b2a9333f,Namespace:calico-system,Attempt:0,}" Mar 10 02:01:22.286242 containerd[1555]: time="2026-03-10T02:01:22.286189218Z" level=info msg="connecting to shim 835fa23733b6277648529a87797635ab7befb5030b1001b1d1fc15d595416850" address="unix:///run/containerd/s/4467b9c8f50c41bdf15456e0b50b79296f7cc1da228e4a257274d074f3abcf3f" namespace=k8s.io protocol=ttrpc version=3 Mar 10 02:01:22.450723 systemd[1]: Started cri-containerd-835fa23733b6277648529a87797635ab7befb5030b1001b1d1fc15d595416850.scope - libcontainer container 835fa23733b6277648529a87797635ab7befb5030b1001b1d1fc15d595416850. 
Mar 10 02:01:22.568347 systemd-resolved[1481]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Mar 10 02:01:22.757759 containerd[1555]: time="2026-03-10T02:01:22.754767072Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-76dfd768bc-pv7bx,Uid:072ce4e7-1021-481e-a47a-99c585261923,Namespace:calico-system,Attempt:0,} returns sandbox id \"835fa23733b6277648529a87797635ab7befb5030b1001b1d1fc15d595416850\"" Mar 10 02:01:22.763234 containerd[1555]: time="2026-03-10T02:01:22.762889597Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\"" Mar 10 02:01:23.180853 systemd-networkd[1476]: calibe458655065: Link UP Mar 10 02:01:23.182741 systemd-networkd[1476]: calibe458655065: Gained carrier Mar 10 02:01:23.232438 containerd[1555]: time="2026-03-10T02:01:23.232058218Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-9jkrd,Uid:17109937-e605-4060-a474-8701d718389b,Namespace:calico-system,Attempt:0,}" Mar 10 02:01:23.250929 containerd[1555]: time="2026-03-10T02:01:23.250821899Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-9wb7f,Uid:f5a2970b-ee52-4628-9153-e736ecce6760,Namespace:kube-system,Attempt:0,}" Mar 10 02:01:23.433249 containerd[1555]: 2026-03-10 02:01:22.555 [INFO][4522] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--kube--controllers--d6c6bb9bb--bm4js-eth0 calico-kube-controllers-d6c6bb9bb- calico-system 74613752-983e-41dc-b1b0-4ff0b2a9333f 1014 0 2026-03-10 02:00:17 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:d6c6bb9bb projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s localhost calico-kube-controllers-d6c6bb9bb-bm4js eth0 calico-kube-controllers [] [] 
[kns.calico-system ksa.calico-system.calico-kube-controllers] calibe458655065 [] [] }} ContainerID="c1a8e35d4840fdfabd2a7b9da44764f4afec82586d2abd635466314a887c7673" Namespace="calico-system" Pod="calico-kube-controllers-d6c6bb9bb-bm4js" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--d6c6bb9bb--bm4js-" Mar 10 02:01:23.433249 containerd[1555]: 2026-03-10 02:01:22.556 [INFO][4522] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="c1a8e35d4840fdfabd2a7b9da44764f4afec82586d2abd635466314a887c7673" Namespace="calico-system" Pod="calico-kube-controllers-d6c6bb9bb-bm4js" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--d6c6bb9bb--bm4js-eth0" Mar 10 02:01:23.433249 containerd[1555]: 2026-03-10 02:01:22.787 [INFO][4580] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="c1a8e35d4840fdfabd2a7b9da44764f4afec82586d2abd635466314a887c7673" HandleID="k8s-pod-network.c1a8e35d4840fdfabd2a7b9da44764f4afec82586d2abd635466314a887c7673" Workload="localhost-k8s-calico--kube--controllers--d6c6bb9bb--bm4js-eth0" Mar 10 02:01:23.433954 containerd[1555]: 2026-03-10 02:01:22.833 [INFO][4580] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="c1a8e35d4840fdfabd2a7b9da44764f4afec82586d2abd635466314a887c7673" HandleID="k8s-pod-network.c1a8e35d4840fdfabd2a7b9da44764f4afec82586d2abd635466314a887c7673" Workload="localhost-k8s-calico--kube--controllers--d6c6bb9bb--bm4js-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000790120), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"calico-kube-controllers-d6c6bb9bb-bm4js", "timestamp":"2026-03-10 02:01:22.787029957 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc0006b0b00)} Mar 10 02:01:23.433954 
containerd[1555]: 2026-03-10 02:01:22.833 [INFO][4580] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 10 02:01:23.433954 containerd[1555]: 2026-03-10 02:01:22.833 [INFO][4580] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 10 02:01:23.433954 containerd[1555]: 2026-03-10 02:01:22.833 [INFO][4580] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Mar 10 02:01:23.433954 containerd[1555]: 2026-03-10 02:01:22.897 [INFO][4580] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.c1a8e35d4840fdfabd2a7b9da44764f4afec82586d2abd635466314a887c7673" host="localhost" Mar 10 02:01:23.433954 containerd[1555]: 2026-03-10 02:01:22.946 [INFO][4580] ipam/ipam.go 409: Looking up existing affinities for host host="localhost" Mar 10 02:01:23.433954 containerd[1555]: 2026-03-10 02:01:22.993 [INFO][4580] ipam/ipam.go 526: Trying affinity for 192.168.88.128/26 host="localhost" Mar 10 02:01:23.433954 containerd[1555]: 2026-03-10 02:01:23.008 [INFO][4580] ipam/ipam.go 160: Attempting to load block cidr=192.168.88.128/26 host="localhost" Mar 10 02:01:23.433954 containerd[1555]: 2026-03-10 02:01:23.026 [INFO][4580] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Mar 10 02:01:23.434293 containerd[1555]: 2026-03-10 02:01:23.027 [INFO][4580] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.c1a8e35d4840fdfabd2a7b9da44764f4afec82586d2abd635466314a887c7673" host="localhost" Mar 10 02:01:23.434293 containerd[1555]: 2026-03-10 02:01:23.042 [INFO][4580] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.c1a8e35d4840fdfabd2a7b9da44764f4afec82586d2abd635466314a887c7673 Mar 10 02:01:23.434293 containerd[1555]: 2026-03-10 02:01:23.084 [INFO][4580] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.88.128/26 
handle="k8s-pod-network.c1a8e35d4840fdfabd2a7b9da44764f4afec82586d2abd635466314a887c7673" host="localhost" Mar 10 02:01:23.434293 containerd[1555]: 2026-03-10 02:01:23.141 [INFO][4580] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.88.131/26] block=192.168.88.128/26 handle="k8s-pod-network.c1a8e35d4840fdfabd2a7b9da44764f4afec82586d2abd635466314a887c7673" host="localhost" Mar 10 02:01:23.434293 containerd[1555]: 2026-03-10 02:01:23.142 [INFO][4580] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.88.131/26] handle="k8s-pod-network.c1a8e35d4840fdfabd2a7b9da44764f4afec82586d2abd635466314a887c7673" host="localhost" Mar 10 02:01:23.434293 containerd[1555]: 2026-03-10 02:01:23.142 [INFO][4580] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 10 02:01:23.434293 containerd[1555]: 2026-03-10 02:01:23.142 [INFO][4580] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.88.131/26] IPv6=[] ContainerID="c1a8e35d4840fdfabd2a7b9da44764f4afec82586d2abd635466314a887c7673" HandleID="k8s-pod-network.c1a8e35d4840fdfabd2a7b9da44764f4afec82586d2abd635466314a887c7673" Workload="localhost-k8s-calico--kube--controllers--d6c6bb9bb--bm4js-eth0" Mar 10 02:01:23.435805 containerd[1555]: 2026-03-10 02:01:23.159 [INFO][4522] cni-plugin/k8s.go 418: Populated endpoint ContainerID="c1a8e35d4840fdfabd2a7b9da44764f4afec82586d2abd635466314a887c7673" Namespace="calico-system" Pod="calico-kube-controllers-d6c6bb9bb-bm4js" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--d6c6bb9bb--bm4js-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--d6c6bb9bb--bm4js-eth0", GenerateName:"calico-kube-controllers-d6c6bb9bb-", Namespace:"calico-system", SelfLink:"", UID:"74613752-983e-41dc-b1b0-4ff0b2a9333f", ResourceVersion:"1014", Generation:0, CreationTimestamp:time.Date(2026, time.March, 10, 2, 0, 17, 0, time.Local), 
DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"d6c6bb9bb", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-kube-controllers-d6c6bb9bb-bm4js", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calibe458655065", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 10 02:01:23.435946 containerd[1555]: 2026-03-10 02:01:23.159 [INFO][4522] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.131/32] ContainerID="c1a8e35d4840fdfabd2a7b9da44764f4afec82586d2abd635466314a887c7673" Namespace="calico-system" Pod="calico-kube-controllers-d6c6bb9bb-bm4js" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--d6c6bb9bb--bm4js-eth0" Mar 10 02:01:23.435946 containerd[1555]: 2026-03-10 02:01:23.159 [INFO][4522] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calibe458655065 ContainerID="c1a8e35d4840fdfabd2a7b9da44764f4afec82586d2abd635466314a887c7673" Namespace="calico-system" Pod="calico-kube-controllers-d6c6bb9bb-bm4js" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--d6c6bb9bb--bm4js-eth0" Mar 10 02:01:23.435946 containerd[1555]: 2026-03-10 02:01:23.190 [INFO][4522] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="c1a8e35d4840fdfabd2a7b9da44764f4afec82586d2abd635466314a887c7673" Namespace="calico-system" Pod="calico-kube-controllers-d6c6bb9bb-bm4js" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--d6c6bb9bb--bm4js-eth0" Mar 10 02:01:23.436081 containerd[1555]: 2026-03-10 02:01:23.196 [INFO][4522] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="c1a8e35d4840fdfabd2a7b9da44764f4afec82586d2abd635466314a887c7673" Namespace="calico-system" Pod="calico-kube-controllers-d6c6bb9bb-bm4js" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--d6c6bb9bb--bm4js-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--d6c6bb9bb--bm4js-eth0", GenerateName:"calico-kube-controllers-d6c6bb9bb-", Namespace:"calico-system", SelfLink:"", UID:"74613752-983e-41dc-b1b0-4ff0b2a9333f", ResourceVersion:"1014", Generation:0, CreationTimestamp:time.Date(2026, time.March, 10, 2, 0, 17, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"d6c6bb9bb", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"c1a8e35d4840fdfabd2a7b9da44764f4afec82586d2abd635466314a887c7673", Pod:"calico-kube-controllers-d6c6bb9bb-bm4js", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", 
Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calibe458655065", MAC:"92:05:07:ed:2c:18", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 10 02:01:23.436251 containerd[1555]: 2026-03-10 02:01:23.379 [INFO][4522] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="c1a8e35d4840fdfabd2a7b9da44764f4afec82586d2abd635466314a887c7673" Namespace="calico-system" Pod="calico-kube-controllers-d6c6bb9bb-bm4js" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--d6c6bb9bb--bm4js-eth0" Mar 10 02:01:23.535904 systemd-networkd[1476]: calif7a6fe6202a: Link UP Mar 10 02:01:23.539126 systemd-networkd[1476]: calif7a6fe6202a: Gained carrier Mar 10 02:01:23.615592 systemd-networkd[1476]: cali1a53162d29c: Gained IPv6LL Mar 10 02:01:23.652574 containerd[1555]: 2026-03-10 02:01:22.558 [INFO][4521] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--66bc5c9577--w7tcg-eth0 coredns-66bc5c9577- kube-system ff236307-737f-4769-9a73-e0e60d4f60ef 1024 0 2026-03-10 01:59:27 +0000 UTC map[k8s-app:kube-dns pod-template-hash:66bc5c9577 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-66bc5c9577-w7tcg eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calif7a6fe6202a [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 } {liveness-probe TCP 8080 0 } {readiness-probe TCP 8181 0 }] [] }} ContainerID="00c3a84c00ffe6dbd643412225af891102eb8210df0b3d8152719f4bb361ca5b" Namespace="kube-system" Pod="coredns-66bc5c9577-w7tcg" WorkloadEndpoint="localhost-k8s-coredns--66bc5c9577--w7tcg-" Mar 10 02:01:23.652574 containerd[1555]: 2026-03-10 02:01:22.559 [INFO][4521] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s 
ContainerID="00c3a84c00ffe6dbd643412225af891102eb8210df0b3d8152719f4bb361ca5b" Namespace="kube-system" Pod="coredns-66bc5c9577-w7tcg" WorkloadEndpoint="localhost-k8s-coredns--66bc5c9577--w7tcg-eth0" Mar 10 02:01:23.652574 containerd[1555]: 2026-03-10 02:01:22.858 [INFO][4574] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="00c3a84c00ffe6dbd643412225af891102eb8210df0b3d8152719f4bb361ca5b" HandleID="k8s-pod-network.00c3a84c00ffe6dbd643412225af891102eb8210df0b3d8152719f4bb361ca5b" Workload="localhost-k8s-coredns--66bc5c9577--w7tcg-eth0" Mar 10 02:01:23.657718 containerd[1555]: 2026-03-10 02:01:22.920 [INFO][4574] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="00c3a84c00ffe6dbd643412225af891102eb8210df0b3d8152719f4bb361ca5b" HandleID="k8s-pod-network.00c3a84c00ffe6dbd643412225af891102eb8210df0b3d8152719f4bb361ca5b" Workload="localhost-k8s-coredns--66bc5c9577--w7tcg-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00011ad90), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-66bc5c9577-w7tcg", "timestamp":"2026-03-10 02:01:22.858727274 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc0003b66e0)} Mar 10 02:01:23.657718 containerd[1555]: 2026-03-10 02:01:22.921 [INFO][4574] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 10 02:01:23.657718 containerd[1555]: 2026-03-10 02:01:23.142 [INFO][4574] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 10 02:01:23.657718 containerd[1555]: 2026-03-10 02:01:23.148 [INFO][4574] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Mar 10 02:01:23.657718 containerd[1555]: 2026-03-10 02:01:23.188 [INFO][4574] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.00c3a84c00ffe6dbd643412225af891102eb8210df0b3d8152719f4bb361ca5b" host="localhost" Mar 10 02:01:23.657718 containerd[1555]: 2026-03-10 02:01:23.256 [INFO][4574] ipam/ipam.go 409: Looking up existing affinities for host host="localhost" Mar 10 02:01:23.657718 containerd[1555]: 2026-03-10 02:01:23.314 [INFO][4574] ipam/ipam.go 526: Trying affinity for 192.168.88.128/26 host="localhost" Mar 10 02:01:23.657718 containerd[1555]: 2026-03-10 02:01:23.333 [INFO][4574] ipam/ipam.go 160: Attempting to load block cidr=192.168.88.128/26 host="localhost" Mar 10 02:01:23.657718 containerd[1555]: 2026-03-10 02:01:23.372 [INFO][4574] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Mar 10 02:01:23.657718 containerd[1555]: 2026-03-10 02:01:23.376 [INFO][4574] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.00c3a84c00ffe6dbd643412225af891102eb8210df0b3d8152719f4bb361ca5b" host="localhost" Mar 10 02:01:23.658265 containerd[1555]: 2026-03-10 02:01:23.397 [INFO][4574] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.00c3a84c00ffe6dbd643412225af891102eb8210df0b3d8152719f4bb361ca5b Mar 10 02:01:23.658265 containerd[1555]: 2026-03-10 02:01:23.432 [INFO][4574] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.00c3a84c00ffe6dbd643412225af891102eb8210df0b3d8152719f4bb361ca5b" host="localhost" Mar 10 02:01:23.658265 containerd[1555]: 2026-03-10 02:01:23.479 [INFO][4574] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.88.132/26] block=192.168.88.128/26 
handle="k8s-pod-network.00c3a84c00ffe6dbd643412225af891102eb8210df0b3d8152719f4bb361ca5b" host="localhost" Mar 10 02:01:23.658265 containerd[1555]: 2026-03-10 02:01:23.479 [INFO][4574] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.88.132/26] handle="k8s-pod-network.00c3a84c00ffe6dbd643412225af891102eb8210df0b3d8152719f4bb361ca5b" host="localhost" Mar 10 02:01:23.658265 containerd[1555]: 2026-03-10 02:01:23.479 [INFO][4574] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 10 02:01:23.658265 containerd[1555]: 2026-03-10 02:01:23.479 [INFO][4574] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.88.132/26] IPv6=[] ContainerID="00c3a84c00ffe6dbd643412225af891102eb8210df0b3d8152719f4bb361ca5b" HandleID="k8s-pod-network.00c3a84c00ffe6dbd643412225af891102eb8210df0b3d8152719f4bb361ca5b" Workload="localhost-k8s-coredns--66bc5c9577--w7tcg-eth0" Mar 10 02:01:23.658545 containerd[1555]: 2026-03-10 02:01:23.514 [INFO][4521] cni-plugin/k8s.go 418: Populated endpoint ContainerID="00c3a84c00ffe6dbd643412225af891102eb8210df0b3d8152719f4bb361ca5b" Namespace="kube-system" Pod="coredns-66bc5c9577-w7tcg" WorkloadEndpoint="localhost-k8s-coredns--66bc5c9577--w7tcg-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--66bc5c9577--w7tcg-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"ff236307-737f-4769-9a73-e0e60d4f60ef", ResourceVersion:"1024", Generation:0, CreationTimestamp:time.Date(2026, time.March, 10, 1, 59, 27, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), 
Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-66bc5c9577-w7tcg", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calif7a6fe6202a", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 10 02:01:23.658545 containerd[1555]: 2026-03-10 02:01:23.518 [INFO][4521] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.132/32] ContainerID="00c3a84c00ffe6dbd643412225af891102eb8210df0b3d8152719f4bb361ca5b" Namespace="kube-system" Pod="coredns-66bc5c9577-w7tcg" WorkloadEndpoint="localhost-k8s-coredns--66bc5c9577--w7tcg-eth0" Mar 10 02:01:23.658545 containerd[1555]: 2026-03-10 02:01:23.519 [INFO][4521] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calif7a6fe6202a ContainerID="00c3a84c00ffe6dbd643412225af891102eb8210df0b3d8152719f4bb361ca5b" Namespace="kube-system" Pod="coredns-66bc5c9577-w7tcg" WorkloadEndpoint="localhost-k8s-coredns--66bc5c9577--w7tcg-eth0" Mar 10 
02:01:23.658545 containerd[1555]: 2026-03-10 02:01:23.541 [INFO][4521] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="00c3a84c00ffe6dbd643412225af891102eb8210df0b3d8152719f4bb361ca5b" Namespace="kube-system" Pod="coredns-66bc5c9577-w7tcg" WorkloadEndpoint="localhost-k8s-coredns--66bc5c9577--w7tcg-eth0" Mar 10 02:01:23.658545 containerd[1555]: 2026-03-10 02:01:23.545 [INFO][4521] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="00c3a84c00ffe6dbd643412225af891102eb8210df0b3d8152719f4bb361ca5b" Namespace="kube-system" Pod="coredns-66bc5c9577-w7tcg" WorkloadEndpoint="localhost-k8s-coredns--66bc5c9577--w7tcg-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--66bc5c9577--w7tcg-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"ff236307-737f-4769-9a73-e0e60d4f60ef", ResourceVersion:"1024", Generation:0, CreationTimestamp:time.Date(2026, time.March, 10, 1, 59, 27, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"00c3a84c00ffe6dbd643412225af891102eb8210df0b3d8152719f4bb361ca5b", Pod:"coredns-66bc5c9577-w7tcg", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calif7a6fe6202a", 
MAC:"32:eb:71:9e:2a:32", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 10 02:01:23.658545 containerd[1555]: 2026-03-10 02:01:23.633 [INFO][4521] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="00c3a84c00ffe6dbd643412225af891102eb8210df0b3d8152719f4bb361ca5b" Namespace="kube-system" Pod="coredns-66bc5c9577-w7tcg" WorkloadEndpoint="localhost-k8s-coredns--66bc5c9577--w7tcg-eth0" Mar 10 02:01:23.798646 containerd[1555]: time="2026-03-10T02:01:23.794002212Z" level=info msg="connecting to shim c1a8e35d4840fdfabd2a7b9da44764f4afec82586d2abd635466314a887c7673" address="unix:///run/containerd/s/a09e6682ed37611db0d3f174897ef8f0bb92a57542c955609970a321e7d9092c" namespace=k8s.io protocol=ttrpc version=3 Mar 10 02:01:23.908680 containerd[1555]: time="2026-03-10T02:01:23.907561966Z" level=info msg="connecting to shim 00c3a84c00ffe6dbd643412225af891102eb8210df0b3d8152719f4bb361ca5b" address="unix:///run/containerd/s/996f1ca71ec4e4541b17b4914857a434c6b6f7949e829500d5d88924f4814ab6" namespace=k8s.io protocol=ttrpc version=3 Mar 10 02:01:24.086207 systemd[1]: Started cri-containerd-00c3a84c00ffe6dbd643412225af891102eb8210df0b3d8152719f4bb361ca5b.scope - libcontainer container 
00c3a84c00ffe6dbd643412225af891102eb8210df0b3d8152719f4bb361ca5b. Mar 10 02:01:24.139192 systemd[1]: Started cri-containerd-c1a8e35d4840fdfabd2a7b9da44764f4afec82586d2abd635466314a887c7673.scope - libcontainer container c1a8e35d4840fdfabd2a7b9da44764f4afec82586d2abd635466314a887c7673. Mar 10 02:01:24.181746 systemd-resolved[1481]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Mar 10 02:01:24.337534 containerd[1555]: time="2026-03-10T02:01:24.336893265Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-76dfd768bc-9l6p8,Uid:a0d4a605-efd0-4424-86bb-3dd6bae830b7,Namespace:calico-system,Attempt:0,}" Mar 10 02:01:24.352766 containerd[1555]: time="2026-03-10T02:01:24.351917678Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-cccfbd5cf-kt2sm,Uid:c169f044-feb9-4347-930b-a968995715e2,Namespace:calico-system,Attempt:0,}" Mar 10 02:01:24.414196 systemd-networkd[1476]: cali5b409cd6ec0: Link UP Mar 10 02:01:24.414759 systemd-networkd[1476]: cali5b409cd6ec0: Gained carrier Mar 10 02:01:24.448553 containerd[1555]: time="2026-03-10T02:01:24.448287554Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-w7tcg,Uid:ff236307-737f-4769-9a73-e0e60d4f60ef,Namespace:kube-system,Attempt:0,} returns sandbox id \"00c3a84c00ffe6dbd643412225af891102eb8210df0b3d8152719f4bb361ca5b\"" Mar 10 02:01:24.457173 systemd-resolved[1481]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Mar 10 02:01:24.544661 containerd[1555]: time="2026-03-10T02:01:24.544395501Z" level=info msg="CreateContainer within sandbox \"00c3a84c00ffe6dbd643412225af891102eb8210df0b3d8152719f4bb361ca5b\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Mar 10 02:01:24.551711 containerd[1555]: 2026-03-10 02:01:23.661 [INFO][4610] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} 
{localhost-k8s-coredns--66bc5c9577--9wb7f-eth0 coredns-66bc5c9577- kube-system f5a2970b-ee52-4628-9153-e736ecce6760 1026 0 2026-03-10 01:59:26 +0000 UTC map[k8s-app:kube-dns pod-template-hash:66bc5c9577 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-66bc5c9577-9wb7f eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali5b409cd6ec0 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 } {liveness-probe TCP 8080 0 } {readiness-probe TCP 8181 0 }] [] }} ContainerID="a8c0778665fc38fd79cee24fa20dbaa87ccbc2ac8ae243c562182f22adec2bfa" Namespace="kube-system" Pod="coredns-66bc5c9577-9wb7f" WorkloadEndpoint="localhost-k8s-coredns--66bc5c9577--9wb7f-" Mar 10 02:01:24.551711 containerd[1555]: 2026-03-10 02:01:23.663 [INFO][4610] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="a8c0778665fc38fd79cee24fa20dbaa87ccbc2ac8ae243c562182f22adec2bfa" Namespace="kube-system" Pod="coredns-66bc5c9577-9wb7f" WorkloadEndpoint="localhost-k8s-coredns--66bc5c9577--9wb7f-eth0" Mar 10 02:01:24.551711 containerd[1555]: 2026-03-10 02:01:23.869 [INFO][4657] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="a8c0778665fc38fd79cee24fa20dbaa87ccbc2ac8ae243c562182f22adec2bfa" HandleID="k8s-pod-network.a8c0778665fc38fd79cee24fa20dbaa87ccbc2ac8ae243c562182f22adec2bfa" Workload="localhost-k8s-coredns--66bc5c9577--9wb7f-eth0" Mar 10 02:01:24.551711 containerd[1555]: 2026-03-10 02:01:23.914 [INFO][4657] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="a8c0778665fc38fd79cee24fa20dbaa87ccbc2ac8ae243c562182f22adec2bfa" HandleID="k8s-pod-network.a8c0778665fc38fd79cee24fa20dbaa87ccbc2ac8ae243c562182f22adec2bfa" Workload="localhost-k8s-coredns--66bc5c9577--9wb7f-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0004902e0), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", 
"pod":"coredns-66bc5c9577-9wb7f", "timestamp":"2026-03-10 02:01:23.868437514 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc0001726e0)} Mar 10 02:01:24.551711 containerd[1555]: 2026-03-10 02:01:23.915 [INFO][4657] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 10 02:01:24.551711 containerd[1555]: 2026-03-10 02:01:23.915 [INFO][4657] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 10 02:01:24.551711 containerd[1555]: 2026-03-10 02:01:23.915 [INFO][4657] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Mar 10 02:01:24.551711 containerd[1555]: 2026-03-10 02:01:23.935 [INFO][4657] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.a8c0778665fc38fd79cee24fa20dbaa87ccbc2ac8ae243c562182f22adec2bfa" host="localhost" Mar 10 02:01:24.551711 containerd[1555]: 2026-03-10 02:01:24.078 [INFO][4657] ipam/ipam.go 409: Looking up existing affinities for host host="localhost" Mar 10 02:01:24.551711 containerd[1555]: 2026-03-10 02:01:24.142 [INFO][4657] ipam/ipam.go 526: Trying affinity for 192.168.88.128/26 host="localhost" Mar 10 02:01:24.551711 containerd[1555]: 2026-03-10 02:01:24.164 [INFO][4657] ipam/ipam.go 160: Attempting to load block cidr=192.168.88.128/26 host="localhost" Mar 10 02:01:24.551711 containerd[1555]: 2026-03-10 02:01:24.196 [INFO][4657] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Mar 10 02:01:24.551711 containerd[1555]: 2026-03-10 02:01:24.197 [INFO][4657] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.a8c0778665fc38fd79cee24fa20dbaa87ccbc2ac8ae243c562182f22adec2bfa" host="localhost" Mar 10 02:01:24.551711 containerd[1555]: 2026-03-10 
02:01:24.248 [INFO][4657] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.a8c0778665fc38fd79cee24fa20dbaa87ccbc2ac8ae243c562182f22adec2bfa Mar 10 02:01:24.551711 containerd[1555]: 2026-03-10 02:01:24.307 [INFO][4657] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.a8c0778665fc38fd79cee24fa20dbaa87ccbc2ac8ae243c562182f22adec2bfa" host="localhost" Mar 10 02:01:24.551711 containerd[1555]: 2026-03-10 02:01:24.365 [INFO][4657] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.88.133/26] block=192.168.88.128/26 handle="k8s-pod-network.a8c0778665fc38fd79cee24fa20dbaa87ccbc2ac8ae243c562182f22adec2bfa" host="localhost" Mar 10 02:01:24.551711 containerd[1555]: 2026-03-10 02:01:24.377 [INFO][4657] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.88.133/26] handle="k8s-pod-network.a8c0778665fc38fd79cee24fa20dbaa87ccbc2ac8ae243c562182f22adec2bfa" host="localhost" Mar 10 02:01:24.551711 containerd[1555]: 2026-03-10 02:01:24.377 [INFO][4657] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. 
Mar 10 02:01:24.551711 containerd[1555]: 2026-03-10 02:01:24.378 [INFO][4657] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.88.133/26] IPv6=[] ContainerID="a8c0778665fc38fd79cee24fa20dbaa87ccbc2ac8ae243c562182f22adec2bfa" HandleID="k8s-pod-network.a8c0778665fc38fd79cee24fa20dbaa87ccbc2ac8ae243c562182f22adec2bfa" Workload="localhost-k8s-coredns--66bc5c9577--9wb7f-eth0" Mar 10 02:01:24.555013 containerd[1555]: 2026-03-10 02:01:24.401 [INFO][4610] cni-plugin/k8s.go 418: Populated endpoint ContainerID="a8c0778665fc38fd79cee24fa20dbaa87ccbc2ac8ae243c562182f22adec2bfa" Namespace="kube-system" Pod="coredns-66bc5c9577-9wb7f" WorkloadEndpoint="localhost-k8s-coredns--66bc5c9577--9wb7f-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--66bc5c9577--9wb7f-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"f5a2970b-ee52-4628-9153-e736ecce6760", ResourceVersion:"1026", Generation:0, CreationTimestamp:time.Date(2026, time.March, 10, 1, 59, 26, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-66bc5c9577-9wb7f", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali5b409cd6ec0", MAC:"", 
Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 10 02:01:24.555013 containerd[1555]: 2026-03-10 02:01:24.404 [INFO][4610] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.133/32] ContainerID="a8c0778665fc38fd79cee24fa20dbaa87ccbc2ac8ae243c562182f22adec2bfa" Namespace="kube-system" Pod="coredns-66bc5c9577-9wb7f" WorkloadEndpoint="localhost-k8s-coredns--66bc5c9577--9wb7f-eth0" Mar 10 02:01:24.555013 containerd[1555]: 2026-03-10 02:01:24.405 [INFO][4610] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali5b409cd6ec0 ContainerID="a8c0778665fc38fd79cee24fa20dbaa87ccbc2ac8ae243c562182f22adec2bfa" Namespace="kube-system" Pod="coredns-66bc5c9577-9wb7f" WorkloadEndpoint="localhost-k8s-coredns--66bc5c9577--9wb7f-eth0" Mar 10 02:01:24.555013 containerd[1555]: 2026-03-10 02:01:24.412 [INFO][4610] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="a8c0778665fc38fd79cee24fa20dbaa87ccbc2ac8ae243c562182f22adec2bfa" Namespace="kube-system" Pod="coredns-66bc5c9577-9wb7f" WorkloadEndpoint="localhost-k8s-coredns--66bc5c9577--9wb7f-eth0" Mar 10 02:01:24.555013 containerd[1555]: 2026-03-10 02:01:24.413 [INFO][4610] cni-plugin/k8s.go 446: Added Mac, 
interface name, and active container ID to endpoint ContainerID="a8c0778665fc38fd79cee24fa20dbaa87ccbc2ac8ae243c562182f22adec2bfa" Namespace="kube-system" Pod="coredns-66bc5c9577-9wb7f" WorkloadEndpoint="localhost-k8s-coredns--66bc5c9577--9wb7f-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--66bc5c9577--9wb7f-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"f5a2970b-ee52-4628-9153-e736ecce6760", ResourceVersion:"1026", Generation:0, CreationTimestamp:time.Date(2026, time.March, 10, 1, 59, 26, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"a8c0778665fc38fd79cee24fa20dbaa87ccbc2ac8ae243c562182f22adec2bfa", Pod:"coredns-66bc5c9577-9wb7f", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali5b409cd6ec0", MAC:"f6:e1:2f:ea:b5:e5", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, 
HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 10 02:01:24.555013 containerd[1555]: 2026-03-10 02:01:24.487 [INFO][4610] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="a8c0778665fc38fd79cee24fa20dbaa87ccbc2ac8ae243c562182f22adec2bfa" Namespace="kube-system" Pod="coredns-66bc5c9577-9wb7f" WorkloadEndpoint="localhost-k8s-coredns--66bc5c9577--9wb7f-eth0" Mar 10 02:01:24.804379 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3558581873.mount: Deactivated successfully. Mar 10 02:01:24.928374 containerd[1555]: time="2026-03-10T02:01:24.926187027Z" level=info msg="Container 395bff113de88821ad41c3b059ba66f112a24a9a19864bf56dcb1495d80023db: CDI devices from CRI Config.CDIDevices: []" Mar 10 02:01:24.933779 systemd-networkd[1476]: calid962a115249: Link UP Mar 10 02:01:24.934865 systemd-networkd[1476]: calid962a115249: Gained carrier Mar 10 02:01:25.005172 containerd[1555]: 2026-03-10 02:01:23.597 [INFO][4600] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-csi--node--driver--9jkrd-eth0 csi-node-driver- calico-system 17109937-e605-4060-a474-8701d718389b 810 0 2026-03-10 02:00:17 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:98cbb5577 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s localhost csi-node-driver-9jkrd eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] calid962a115249 [] 
[] }} ContainerID="a299fae18aecbb93ffb135400069352ee2ef39ea160c1e1744cd8e7b108d97b0" Namespace="calico-system" Pod="csi-node-driver-9jkrd" WorkloadEndpoint="localhost-k8s-csi--node--driver--9jkrd-" Mar 10 02:01:25.005172 containerd[1555]: 2026-03-10 02:01:23.597 [INFO][4600] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="a299fae18aecbb93ffb135400069352ee2ef39ea160c1e1744cd8e7b108d97b0" Namespace="calico-system" Pod="csi-node-driver-9jkrd" WorkloadEndpoint="localhost-k8s-csi--node--driver--9jkrd-eth0" Mar 10 02:01:25.005172 containerd[1555]: 2026-03-10 02:01:24.007 [INFO][4651] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="a299fae18aecbb93ffb135400069352ee2ef39ea160c1e1744cd8e7b108d97b0" HandleID="k8s-pod-network.a299fae18aecbb93ffb135400069352ee2ef39ea160c1e1744cd8e7b108d97b0" Workload="localhost-k8s-csi--node--driver--9jkrd-eth0" Mar 10 02:01:25.005172 containerd[1555]: 2026-03-10 02:01:24.076 [INFO][4651] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="a299fae18aecbb93ffb135400069352ee2ef39ea160c1e1744cd8e7b108d97b0" HandleID="k8s-pod-network.a299fae18aecbb93ffb135400069352ee2ef39ea160c1e1744cd8e7b108d97b0" Workload="localhost-k8s-csi--node--driver--9jkrd-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003e1ef0), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"csi-node-driver-9jkrd", "timestamp":"2026-03-10 02:01:24.007827383 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc0003dc2c0)} Mar 10 02:01:25.005172 containerd[1555]: 2026-03-10 02:01:24.077 [INFO][4651] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. 
Mar 10 02:01:25.005172 containerd[1555]: 2026-03-10 02:01:24.379 [INFO][4651] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 10 02:01:25.005172 containerd[1555]: 2026-03-10 02:01:24.392 [INFO][4651] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Mar 10 02:01:25.005172 containerd[1555]: 2026-03-10 02:01:24.411 [INFO][4651] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.a299fae18aecbb93ffb135400069352ee2ef39ea160c1e1744cd8e7b108d97b0" host="localhost" Mar 10 02:01:25.005172 containerd[1555]: 2026-03-10 02:01:24.479 [INFO][4651] ipam/ipam.go 409: Looking up existing affinities for host host="localhost" Mar 10 02:01:25.005172 containerd[1555]: 2026-03-10 02:01:24.547 [INFO][4651] ipam/ipam.go 526: Trying affinity for 192.168.88.128/26 host="localhost" Mar 10 02:01:25.005172 containerd[1555]: 2026-03-10 02:01:24.582 [INFO][4651] ipam/ipam.go 160: Attempting to load block cidr=192.168.88.128/26 host="localhost" Mar 10 02:01:25.005172 containerd[1555]: 2026-03-10 02:01:24.613 [INFO][4651] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Mar 10 02:01:25.005172 containerd[1555]: 2026-03-10 02:01:24.631 [INFO][4651] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.a299fae18aecbb93ffb135400069352ee2ef39ea160c1e1744cd8e7b108d97b0" host="localhost" Mar 10 02:01:25.005172 containerd[1555]: 2026-03-10 02:01:24.669 [INFO][4651] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.a299fae18aecbb93ffb135400069352ee2ef39ea160c1e1744cd8e7b108d97b0 Mar 10 02:01:25.005172 containerd[1555]: 2026-03-10 02:01:24.717 [INFO][4651] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.a299fae18aecbb93ffb135400069352ee2ef39ea160c1e1744cd8e7b108d97b0" host="localhost" Mar 10 02:01:25.005172 containerd[1555]: 2026-03-10 02:01:24.790 [INFO][4651] 
ipam/ipam.go 1288: Successfully claimed IPs: [192.168.88.134/26] block=192.168.88.128/26 handle="k8s-pod-network.a299fae18aecbb93ffb135400069352ee2ef39ea160c1e1744cd8e7b108d97b0" host="localhost" Mar 10 02:01:25.005172 containerd[1555]: 2026-03-10 02:01:24.791 [INFO][4651] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.88.134/26] handle="k8s-pod-network.a299fae18aecbb93ffb135400069352ee2ef39ea160c1e1744cd8e7b108d97b0" host="localhost" Mar 10 02:01:25.005172 containerd[1555]: 2026-03-10 02:01:24.803 [INFO][4651] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 10 02:01:25.005172 containerd[1555]: 2026-03-10 02:01:24.803 [INFO][4651] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.88.134/26] IPv6=[] ContainerID="a299fae18aecbb93ffb135400069352ee2ef39ea160c1e1744cd8e7b108d97b0" HandleID="k8s-pod-network.a299fae18aecbb93ffb135400069352ee2ef39ea160c1e1744cd8e7b108d97b0" Workload="localhost-k8s-csi--node--driver--9jkrd-eth0" Mar 10 02:01:25.007789 containerd[1555]: 2026-03-10 02:01:24.869 [INFO][4600] cni-plugin/k8s.go 418: Populated endpoint ContainerID="a299fae18aecbb93ffb135400069352ee2ef39ea160c1e1744cd8e7b108d97b0" Namespace="calico-system" Pod="csi-node-driver-9jkrd" WorkloadEndpoint="localhost-k8s-csi--node--driver--9jkrd-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--9jkrd-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"17109937-e605-4060-a474-8701d718389b", ResourceVersion:"810", Generation:0, CreationTimestamp:time.Date(2026, time.March, 10, 2, 0, 17, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"98cbb5577", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", 
"projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"csi-node-driver-9jkrd", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calid962a115249", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 10 02:01:25.007789 containerd[1555]: 2026-03-10 02:01:24.870 [INFO][4600] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.134/32] ContainerID="a299fae18aecbb93ffb135400069352ee2ef39ea160c1e1744cd8e7b108d97b0" Namespace="calico-system" Pod="csi-node-driver-9jkrd" WorkloadEndpoint="localhost-k8s-csi--node--driver--9jkrd-eth0" Mar 10 02:01:25.007789 containerd[1555]: 2026-03-10 02:01:24.870 [INFO][4600] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calid962a115249 ContainerID="a299fae18aecbb93ffb135400069352ee2ef39ea160c1e1744cd8e7b108d97b0" Namespace="calico-system" Pod="csi-node-driver-9jkrd" WorkloadEndpoint="localhost-k8s-csi--node--driver--9jkrd-eth0" Mar 10 02:01:25.007789 containerd[1555]: 2026-03-10 02:01:24.927 [INFO][4600] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="a299fae18aecbb93ffb135400069352ee2ef39ea160c1e1744cd8e7b108d97b0" Namespace="calico-system" Pod="csi-node-driver-9jkrd" WorkloadEndpoint="localhost-k8s-csi--node--driver--9jkrd-eth0" Mar 10 02:01:25.007789 containerd[1555]: 2026-03-10 02:01:24.932 [INFO][4600] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to 
endpoint ContainerID="a299fae18aecbb93ffb135400069352ee2ef39ea160c1e1744cd8e7b108d97b0" Namespace="calico-system" Pod="csi-node-driver-9jkrd" WorkloadEndpoint="localhost-k8s-csi--node--driver--9jkrd-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--9jkrd-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"17109937-e605-4060-a474-8701d718389b", ResourceVersion:"810", Generation:0, CreationTimestamp:time.Date(2026, time.March, 10, 2, 0, 17, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"98cbb5577", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"a299fae18aecbb93ffb135400069352ee2ef39ea160c1e1744cd8e7b108d97b0", Pod:"csi-node-driver-9jkrd", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calid962a115249", MAC:"76:2d:ab:96:65:86", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 10 02:01:25.007789 containerd[1555]: 2026-03-10 02:01:24.977 [INFO][4600] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="a299fae18aecbb93ffb135400069352ee2ef39ea160c1e1744cd8e7b108d97b0" 
Namespace="calico-system" Pod="csi-node-driver-9jkrd" WorkloadEndpoint="localhost-k8s-csi--node--driver--9jkrd-eth0" Mar 10 02:01:25.151797 systemd-networkd[1476]: calibe458655065: Gained IPv6LL Mar 10 02:01:25.215208 containerd[1555]: time="2026-03-10T02:01:25.213537733Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-d6c6bb9bb-bm4js,Uid:74613752-983e-41dc-b1b0-4ff0b2a9333f,Namespace:calico-system,Attempt:0,} returns sandbox id \"c1a8e35d4840fdfabd2a7b9da44764f4afec82586d2abd635466314a887c7673\"" Mar 10 02:01:25.272618 containerd[1555]: time="2026-03-10T02:01:25.267149589Z" level=info msg="connecting to shim a8c0778665fc38fd79cee24fa20dbaa87ccbc2ac8ae243c562182f22adec2bfa" address="unix:///run/containerd/s/dfbbb62fe9674279fbd11b8e9bfd64332f6890ae4ea0589247d02295b2be30a8" namespace=k8s.io protocol=ttrpc version=3 Mar 10 02:01:25.277402 systemd-networkd[1476]: calif7a6fe6202a: Gained IPv6LL Mar 10 02:01:25.286516 containerd[1555]: time="2026-03-10T02:01:25.285581554Z" level=info msg="CreateContainer within sandbox \"00c3a84c00ffe6dbd643412225af891102eb8210df0b3d8152719f4bb361ca5b\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"395bff113de88821ad41c3b059ba66f112a24a9a19864bf56dcb1495d80023db\"" Mar 10 02:01:25.302660 containerd[1555]: time="2026-03-10T02:01:25.299998132Z" level=info msg="StartContainer for \"395bff113de88821ad41c3b059ba66f112a24a9a19864bf56dcb1495d80023db\"" Mar 10 02:01:25.304297 containerd[1555]: time="2026-03-10T02:01:25.304255016Z" level=info msg="connecting to shim 395bff113de88821ad41c3b059ba66f112a24a9a19864bf56dcb1495d80023db" address="unix:///run/containerd/s/996f1ca71ec4e4541b17b4914857a434c6b6f7949e829500d5d88924f4814ab6" protocol=ttrpc version=3 Mar 10 02:01:25.389900 systemd-networkd[1476]: cali5bebcdfcd89: Link UP Mar 10 02:01:25.397951 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount512302246.mount: Deactivated successfully. 
Mar 10 02:01:25.404727 systemd-networkd[1476]: cali5bebcdfcd89: Gained carrier Mar 10 02:01:25.665573 containerd[1555]: 2026-03-10 02:01:24.655 [INFO][4774] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--76dfd768bc--9l6p8-eth0 calico-apiserver-76dfd768bc- calico-system a0d4a605-efd0-4424-86bb-3dd6bae830b7 1021 0 2026-03-10 02:00:15 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:76dfd768bc projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-76dfd768bc-9l6p8 eth0 calico-apiserver [] [] [kns.calico-system ksa.calico-system.calico-apiserver] cali5bebcdfcd89 [] [] }} ContainerID="f182e649fe601d6e3129bf82fd5a5bb5988f2fead7af470ca6c4028385b98b3f" Namespace="calico-system" Pod="calico-apiserver-76dfd768bc-9l6p8" WorkloadEndpoint="localhost-k8s-calico--apiserver--76dfd768bc--9l6p8-" Mar 10 02:01:25.665573 containerd[1555]: 2026-03-10 02:01:24.676 [INFO][4774] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="f182e649fe601d6e3129bf82fd5a5bb5988f2fead7af470ca6c4028385b98b3f" Namespace="calico-system" Pod="calico-apiserver-76dfd768bc-9l6p8" WorkloadEndpoint="localhost-k8s-calico--apiserver--76dfd768bc--9l6p8-eth0" Mar 10 02:01:25.665573 containerd[1555]: 2026-03-10 02:01:24.976 [INFO][4829] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="f182e649fe601d6e3129bf82fd5a5bb5988f2fead7af470ca6c4028385b98b3f" HandleID="k8s-pod-network.f182e649fe601d6e3129bf82fd5a5bb5988f2fead7af470ca6c4028385b98b3f" Workload="localhost-k8s-calico--apiserver--76dfd768bc--9l6p8-eth0" Mar 10 02:01:25.665573 containerd[1555]: 2026-03-10 02:01:24.997 [INFO][4829] ipam/ipam_plugin.go 301: Auto assigning IP 
ContainerID="f182e649fe601d6e3129bf82fd5a5bb5988f2fead7af470ca6c4028385b98b3f" HandleID="k8s-pod-network.f182e649fe601d6e3129bf82fd5a5bb5988f2fead7af470ca6c4028385b98b3f" Workload="localhost-k8s-calico--apiserver--76dfd768bc--9l6p8-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002bdd90), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"calico-apiserver-76dfd768bc-9l6p8", "timestamp":"2026-03-10 02:01:24.976525528 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc000378c60)} Mar 10 02:01:25.665573 containerd[1555]: 2026-03-10 02:01:24.997 [INFO][4829] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 10 02:01:25.665573 containerd[1555]: 2026-03-10 02:01:24.997 [INFO][4829] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 10 02:01:25.665573 containerd[1555]: 2026-03-10 02:01:24.997 [INFO][4829] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Mar 10 02:01:25.665573 containerd[1555]: 2026-03-10 02:01:25.010 [INFO][4829] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.f182e649fe601d6e3129bf82fd5a5bb5988f2fead7af470ca6c4028385b98b3f" host="localhost" Mar 10 02:01:25.665573 containerd[1555]: 2026-03-10 02:01:25.043 [INFO][4829] ipam/ipam.go 409: Looking up existing affinities for host host="localhost" Mar 10 02:01:25.665573 containerd[1555]: 2026-03-10 02:01:25.097 [INFO][4829] ipam/ipam.go 526: Trying affinity for 192.168.88.128/26 host="localhost" Mar 10 02:01:25.665573 containerd[1555]: 2026-03-10 02:01:25.104 [INFO][4829] ipam/ipam.go 160: Attempting to load block cidr=192.168.88.128/26 host="localhost" Mar 10 02:01:25.665573 containerd[1555]: 2026-03-10 02:01:25.119 [INFO][4829] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Mar 10 02:01:25.665573 containerd[1555]: 2026-03-10 02:01:25.121 [INFO][4829] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.f182e649fe601d6e3129bf82fd5a5bb5988f2fead7af470ca6c4028385b98b3f" host="localhost" Mar 10 02:01:25.665573 containerd[1555]: 2026-03-10 02:01:25.129 [INFO][4829] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.f182e649fe601d6e3129bf82fd5a5bb5988f2fead7af470ca6c4028385b98b3f Mar 10 02:01:25.665573 containerd[1555]: 2026-03-10 02:01:25.165 [INFO][4829] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.f182e649fe601d6e3129bf82fd5a5bb5988f2fead7af470ca6c4028385b98b3f" host="localhost" Mar 10 02:01:25.665573 containerd[1555]: 2026-03-10 02:01:25.248 [INFO][4829] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.88.135/26] block=192.168.88.128/26 
handle="k8s-pod-network.f182e649fe601d6e3129bf82fd5a5bb5988f2fead7af470ca6c4028385b98b3f" host="localhost" Mar 10 02:01:25.665573 containerd[1555]: 2026-03-10 02:01:25.248 [INFO][4829] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.88.135/26] handle="k8s-pod-network.f182e649fe601d6e3129bf82fd5a5bb5988f2fead7af470ca6c4028385b98b3f" host="localhost" Mar 10 02:01:25.665573 containerd[1555]: 2026-03-10 02:01:25.248 [INFO][4829] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 10 02:01:25.665573 containerd[1555]: 2026-03-10 02:01:25.248 [INFO][4829] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.88.135/26] IPv6=[] ContainerID="f182e649fe601d6e3129bf82fd5a5bb5988f2fead7af470ca6c4028385b98b3f" HandleID="k8s-pod-network.f182e649fe601d6e3129bf82fd5a5bb5988f2fead7af470ca6c4028385b98b3f" Workload="localhost-k8s-calico--apiserver--76dfd768bc--9l6p8-eth0" Mar 10 02:01:25.666884 containerd[1555]: 2026-03-10 02:01:25.297 [INFO][4774] cni-plugin/k8s.go 418: Populated endpoint ContainerID="f182e649fe601d6e3129bf82fd5a5bb5988f2fead7af470ca6c4028385b98b3f" Namespace="calico-system" Pod="calico-apiserver-76dfd768bc-9l6p8" WorkloadEndpoint="localhost-k8s-calico--apiserver--76dfd768bc--9l6p8-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--76dfd768bc--9l6p8-eth0", GenerateName:"calico-apiserver-76dfd768bc-", Namespace:"calico-system", SelfLink:"", UID:"a0d4a605-efd0-4424-86bb-3dd6bae830b7", ResourceVersion:"1021", Generation:0, CreationTimestamp:time.Date(2026, time.March, 10, 2, 0, 15, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"76dfd768bc", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", 
"projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-76dfd768bc-9l6p8", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali5bebcdfcd89", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 10 02:01:25.666884 containerd[1555]: 2026-03-10 02:01:25.298 [INFO][4774] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.135/32] ContainerID="f182e649fe601d6e3129bf82fd5a5bb5988f2fead7af470ca6c4028385b98b3f" Namespace="calico-system" Pod="calico-apiserver-76dfd768bc-9l6p8" WorkloadEndpoint="localhost-k8s-calico--apiserver--76dfd768bc--9l6p8-eth0" Mar 10 02:01:25.666884 containerd[1555]: 2026-03-10 02:01:25.298 [INFO][4774] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali5bebcdfcd89 ContainerID="f182e649fe601d6e3129bf82fd5a5bb5988f2fead7af470ca6c4028385b98b3f" Namespace="calico-system" Pod="calico-apiserver-76dfd768bc-9l6p8" WorkloadEndpoint="localhost-k8s-calico--apiserver--76dfd768bc--9l6p8-eth0" Mar 10 02:01:25.666884 containerd[1555]: 2026-03-10 02:01:25.394 [INFO][4774] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="f182e649fe601d6e3129bf82fd5a5bb5988f2fead7af470ca6c4028385b98b3f" Namespace="calico-system" Pod="calico-apiserver-76dfd768bc-9l6p8" WorkloadEndpoint="localhost-k8s-calico--apiserver--76dfd768bc--9l6p8-eth0" Mar 10 02:01:25.666884 containerd[1555]: 2026-03-10 02:01:25.417 [INFO][4774] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to 
endpoint ContainerID="f182e649fe601d6e3129bf82fd5a5bb5988f2fead7af470ca6c4028385b98b3f" Namespace="calico-system" Pod="calico-apiserver-76dfd768bc-9l6p8" WorkloadEndpoint="localhost-k8s-calico--apiserver--76dfd768bc--9l6p8-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--76dfd768bc--9l6p8-eth0", GenerateName:"calico-apiserver-76dfd768bc-", Namespace:"calico-system", SelfLink:"", UID:"a0d4a605-efd0-4424-86bb-3dd6bae830b7", ResourceVersion:"1021", Generation:0, CreationTimestamp:time.Date(2026, time.March, 10, 2, 0, 15, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"76dfd768bc", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"f182e649fe601d6e3129bf82fd5a5bb5988f2fead7af470ca6c4028385b98b3f", Pod:"calico-apiserver-76dfd768bc-9l6p8", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali5bebcdfcd89", MAC:"da:a4:d4:9a:0b:91", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 10 02:01:25.666884 containerd[1555]: 2026-03-10 02:01:25.597 [INFO][4774] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="f182e649fe601d6e3129bf82fd5a5bb5988f2fead7af470ca6c4028385b98b3f" Namespace="calico-system" Pod="calico-apiserver-76dfd768bc-9l6p8" WorkloadEndpoint="localhost-k8s-calico--apiserver--76dfd768bc--9l6p8-eth0" Mar 10 02:01:25.673217 systemd[1]: Started cri-containerd-395bff113de88821ad41c3b059ba66f112a24a9a19864bf56dcb1495d80023db.scope - libcontainer container 395bff113de88821ad41c3b059ba66f112a24a9a19864bf56dcb1495d80023db. Mar 10 02:01:25.732332 systemd-networkd[1476]: cali5b409cd6ec0: Gained IPv6LL Mar 10 02:01:25.759260 systemd[1]: Started cri-containerd-a8c0778665fc38fd79cee24fa20dbaa87ccbc2ac8ae243c562182f22adec2bfa.scope - libcontainer container a8c0778665fc38fd79cee24fa20dbaa87ccbc2ac8ae243c562182f22adec2bfa. Mar 10 02:01:25.829005 containerd[1555]: time="2026-03-10T02:01:25.828861232Z" level=info msg="connecting to shim a299fae18aecbb93ffb135400069352ee2ef39ea160c1e1744cd8e7b108d97b0" address="unix:///run/containerd/s/11b1501fd1084aa3c224449330c8cd6ca4afe67761418bd33cb3b936a96b00b8" namespace=k8s.io protocol=ttrpc version=3 Mar 10 02:01:25.988569 systemd-networkd[1476]: calid9166b5070b: Link UP Mar 10 02:01:25.994216 systemd-networkd[1476]: calid9166b5070b: Gained carrier Mar 10 02:01:26.066367 systemd-resolved[1481]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Mar 10 02:01:26.119270 containerd[1555]: 2026-03-10 02:01:24.853 [INFO][4776] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-goldmane--cccfbd5cf--kt2sm-eth0 goldmane-cccfbd5cf- calico-system c169f044-feb9-4347-930b-a968995715e2 1018 0 2026-03-10 02:00:15 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:cccfbd5cf projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s localhost goldmane-cccfbd5cf-kt2sm eth0 goldmane [] [] [kns.calico-system 
ksa.calico-system.goldmane] calid9166b5070b [] [] }} ContainerID="9b364df2c79b5ca5172422cece5543efd9271a21f41c029d4d6913b9db43aeb0" Namespace="calico-system" Pod="goldmane-cccfbd5cf-kt2sm" WorkloadEndpoint="localhost-k8s-goldmane--cccfbd5cf--kt2sm-" Mar 10 02:01:26.119270 containerd[1555]: 2026-03-10 02:01:24.855 [INFO][4776] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="9b364df2c79b5ca5172422cece5543efd9271a21f41c029d4d6913b9db43aeb0" Namespace="calico-system" Pod="goldmane-cccfbd5cf-kt2sm" WorkloadEndpoint="localhost-k8s-goldmane--cccfbd5cf--kt2sm-eth0" Mar 10 02:01:26.119270 containerd[1555]: 2026-03-10 02:01:25.025 [INFO][4835] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="9b364df2c79b5ca5172422cece5543efd9271a21f41c029d4d6913b9db43aeb0" HandleID="k8s-pod-network.9b364df2c79b5ca5172422cece5543efd9271a21f41c029d4d6913b9db43aeb0" Workload="localhost-k8s-goldmane--cccfbd5cf--kt2sm-eth0" Mar 10 02:01:26.119270 containerd[1555]: 2026-03-10 02:01:25.056 [INFO][4835] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="9b364df2c79b5ca5172422cece5543efd9271a21f41c029d4d6913b9db43aeb0" HandleID="k8s-pod-network.9b364df2c79b5ca5172422cece5543efd9271a21f41c029d4d6913b9db43aeb0" Workload="localhost-k8s-goldmane--cccfbd5cf--kt2sm-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004ff60), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"goldmane-cccfbd5cf-kt2sm", "timestamp":"2026-03-10 02:01:25.025640672 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc00037c6e0)} Mar 10 02:01:26.119270 containerd[1555]: 2026-03-10 02:01:25.060 [INFO][4835] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. 
Mar 10 02:01:26.119270 containerd[1555]: 2026-03-10 02:01:25.253 [INFO][4835] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 10 02:01:26.119270 containerd[1555]: 2026-03-10 02:01:25.253 [INFO][4835] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Mar 10 02:01:26.119270 containerd[1555]: 2026-03-10 02:01:25.287 [INFO][4835] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.9b364df2c79b5ca5172422cece5543efd9271a21f41c029d4d6913b9db43aeb0" host="localhost" Mar 10 02:01:26.119270 containerd[1555]: 2026-03-10 02:01:25.507 [INFO][4835] ipam/ipam.go 409: Looking up existing affinities for host host="localhost" Mar 10 02:01:26.119270 containerd[1555]: 2026-03-10 02:01:25.634 [INFO][4835] ipam/ipam.go 526: Trying affinity for 192.168.88.128/26 host="localhost" Mar 10 02:01:26.119270 containerd[1555]: 2026-03-10 02:01:25.681 [INFO][4835] ipam/ipam.go 160: Attempting to load block cidr=192.168.88.128/26 host="localhost" Mar 10 02:01:26.119270 containerd[1555]: 2026-03-10 02:01:25.732 [INFO][4835] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Mar 10 02:01:26.119270 containerd[1555]: 2026-03-10 02:01:25.758 [INFO][4835] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.9b364df2c79b5ca5172422cece5543efd9271a21f41c029d4d6913b9db43aeb0" host="localhost" Mar 10 02:01:26.119270 containerd[1555]: 2026-03-10 02:01:25.840 [INFO][4835] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.9b364df2c79b5ca5172422cece5543efd9271a21f41c029d4d6913b9db43aeb0 Mar 10 02:01:26.119270 containerd[1555]: 2026-03-10 02:01:25.905 [INFO][4835] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.9b364df2c79b5ca5172422cece5543efd9271a21f41c029d4d6913b9db43aeb0" host="localhost" Mar 10 02:01:26.119270 containerd[1555]: 2026-03-10 02:01:25.962 [INFO][4835] 
ipam/ipam.go 1288: Successfully claimed IPs: [192.168.88.136/26] block=192.168.88.128/26 handle="k8s-pod-network.9b364df2c79b5ca5172422cece5543efd9271a21f41c029d4d6913b9db43aeb0" host="localhost" Mar 10 02:01:26.119270 containerd[1555]: 2026-03-10 02:01:25.963 [INFO][4835] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.88.136/26] handle="k8s-pod-network.9b364df2c79b5ca5172422cece5543efd9271a21f41c029d4d6913b9db43aeb0" host="localhost" Mar 10 02:01:26.119270 containerd[1555]: 2026-03-10 02:01:25.963 [INFO][4835] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 10 02:01:26.119270 containerd[1555]: 2026-03-10 02:01:25.963 [INFO][4835] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.88.136/26] IPv6=[] ContainerID="9b364df2c79b5ca5172422cece5543efd9271a21f41c029d4d6913b9db43aeb0" HandleID="k8s-pod-network.9b364df2c79b5ca5172422cece5543efd9271a21f41c029d4d6913b9db43aeb0" Workload="localhost-k8s-goldmane--cccfbd5cf--kt2sm-eth0" Mar 10 02:01:26.120210 containerd[1555]: 2026-03-10 02:01:25.976 [INFO][4776] cni-plugin/k8s.go 418: Populated endpoint ContainerID="9b364df2c79b5ca5172422cece5543efd9271a21f41c029d4d6913b9db43aeb0" Namespace="calico-system" Pod="goldmane-cccfbd5cf-kt2sm" WorkloadEndpoint="localhost-k8s-goldmane--cccfbd5cf--kt2sm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-goldmane--cccfbd5cf--kt2sm-eth0", GenerateName:"goldmane-cccfbd5cf-", Namespace:"calico-system", SelfLink:"", UID:"c169f044-feb9-4347-930b-a968995715e2", ResourceVersion:"1018", Generation:0, CreationTimestamp:time.Date(2026, time.March, 10, 2, 0, 15, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"cccfbd5cf", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", 
"projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"goldmane-cccfbd5cf-kt2sm", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.88.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calid9166b5070b", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 10 02:01:26.120210 containerd[1555]: 2026-03-10 02:01:25.976 [INFO][4776] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.136/32] ContainerID="9b364df2c79b5ca5172422cece5543efd9271a21f41c029d4d6913b9db43aeb0" Namespace="calico-system" Pod="goldmane-cccfbd5cf-kt2sm" WorkloadEndpoint="localhost-k8s-goldmane--cccfbd5cf--kt2sm-eth0" Mar 10 02:01:26.120210 containerd[1555]: 2026-03-10 02:01:25.976 [INFO][4776] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calid9166b5070b ContainerID="9b364df2c79b5ca5172422cece5543efd9271a21f41c029d4d6913b9db43aeb0" Namespace="calico-system" Pod="goldmane-cccfbd5cf-kt2sm" WorkloadEndpoint="localhost-k8s-goldmane--cccfbd5cf--kt2sm-eth0" Mar 10 02:01:26.120210 containerd[1555]: 2026-03-10 02:01:25.992 [INFO][4776] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="9b364df2c79b5ca5172422cece5543efd9271a21f41c029d4d6913b9db43aeb0" Namespace="calico-system" Pod="goldmane-cccfbd5cf-kt2sm" WorkloadEndpoint="localhost-k8s-goldmane--cccfbd5cf--kt2sm-eth0" Mar 10 02:01:26.120210 containerd[1555]: 2026-03-10 02:01:26.003 [INFO][4776] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="9b364df2c79b5ca5172422cece5543efd9271a21f41c029d4d6913b9db43aeb0" 
Namespace="calico-system" Pod="goldmane-cccfbd5cf-kt2sm" WorkloadEndpoint="localhost-k8s-goldmane--cccfbd5cf--kt2sm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-goldmane--cccfbd5cf--kt2sm-eth0", GenerateName:"goldmane-cccfbd5cf-", Namespace:"calico-system", SelfLink:"", UID:"c169f044-feb9-4347-930b-a968995715e2", ResourceVersion:"1018", Generation:0, CreationTimestamp:time.Date(2026, time.March, 10, 2, 0, 15, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"cccfbd5cf", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"9b364df2c79b5ca5172422cece5543efd9271a21f41c029d4d6913b9db43aeb0", Pod:"goldmane-cccfbd5cf-kt2sm", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.88.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calid9166b5070b", MAC:"3a:46:20:41:de:9c", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 10 02:01:26.120210 containerd[1555]: 2026-03-10 02:01:26.074 [INFO][4776] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="9b364df2c79b5ca5172422cece5543efd9271a21f41c029d4d6913b9db43aeb0" Namespace="calico-system" Pod="goldmane-cccfbd5cf-kt2sm" WorkloadEndpoint="localhost-k8s-goldmane--cccfbd5cf--kt2sm-eth0" Mar 10 02:01:26.125259 containerd[1555]: 
time="2026-03-10T02:01:26.117887825Z" level=info msg="connecting to shim f182e649fe601d6e3129bf82fd5a5bb5988f2fead7af470ca6c4028385b98b3f" address="unix:///run/containerd/s/00675a39ac826eb64cc777eabd6bf10441212da9965e03f0fd93b68426e7a65b" namespace=k8s.io protocol=ttrpc version=3 Mar 10 02:01:26.215744 containerd[1555]: time="2026-03-10T02:01:26.215648965Z" level=info msg="StartContainer for \"395bff113de88821ad41c3b059ba66f112a24a9a19864bf56dcb1495d80023db\" returns successfully" Mar 10 02:01:26.250840 systemd[1]: Started cri-containerd-a299fae18aecbb93ffb135400069352ee2ef39ea160c1e1744cd8e7b108d97b0.scope - libcontainer container a299fae18aecbb93ffb135400069352ee2ef39ea160c1e1744cd8e7b108d97b0. Mar 10 02:01:26.363163 containerd[1555]: time="2026-03-10T02:01:26.363046706Z" level=info msg="connecting to shim 9b364df2c79b5ca5172422cece5543efd9271a21f41c029d4d6913b9db43aeb0" address="unix:///run/containerd/s/96e080018239e13107ef4e7dea8f3a3683647d778c0a0e86ae0ef80446fdd7e9" namespace=k8s.io protocol=ttrpc version=3 Mar 10 02:01:26.371611 systemd[1]: Started cri-containerd-f182e649fe601d6e3129bf82fd5a5bb5988f2fead7af470ca6c4028385b98b3f.scope - libcontainer container f182e649fe601d6e3129bf82fd5a5bb5988f2fead7af470ca6c4028385b98b3f. 
Mar 10 02:01:26.544352 systemd-resolved[1481]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Mar 10 02:01:26.557933 containerd[1555]: time="2026-03-10T02:01:26.544098471Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-9wb7f,Uid:f5a2970b-ee52-4628-9153-e736ecce6760,Namespace:kube-system,Attempt:0,} returns sandbox id \"a8c0778665fc38fd79cee24fa20dbaa87ccbc2ac8ae243c562182f22adec2bfa\"" Mar 10 02:01:26.624720 systemd-resolved[1481]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Mar 10 02:01:26.648674 containerd[1555]: time="2026-03-10T02:01:26.647147000Z" level=info msg="CreateContainer within sandbox \"a8c0778665fc38fd79cee24fa20dbaa87ccbc2ac8ae243c562182f22adec2bfa\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Mar 10 02:01:26.675646 systemd[1]: Started cri-containerd-9b364df2c79b5ca5172422cece5543efd9271a21f41c029d4d6913b9db43aeb0.scope - libcontainer container 9b364df2c79b5ca5172422cece5543efd9271a21f41c029d4d6913b9db43aeb0. Mar 10 02:01:26.690098 systemd-networkd[1476]: calid962a115249: Gained IPv6LL Mar 10 02:01:26.760806 containerd[1555]: time="2026-03-10T02:01:26.760686721Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-9jkrd,Uid:17109937-e605-4060-a474-8701d718389b,Namespace:calico-system,Attempt:0,} returns sandbox id \"a299fae18aecbb93ffb135400069352ee2ef39ea160c1e1744cd8e7b108d97b0\"" Mar 10 02:01:26.811712 containerd[1555]: time="2026-03-10T02:01:26.810380877Z" level=info msg="Container 5edf53229e5e2a7819c7d088ff57fae08b730c9012e1102aabab9fbf900dc192: CDI devices from CRI Config.CDIDevices: []" Mar 10 02:01:26.823111 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1899892809.mount: Deactivated successfully. 
Mar 10 02:01:26.878373 containerd[1555]: time="2026-03-10T02:01:26.878196831Z" level=info msg="CreateContainer within sandbox \"a8c0778665fc38fd79cee24fa20dbaa87ccbc2ac8ae243c562182f22adec2bfa\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"5edf53229e5e2a7819c7d088ff57fae08b730c9012e1102aabab9fbf900dc192\"" Mar 10 02:01:26.880080 containerd[1555]: time="2026-03-10T02:01:26.879820759Z" level=info msg="StartContainer for \"5edf53229e5e2a7819c7d088ff57fae08b730c9012e1102aabab9fbf900dc192\"" Mar 10 02:01:26.883930 containerd[1555]: time="2026-03-10T02:01:26.883896944Z" level=info msg="connecting to shim 5edf53229e5e2a7819c7d088ff57fae08b730c9012e1102aabab9fbf900dc192" address="unix:///run/containerd/s/dfbbb62fe9674279fbd11b8e9bfd64332f6890ae4ea0589247d02295b2be30a8" protocol=ttrpc version=3 Mar 10 02:01:26.893872 systemd-resolved[1481]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Mar 10 02:01:26.912495 containerd[1555]: time="2026-03-10T02:01:26.911935668Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-76dfd768bc-9l6p8,Uid:a0d4a605-efd0-4424-86bb-3dd6bae830b7,Namespace:calico-system,Attempt:0,} returns sandbox id \"f182e649fe601d6e3129bf82fd5a5bb5988f2fead7af470ca6c4028385b98b3f\"" Mar 10 02:01:26.982839 systemd[1]: Started cri-containerd-5edf53229e5e2a7819c7d088ff57fae08b730c9012e1102aabab9fbf900dc192.scope - libcontainer container 5edf53229e5e2a7819c7d088ff57fae08b730c9012e1102aabab9fbf900dc192. 
Mar 10 02:01:27.078787 containerd[1555]: time="2026-03-10T02:01:27.078567018Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-cccfbd5cf-kt2sm,Uid:c169f044-feb9-4347-930b-a968995715e2,Namespace:calico-system,Attempt:0,} returns sandbox id \"9b364df2c79b5ca5172422cece5543efd9271a21f41c029d4d6913b9db43aeb0\"" Mar 10 02:01:27.259604 systemd-networkd[1476]: calid9166b5070b: Gained IPv6LL Mar 10 02:01:27.388310 systemd-networkd[1476]: cali5bebcdfcd89: Gained IPv6LL Mar 10 02:01:27.400264 containerd[1555]: time="2026-03-10T02:01:27.400080306Z" level=info msg="StartContainer for \"5edf53229e5e2a7819c7d088ff57fae08b730c9012e1102aabab9fbf900dc192\" returns successfully" Mar 10 02:01:27.564103 kubelet[2820]: I0310 02:01:27.563319 2820 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-66bc5c9577-w7tcg" podStartSLOduration=120.563296143 podStartE2EDuration="2m0.563296143s" podCreationTimestamp="2026-03-10 01:59:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 02:01:27.355521773 +0000 UTC m=+123.794135684" watchObservedRunningTime="2026-03-10 02:01:27.563296143 +0000 UTC m=+124.001910045" Mar 10 02:01:28.531432 kubelet[2820]: I0310 02:01:28.531309 2820 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-66bc5c9577-9wb7f" podStartSLOduration=122.531286648 podStartE2EDuration="2m2.531286648s" podCreationTimestamp="2026-03-10 01:59:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 02:01:28.410389076 +0000 UTC m=+124.849002997" watchObservedRunningTime="2026-03-10 02:01:28.531286648 +0000 UTC m=+124.969900550" Mar 10 02:01:33.942734 containerd[1555]: time="2026-03-10T02:01:33.939253609Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.31.4\" 
labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 10 02:01:33.943527 containerd[1555]: time="2026-03-10T02:01:33.942831511Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.31.4: active requests=0, bytes read=48415780" Mar 10 02:01:33.947008 containerd[1555]: time="2026-03-10T02:01:33.946843649Z" level=info msg="ImageCreate event name:\"sha256:f7ff80340b9b4973ceda29859065985831588b2898f2b4009f742b5789010898\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 10 02:01:33.958858 containerd[1555]: time="2026-03-10T02:01:33.957706815Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:d212af1da3dd52a633bc9e36653a7d901d95a570f8d51d1968a837dcf6879730\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 10 02:01:33.962736 containerd[1555]: time="2026-03-10T02:01:33.962236568Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" with image id \"sha256:f7ff80340b9b4973ceda29859065985831588b2898f2b4009f742b5789010898\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:d212af1da3dd52a633bc9e36653a7d901d95a570f8d51d1968a837dcf6879730\", size \"49971841\" in 11.195604743s" Mar 10 02:01:33.962736 containerd[1555]: time="2026-03-10T02:01:33.962365111Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" returns image reference \"sha256:f7ff80340b9b4973ceda29859065985831588b2898f2b4009f742b5789010898\"" Mar 10 02:01:33.981597 containerd[1555]: time="2026-03-10T02:01:33.981047787Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\"" Mar 10 02:01:34.005615 containerd[1555]: time="2026-03-10T02:01:34.003242007Z" level=info msg="CreateContainer within sandbox \"835fa23733b6277648529a87797635ab7befb5030b1001b1d1fc15d595416850\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Mar 10 02:01:34.113425 containerd[1555]: 
time="2026-03-10T02:01:34.104970415Z" level=info msg="Container 627fe4169fb636e72762a3b8a22ad541d62dca2397c309455e694dc8b16a481a: CDI devices from CRI Config.CDIDevices: []" Mar 10 02:01:34.151652 containerd[1555]: time="2026-03-10T02:01:34.149239622Z" level=info msg="CreateContainer within sandbox \"835fa23733b6277648529a87797635ab7befb5030b1001b1d1fc15d595416850\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"627fe4169fb636e72762a3b8a22ad541d62dca2397c309455e694dc8b16a481a\"" Mar 10 02:01:34.164886 containerd[1555]: time="2026-03-10T02:01:34.157937039Z" level=info msg="StartContainer for \"627fe4169fb636e72762a3b8a22ad541d62dca2397c309455e694dc8b16a481a\"" Mar 10 02:01:34.164886 containerd[1555]: time="2026-03-10T02:01:34.163947620Z" level=info msg="connecting to shim 627fe4169fb636e72762a3b8a22ad541d62dca2397c309455e694dc8b16a481a" address="unix:///run/containerd/s/4467b9c8f50c41bdf15456e0b50b79296f7cc1da228e4a257274d074f3abcf3f" protocol=ttrpc version=3 Mar 10 02:01:34.326063 systemd[1]: Started cri-containerd-627fe4169fb636e72762a3b8a22ad541d62dca2397c309455e694dc8b16a481a.scope - libcontainer container 627fe4169fb636e72762a3b8a22ad541d62dca2397c309455e694dc8b16a481a. 
Mar 10 02:01:34.734569 containerd[1555]: time="2026-03-10T02:01:34.733660425Z" level=info msg="StartContainer for \"627fe4169fb636e72762a3b8a22ad541d62dca2397c309455e694dc8b16a481a\" returns successfully" Mar 10 02:01:35.652537 kubelet[2820]: I0310 02:01:35.650646 2820 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-apiserver-76dfd768bc-pv7bx" podStartSLOduration=69.437892776 podStartE2EDuration="1m20.650547732s" podCreationTimestamp="2026-03-10 02:00:15 +0000 UTC" firstStartedPulling="2026-03-10 02:01:22.760148096 +0000 UTC m=+119.198761997" lastFinishedPulling="2026-03-10 02:01:33.972803042 +0000 UTC m=+130.411416953" observedRunningTime="2026-03-10 02:01:35.644242456 +0000 UTC m=+132.082856377" watchObservedRunningTime="2026-03-10 02:01:35.650547732 +0000 UTC m=+132.089161663" Mar 10 02:01:46.417114 containerd[1555]: time="2026-03-10T02:01:46.413913020Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 10 02:01:46.419119 containerd[1555]: time="2026-03-10T02:01:46.418057944Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.31.4: active requests=0, bytes read=52406348" Mar 10 02:01:46.423098 containerd[1555]: time="2026-03-10T02:01:46.419910489Z" level=info msg="ImageCreate event name:\"sha256:ff033cc89dab51090bfa1b04e155a5ce1e3b1f324f74b7b2be0dd6f0b6b10e89\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 10 02:01:46.431142 containerd[1555]: time="2026-03-10T02:01:46.429165092Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:99b8bb50141ca55b4b6ddfcf2f2fbde838265508ab2ac96ed08e72cd39800713\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 10 02:01:46.431142 containerd[1555]: time="2026-03-10T02:01:46.430342403Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\" with image id 
\"sha256:ff033cc89dab51090bfa1b04e155a5ce1e3b1f324f74b7b2be0dd6f0b6b10e89\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:99b8bb50141ca55b4b6ddfcf2f2fbde838265508ab2ac96ed08e72cd39800713\", size \"53962361\" in 12.449243924s" Mar 10 02:01:46.431142 containerd[1555]: time="2026-03-10T02:01:46.430379351Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\" returns image reference \"sha256:ff033cc89dab51090bfa1b04e155a5ce1e3b1f324f74b7b2be0dd6f0b6b10e89\"" Mar 10 02:01:46.447516 containerd[1555]: time="2026-03-10T02:01:46.446610802Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.31.4\"" Mar 10 02:01:46.522556 containerd[1555]: time="2026-03-10T02:01:46.520914286Z" level=info msg="CreateContainer within sandbox \"c1a8e35d4840fdfabd2a7b9da44764f4afec82586d2abd635466314a887c7673\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Mar 10 02:01:46.603527 containerd[1555]: time="2026-03-10T02:01:46.600552253Z" level=info msg="Container 31658143da66d55044e3cfc52ffe5351ddc99906ff9c0280ce7583b13307b687: CDI devices from CRI Config.CDIDevices: []" Mar 10 02:01:46.679081 containerd[1555]: time="2026-03-10T02:01:46.677862319Z" level=info msg="CreateContainer within sandbox \"c1a8e35d4840fdfabd2a7b9da44764f4afec82586d2abd635466314a887c7673\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"31658143da66d55044e3cfc52ffe5351ddc99906ff9c0280ce7583b13307b687\"" Mar 10 02:01:46.691338 containerd[1555]: time="2026-03-10T02:01:46.688388054Z" level=info msg="StartContainer for \"31658143da66d55044e3cfc52ffe5351ddc99906ff9c0280ce7583b13307b687\"" Mar 10 02:01:46.712340 containerd[1555]: time="2026-03-10T02:01:46.711762530Z" level=info msg="connecting to shim 31658143da66d55044e3cfc52ffe5351ddc99906ff9c0280ce7583b13307b687" 
address="unix:///run/containerd/s/a09e6682ed37611db0d3f174897ef8f0bb92a57542c955609970a321e7d9092c" protocol=ttrpc version=3 Mar 10 02:01:46.834709 systemd[1]: Started cri-containerd-31658143da66d55044e3cfc52ffe5351ddc99906ff9c0280ce7583b13307b687.scope - libcontainer container 31658143da66d55044e3cfc52ffe5351ddc99906ff9c0280ce7583b13307b687. Mar 10 02:01:47.198213 containerd[1555]: time="2026-03-10T02:01:47.198042759Z" level=info msg="StartContainer for \"31658143da66d55044e3cfc52ffe5351ddc99906ff9c0280ce7583b13307b687\" returns successfully" Mar 10 02:01:47.888613 kubelet[2820]: I0310 02:01:47.884375 2820 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-d6c6bb9bb-bm4js" podStartSLOduration=69.670321704 podStartE2EDuration="1m30.884358251s" podCreationTimestamp="2026-03-10 02:00:17 +0000 UTC" firstStartedPulling="2026-03-10 02:01:25.222737088 +0000 UTC m=+121.661350989" lastFinishedPulling="2026-03-10 02:01:46.436773635 +0000 UTC m=+142.875387536" observedRunningTime="2026-03-10 02:01:47.88420458 +0000 UTC m=+144.322818491" watchObservedRunningTime="2026-03-10 02:01:47.884358251 +0000 UTC m=+144.322972152" Mar 10 02:01:48.306762 containerd[1555]: time="2026-03-10T02:01:48.306138851Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 10 02:01:48.309248 containerd[1555]: time="2026-03-10T02:01:48.307428140Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.31.4: active requests=0, bytes read=8792502" Mar 10 02:01:48.311426 containerd[1555]: time="2026-03-10T02:01:48.311340623Z" level=info msg="ImageCreate event name:\"sha256:4c8cd7d0b10a4df64a5bd90e9845e9d1edbe0e37c2ebfc171bb28698e07abf72\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 10 02:01:48.318214 containerd[1555]: time="2026-03-10T02:01:48.318138763Z" level=info msg="ImageCreate event 
name:\"ghcr.io/flatcar/calico/csi@sha256:ab57dd6f8423ef7b3ff382bf4ca5ace6063bdca77d441d852c75ec58847dd280\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 10 02:01:48.336850 containerd[1555]: time="2026-03-10T02:01:48.336159761Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.31.4\" with image id \"sha256:4c8cd7d0b10a4df64a5bd90e9845e9d1edbe0e37c2ebfc171bb28698e07abf72\", repo tag \"ghcr.io/flatcar/calico/csi:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:ab57dd6f8423ef7b3ff382bf4ca5ace6063bdca77d441d852c75ec58847dd280\", size \"10348547\" in 1.889461097s" Mar 10 02:01:48.336850 containerd[1555]: time="2026-03-10T02:01:48.336240007Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.31.4\" returns image reference \"sha256:4c8cd7d0b10a4df64a5bd90e9845e9d1edbe0e37c2ebfc171bb28698e07abf72\"" Mar 10 02:01:48.350241 containerd[1555]: time="2026-03-10T02:01:48.349400620Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\"" Mar 10 02:01:48.359213 containerd[1555]: time="2026-03-10T02:01:48.359133531Z" level=info msg="CreateContainer within sandbox \"a299fae18aecbb93ffb135400069352ee2ef39ea160c1e1744cd8e7b108d97b0\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Mar 10 02:01:48.426522 containerd[1555]: time="2026-03-10T02:01:48.422440027Z" level=info msg="Container dfccd4b184f63237091f414b7e1157424e9432639e41c459e120dc6a4c456311: CDI devices from CRI Config.CDIDevices: []" Mar 10 02:01:48.426615 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3531328773.mount: Deactivated successfully. 
Mar 10 02:01:48.447852 containerd[1555]: time="2026-03-10T02:01:48.447763394Z" level=info msg="CreateContainer within sandbox \"a299fae18aecbb93ffb135400069352ee2ef39ea160c1e1744cd8e7b108d97b0\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"dfccd4b184f63237091f414b7e1157424e9432639e41c459e120dc6a4c456311\"" Mar 10 02:01:48.452408 containerd[1555]: time="2026-03-10T02:01:48.452340741Z" level=info msg="StartContainer for \"dfccd4b184f63237091f414b7e1157424e9432639e41c459e120dc6a4c456311\"" Mar 10 02:01:48.457782 containerd[1555]: time="2026-03-10T02:01:48.457676024Z" level=info msg="connecting to shim dfccd4b184f63237091f414b7e1157424e9432639e41c459e120dc6a4c456311" address="unix:///run/containerd/s/11b1501fd1084aa3c224449330c8cd6ca4afe67761418bd33cb3b936a96b00b8" protocol=ttrpc version=3 Mar 10 02:01:48.550813 containerd[1555]: time="2026-03-10T02:01:48.549680840Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 10 02:01:48.556311 containerd[1555]: time="2026-03-10T02:01:48.555033805Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.31.4: active requests=0, bytes read=77" Mar 10 02:01:48.576551 containerd[1555]: time="2026-03-10T02:01:48.562271150Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" with image id \"sha256:f7ff80340b9b4973ceda29859065985831588b2898f2b4009f742b5789010898\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:d212af1da3dd52a633bc9e36653a7d901d95a570f8d51d1968a837dcf6879730\", size \"49971841\" in 211.61903ms" Mar 10 02:01:48.576551 containerd[1555]: time="2026-03-10T02:01:48.573243846Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" returns image reference \"sha256:f7ff80340b9b4973ceda29859065985831588b2898f2b4009f742b5789010898\"" Mar 10 02:01:48.591280 containerd[1555]: 
time="2026-03-10T02:01:48.591160482Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.31.4\"" Mar 10 02:01:48.611690 systemd[1]: Started cri-containerd-dfccd4b184f63237091f414b7e1157424e9432639e41c459e120dc6a4c456311.scope - libcontainer container dfccd4b184f63237091f414b7e1157424e9432639e41c459e120dc6a4c456311. Mar 10 02:01:48.620512 containerd[1555]: time="2026-03-10T02:01:48.620331333Z" level=info msg="CreateContainer within sandbox \"f182e649fe601d6e3129bf82fd5a5bb5988f2fead7af470ca6c4028385b98b3f\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Mar 10 02:01:48.841725 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1667241419.mount: Deactivated successfully. Mar 10 02:01:48.864061 containerd[1555]: time="2026-03-10T02:01:48.860239183Z" level=info msg="Container ee713983c8d2097952a09dbbdd746274bbbc7ade2529391108b5d0518fd7116f: CDI devices from CRI Config.CDIDevices: []" Mar 10 02:01:48.983323 containerd[1555]: time="2026-03-10T02:01:48.981871470Z" level=info msg="CreateContainer within sandbox \"f182e649fe601d6e3129bf82fd5a5bb5988f2fead7af470ca6c4028385b98b3f\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"ee713983c8d2097952a09dbbdd746274bbbc7ade2529391108b5d0518fd7116f\"" Mar 10 02:01:48.983323 containerd[1555]: time="2026-03-10T02:01:48.982741802Z" level=info msg="StartContainer for \"ee713983c8d2097952a09dbbdd746274bbbc7ade2529391108b5d0518fd7116f\"" Mar 10 02:01:49.036132 containerd[1555]: time="2026-03-10T02:01:49.030551508Z" level=info msg="connecting to shim ee713983c8d2097952a09dbbdd746274bbbc7ade2529391108b5d0518fd7116f" address="unix:///run/containerd/s/00675a39ac826eb64cc777eabd6bf10441212da9965e03f0fd93b68426e7a65b" protocol=ttrpc version=3 Mar 10 02:01:49.344257 systemd[1]: Started cri-containerd-ee713983c8d2097952a09dbbdd746274bbbc7ade2529391108b5d0518fd7116f.scope - libcontainer container ee713983c8d2097952a09dbbdd746274bbbc7ade2529391108b5d0518fd7116f. 
Mar 10 02:01:49.393218 containerd[1555]: time="2026-03-10T02:01:49.393121379Z" level=info msg="StartContainer for \"dfccd4b184f63237091f414b7e1157424e9432639e41c459e120dc6a4c456311\" returns successfully" Mar 10 02:01:49.977291 containerd[1555]: time="2026-03-10T02:01:49.977192536Z" level=info msg="StartContainer for \"ee713983c8d2097952a09dbbdd746274bbbc7ade2529391108b5d0518fd7116f\" returns successfully" Mar 10 02:01:51.055274 kubelet[2820]: I0310 02:01:51.055160 2820 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-apiserver-76dfd768bc-9l6p8" podStartSLOduration=74.388748119 podStartE2EDuration="1m36.055140144s" podCreationTimestamp="2026-03-10 02:00:15 +0000 UTC" firstStartedPulling="2026-03-10 02:01:26.916030407 +0000 UTC m=+123.354644308" lastFinishedPulling="2026-03-10 02:01:48.582422422 +0000 UTC m=+145.021036333" observedRunningTime="2026-03-10 02:01:51.054893322 +0000 UTC m=+147.493507243" watchObservedRunningTime="2026-03-10 02:01:51.055140144 +0000 UTC m=+147.493754065" Mar 10 02:01:53.012104 kubelet[2820]: I0310 02:01:53.011980 2820 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 10 02:01:55.600173 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1586041576.mount: Deactivated successfully. 
Mar 10 02:01:59.588084 containerd[1555]: time="2026-03-10T02:01:59.587931479Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 10 02:01:59.594744 containerd[1555]: time="2026-03-10T02:01:59.594070507Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.31.4: active requests=0, bytes read=55623386" Mar 10 02:01:59.597825 containerd[1555]: time="2026-03-10T02:01:59.597147497Z" level=info msg="ImageCreate event name:\"sha256:714983e5e920bbe810fab04d9f06bd16ef4e552b0d2deffd7ab2b2c4a001acbb\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 10 02:01:59.608540 containerd[1555]: time="2026-03-10T02:01:59.608312969Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:44395ca5ebfe88f21ed51acfbec5fc0f31d2762966e2007a0a2eb9b30e35fc4d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 10 02:01:59.610126 containerd[1555]: time="2026-03-10T02:01:59.609216208Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.31.4\" with image id \"sha256:714983e5e920bbe810fab04d9f06bd16ef4e552b0d2deffd7ab2b2c4a001acbb\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:44395ca5ebfe88f21ed51acfbec5fc0f31d2762966e2007a0a2eb9b30e35fc4d\", size \"55623232\" in 11.017975489s" Mar 10 02:01:59.610126 containerd[1555]: time="2026-03-10T02:01:59.609708681Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.31.4\" returns image reference \"sha256:714983e5e920bbe810fab04d9f06bd16ef4e552b0d2deffd7ab2b2c4a001acbb\"" Mar 10 02:01:59.620373 containerd[1555]: time="2026-03-10T02:01:59.619867188Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\"" Mar 10 02:01:59.646895 containerd[1555]: time="2026-03-10T02:01:59.646799052Z" level=info msg="CreateContainer within sandbox 
\"9b364df2c79b5ca5172422cece5543efd9271a21f41c029d4d6913b9db43aeb0\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Mar 10 02:01:59.690179 containerd[1555]: time="2026-03-10T02:01:59.690065210Z" level=info msg="Container bfbe70e1ecde6ea82aa778659446d1368fa047ddece2adf903b64d5c64fb3882: CDI devices from CRI Config.CDIDevices: []" Mar 10 02:01:59.713559 containerd[1555]: time="2026-03-10T02:01:59.713322851Z" level=info msg="CreateContainer within sandbox \"9b364df2c79b5ca5172422cece5543efd9271a21f41c029d4d6913b9db43aeb0\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"bfbe70e1ecde6ea82aa778659446d1368fa047ddece2adf903b64d5c64fb3882\"" Mar 10 02:01:59.717827 containerd[1555]: time="2026-03-10T02:01:59.714802848Z" level=info msg="StartContainer for \"bfbe70e1ecde6ea82aa778659446d1368fa047ddece2adf903b64d5c64fb3882\"" Mar 10 02:01:59.717827 containerd[1555]: time="2026-03-10T02:01:59.717648311Z" level=info msg="connecting to shim bfbe70e1ecde6ea82aa778659446d1368fa047ddece2adf903b64d5c64fb3882" address="unix:///run/containerd/s/96e080018239e13107ef4e7dea8f3a3683647d778c0a0e86ae0ef80446fdd7e9" protocol=ttrpc version=3 Mar 10 02:01:59.854723 systemd[1]: Started cri-containerd-bfbe70e1ecde6ea82aa778659446d1368fa047ddece2adf903b64d5c64fb3882.scope - libcontainer container bfbe70e1ecde6ea82aa778659446d1368fa047ddece2adf903b64d5c64fb3882. 
Mar 10 02:02:00.333082 containerd[1555]: time="2026-03-10T02:02:00.332943048Z" level=info msg="StartContainer for \"bfbe70e1ecde6ea82aa778659446d1368fa047ddece2adf903b64d5c64fb3882\" returns successfully" Mar 10 02:02:01.655780 kubelet[2820]: I0310 02:02:01.654802 2820 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-cccfbd5cf-kt2sm" podStartSLOduration=74.113086484 podStartE2EDuration="1m46.643766326s" podCreationTimestamp="2026-03-10 02:00:15 +0000 UTC" firstStartedPulling="2026-03-10 02:01:27.087227884 +0000 UTC m=+123.525841786" lastFinishedPulling="2026-03-10 02:01:59.617907728 +0000 UTC m=+156.056521628" observedRunningTime="2026-03-10 02:02:01.276392022 +0000 UTC m=+157.715005924" watchObservedRunningTime="2026-03-10 02:02:01.643766326 +0000 UTC m=+158.082380248" Mar 10 02:02:02.747578 containerd[1555]: time="2026-03-10T02:02:02.746259104Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 10 02:02:02.844851 containerd[1555]: time="2026-03-10T02:02:02.756795499Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4: active requests=0, bytes read=14704317" Mar 10 02:02:02.844851 containerd[1555]: time="2026-03-10T02:02:02.776708710Z" level=info msg="ImageCreate event name:\"sha256:d7aeb99114cbb6499e9048f43d3faa5f199d1a05ed44165e5974d0368ac32771\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 10 02:02:02.844851 containerd[1555]: time="2026-03-10T02:02:02.795029507Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\" with image id \"sha256:d7aeb99114cbb6499e9048f43d3faa5f199d1a05ed44165e5974d0368ac32771\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:e41c0d73bcd33ff28ae2f2983cf781a4509d212e102d53883dbbf436ab3cd97d\", size \"16260314\" 
in 3.17508673s" Mar 10 02:02:02.844851 containerd[1555]: time="2026-03-10T02:02:02.842249665Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\" returns image reference \"sha256:d7aeb99114cbb6499e9048f43d3faa5f199d1a05ed44165e5974d0368ac32771\"" Mar 10 02:02:02.844851 containerd[1555]: time="2026-03-10T02:02:02.842771788Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:e41c0d73bcd33ff28ae2f2983cf781a4509d212e102d53883dbbf436ab3cd97d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 10 02:02:02.875536 containerd[1555]: time="2026-03-10T02:02:02.874520832Z" level=info msg="CreateContainer within sandbox \"a299fae18aecbb93ffb135400069352ee2ef39ea160c1e1744cd8e7b108d97b0\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Mar 10 02:02:03.129687 containerd[1555]: time="2026-03-10T02:02:03.128897341Z" level=info msg="Container 72adf9f217ee083668fb130b9831b38a1cb80879c560836a8b6995030803539d: CDI devices from CRI Config.CDIDevices: []" Mar 10 02:02:03.278306 containerd[1555]: time="2026-03-10T02:02:03.263315194Z" level=info msg="CreateContainer within sandbox \"a299fae18aecbb93ffb135400069352ee2ef39ea160c1e1744cd8e7b108d97b0\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"72adf9f217ee083668fb130b9831b38a1cb80879c560836a8b6995030803539d\"" Mar 10 02:02:03.327532 containerd[1555]: time="2026-03-10T02:02:03.320290131Z" level=info msg="StartContainer for \"72adf9f217ee083668fb130b9831b38a1cb80879c560836a8b6995030803539d\"" Mar 10 02:02:03.329074 containerd[1555]: time="2026-03-10T02:02:03.329030144Z" level=info msg="connecting to shim 72adf9f217ee083668fb130b9831b38a1cb80879c560836a8b6995030803539d" address="unix:///run/containerd/s/11b1501fd1084aa3c224449330c8cd6ca4afe67761418bd33cb3b936a96b00b8" protocol=ttrpc version=3 Mar 10 02:02:03.517985 systemd[1]: Started 
cri-containerd-72adf9f217ee083668fb130b9831b38a1cb80879c560836a8b6995030803539d.scope - libcontainer container 72adf9f217ee083668fb130b9831b38a1cb80879c560836a8b6995030803539d. Mar 10 02:02:03.813570 containerd[1555]: time="2026-03-10T02:02:03.813021953Z" level=info msg="StartContainer for \"72adf9f217ee083668fb130b9831b38a1cb80879c560836a8b6995030803539d\" returns successfully" Mar 10 02:02:04.091472 kubelet[2820]: I0310 02:02:04.091262 2820 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Mar 10 02:02:04.097828 kubelet[2820]: I0310 02:02:04.097199 2820 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Mar 10 02:02:20.362411 systemd[1]: Started sshd@9-10.0.0.74:22-10.0.0.1:40982.service - OpenSSH per-connection server daemon (10.0.0.1:40982). Mar 10 02:02:20.687302 sshd[5682]: Accepted publickey for core from 10.0.0.1 port 40982 ssh2: RSA SHA256:d2FUFdel+KP9pqsSrlp8nTsY/4RJTtu7ZDVkbTKQqjY Mar 10 02:02:20.690034 sshd-session[5682]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 10 02:02:20.706231 systemd-logind[1537]: New session 10 of user core. Mar 10 02:02:20.713231 systemd[1]: Started session-10.scope - Session 10 of User core. Mar 10 02:02:21.655429 sshd[5694]: Connection closed by 10.0.0.1 port 40982 Mar 10 02:02:21.655704 sshd-session[5682]: pam_unix(sshd:session): session closed for user core Mar 10 02:02:21.667837 systemd[1]: sshd@9-10.0.0.74:22-10.0.0.1:40982.service: Deactivated successfully. Mar 10 02:02:21.672639 systemd[1]: session-10.scope: Deactivated successfully. Mar 10 02:02:21.676042 systemd-logind[1537]: Session 10 logged out. Waiting for processes to exit. Mar 10 02:02:21.686563 systemd-logind[1537]: Removed session 10. 
Mar 10 02:02:26.703812 systemd[1]: Started sshd@10-10.0.0.74:22-10.0.0.1:40992.service - OpenSSH per-connection server daemon (10.0.0.1:40992). Mar 10 02:02:26.859425 sshd[5716]: Accepted publickey for core from 10.0.0.1 port 40992 ssh2: RSA SHA256:d2FUFdel+KP9pqsSrlp8nTsY/4RJTtu7ZDVkbTKQqjY Mar 10 02:02:26.862548 sshd-session[5716]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 10 02:02:26.898360 systemd-logind[1537]: New session 11 of user core. Mar 10 02:02:26.920936 systemd[1]: Started session-11.scope - Session 11 of User core. Mar 10 02:02:27.210270 sshd[5722]: Connection closed by 10.0.0.1 port 40992 Mar 10 02:02:27.210993 sshd-session[5716]: pam_unix(sshd:session): session closed for user core Mar 10 02:02:27.221740 systemd[1]: sshd@10-10.0.0.74:22-10.0.0.1:40992.service: Deactivated successfully. Mar 10 02:02:27.226293 systemd[1]: session-11.scope: Deactivated successfully. Mar 10 02:02:27.234020 systemd-logind[1537]: Session 11 logged out. Waiting for processes to exit. Mar 10 02:02:27.240588 systemd-logind[1537]: Removed session 11. Mar 10 02:02:32.261536 systemd[1]: Started sshd@11-10.0.0.74:22-10.0.0.1:58824.service - OpenSSH per-connection server daemon (10.0.0.1:58824). Mar 10 02:02:32.379198 sshd[5759]: Accepted publickey for core from 10.0.0.1 port 58824 ssh2: RSA SHA256:d2FUFdel+KP9pqsSrlp8nTsY/4RJTtu7ZDVkbTKQqjY Mar 10 02:02:32.382255 sshd-session[5759]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 10 02:02:32.396849 systemd-logind[1537]: New session 12 of user core. Mar 10 02:02:32.414854 systemd[1]: Started session-12.scope - Session 12 of User core. Mar 10 02:02:32.676843 sshd[5762]: Connection closed by 10.0.0.1 port 58824 Mar 10 02:02:32.679243 sshd-session[5759]: pam_unix(sshd:session): session closed for user core Mar 10 02:02:32.700117 systemd[1]: sshd@11-10.0.0.74:22-10.0.0.1:58824.service: Deactivated successfully. 
Mar 10 02:02:32.708139 systemd[1]: session-12.scope: Deactivated successfully. Mar 10 02:02:32.714020 systemd-logind[1537]: Session 12 logged out. Waiting for processes to exit. Mar 10 02:02:32.720304 systemd-logind[1537]: Removed session 12. Mar 10 02:02:37.701925 systemd[1]: Started sshd@12-10.0.0.74:22-10.0.0.1:58838.service - OpenSSH per-connection server daemon (10.0.0.1:58838). Mar 10 02:02:37.800289 sshd[5780]: Accepted publickey for core from 10.0.0.1 port 58838 ssh2: RSA SHA256:d2FUFdel+KP9pqsSrlp8nTsY/4RJTtu7ZDVkbTKQqjY Mar 10 02:02:37.804091 sshd-session[5780]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 10 02:02:37.824829 systemd-logind[1537]: New session 13 of user core. Mar 10 02:02:37.834791 systemd[1]: Started session-13.scope - Session 13 of User core. Mar 10 02:02:38.191026 sshd[5783]: Connection closed by 10.0.0.1 port 58838 Mar 10 02:02:38.194748 sshd-session[5780]: pam_unix(sshd:session): session closed for user core Mar 10 02:02:38.202559 systemd[1]: sshd@12-10.0.0.74:22-10.0.0.1:58838.service: Deactivated successfully. Mar 10 02:02:38.207518 systemd[1]: session-13.scope: Deactivated successfully. Mar 10 02:02:38.216303 systemd-logind[1537]: Session 13 logged out. Waiting for processes to exit. Mar 10 02:02:38.222031 systemd-logind[1537]: Removed session 13. Mar 10 02:02:43.232235 systemd[1]: Started sshd@13-10.0.0.74:22-10.0.0.1:54650.service - OpenSSH per-connection server daemon (10.0.0.1:54650). Mar 10 02:02:43.370556 sshd[5833]: Accepted publickey for core from 10.0.0.1 port 54650 ssh2: RSA SHA256:d2FUFdel+KP9pqsSrlp8nTsY/4RJTtu7ZDVkbTKQqjY Mar 10 02:02:43.373239 sshd-session[5833]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 10 02:02:43.398000 systemd-logind[1537]: New session 14 of user core. Mar 10 02:02:43.406049 systemd[1]: Started session-14.scope - Session 14 of User core. 
Mar 10 02:02:43.695368 sshd[5836]: Connection closed by 10.0.0.1 port 54650 Mar 10 02:02:43.696031 sshd-session[5833]: pam_unix(sshd:session): session closed for user core Mar 10 02:02:43.716116 systemd[1]: sshd@13-10.0.0.74:22-10.0.0.1:54650.service: Deactivated successfully. Mar 10 02:02:43.723406 systemd[1]: session-14.scope: Deactivated successfully. Mar 10 02:02:43.730293 systemd-logind[1537]: Session 14 logged out. Waiting for processes to exit. Mar 10 02:02:43.744093 systemd-logind[1537]: Removed session 14. Mar 10 02:02:48.747264 systemd[1]: Started sshd@14-10.0.0.74:22-10.0.0.1:54658.service - OpenSSH per-connection server daemon (10.0.0.1:54658). Mar 10 02:02:48.862951 sshd[5870]: Accepted publickey for core from 10.0.0.1 port 54658 ssh2: RSA SHA256:d2FUFdel+KP9pqsSrlp8nTsY/4RJTtu7ZDVkbTKQqjY Mar 10 02:02:48.876788 sshd-session[5870]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 10 02:02:48.897979 systemd-logind[1537]: New session 15 of user core. Mar 10 02:02:48.909050 systemd[1]: Started session-15.scope - Session 15 of User core. Mar 10 02:02:49.123066 sshd[5893]: Connection closed by 10.0.0.1 port 54658 Mar 10 02:02:49.123430 sshd-session[5870]: pam_unix(sshd:session): session closed for user core Mar 10 02:02:49.130743 systemd[1]: sshd@14-10.0.0.74:22-10.0.0.1:54658.service: Deactivated successfully. Mar 10 02:02:49.134602 systemd[1]: session-15.scope: Deactivated successfully. Mar 10 02:02:49.137371 systemd-logind[1537]: Session 15 logged out. Waiting for processes to exit. Mar 10 02:02:49.140963 systemd-logind[1537]: Removed session 15. 
Mar 10 02:02:52.395241 update_engine[1544]: I20260310 02:02:52.394923 1544 prefs.cc:52] certificate-report-to-send-update not present in /var/lib/update_engine/prefs Mar 10 02:02:52.395241 update_engine[1544]: I20260310 02:02:52.395176 1544 prefs.cc:52] certificate-report-to-send-download not present in /var/lib/update_engine/prefs Mar 10 02:02:52.398860 update_engine[1544]: I20260310 02:02:52.398154 1544 prefs.cc:52] aleph-version not present in /var/lib/update_engine/prefs Mar 10 02:02:52.400162 update_engine[1544]: I20260310 02:02:52.400066 1544 omaha_request_params.cc:62] Current group set to stable Mar 10 02:02:52.400393 update_engine[1544]: I20260310 02:02:52.400308 1544 update_attempter.cc:499] Already updated boot flags. Skipping. Mar 10 02:02:52.400393 update_engine[1544]: I20260310 02:02:52.400355 1544 update_attempter.cc:643] Scheduling an action processor start. Mar 10 02:02:52.400393 update_engine[1544]: I20260310 02:02:52.400384 1544 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction Mar 10 02:02:52.400716 update_engine[1544]: I20260310 02:02:52.400523 1544 prefs.cc:52] previous-version not present in /var/lib/update_engine/prefs Mar 10 02:02:52.402734 update_engine[1544]: I20260310 02:02:52.401650 1544 omaha_request_action.cc:271] Posting an Omaha request to disabled Mar 10 02:02:52.402734 update_engine[1544]: I20260310 02:02:52.401697 1544 omaha_request_action.cc:272] Request: Mar 10 02:02:52.402734 update_engine[1544]: Mar 10 02:02:52.402734 update_engine[1544]: Mar 10 02:02:52.402734 update_engine[1544]: Mar 10 02:02:52.402734 update_engine[1544]: Mar 10 02:02:52.402734 update_engine[1544]: Mar 10 02:02:52.402734 update_engine[1544]: Mar 10 02:02:52.402734 update_engine[1544]: Mar 10 02:02:52.402734 update_engine[1544]: Mar 10 02:02:52.402734 update_engine[1544]: I20260310 02:02:52.401711 1544 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Mar 10 02:02:52.431658 update_engine[1544]: I20260310 02:02:52.431041 1544 
libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Mar 10 02:02:52.431801 locksmithd[1585]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_CHECKING_FOR_UPDATE" NewVersion=0.0.0 NewSize=0 Mar 10 02:02:52.433032 update_engine[1544]: I20260310 02:02:52.432806 1544 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Mar 10 02:02:52.453701 update_engine[1544]: E20260310 02:02:52.453433 1544 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Mar 10 02:02:52.453993 update_engine[1544]: I20260310 02:02:52.453730 1544 libcurl_http_fetcher.cc:283] No HTTP response, retry 1 Mar 10 02:02:54.153201 systemd[1]: Started sshd@15-10.0.0.74:22-10.0.0.1:58134.service - OpenSSH per-connection server daemon (10.0.0.1:58134). Mar 10 02:02:54.287993 sshd[5933]: Accepted publickey for core from 10.0.0.1 port 58134 ssh2: RSA SHA256:d2FUFdel+KP9pqsSrlp8nTsY/4RJTtu7ZDVkbTKQqjY Mar 10 02:02:54.291429 sshd-session[5933]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 10 02:02:54.304625 systemd-logind[1537]: New session 16 of user core. Mar 10 02:02:54.314013 systemd[1]: Started session-16.scope - Session 16 of User core. Mar 10 02:02:54.537577 sshd[5936]: Connection closed by 10.0.0.1 port 58134 Mar 10 02:02:54.539434 sshd-session[5933]: pam_unix(sshd:session): session closed for user core Mar 10 02:02:54.548435 systemd[1]: sshd@15-10.0.0.74:22-10.0.0.1:58134.service: Deactivated successfully. Mar 10 02:02:54.551588 systemd[1]: session-16.scope: Deactivated successfully. Mar 10 02:02:54.554316 systemd-logind[1537]: Session 16 logged out. Waiting for processes to exit. Mar 10 02:02:54.556900 systemd-logind[1537]: Removed session 16. Mar 10 02:02:59.573665 systemd[1]: Started sshd@16-10.0.0.74:22-10.0.0.1:58148.service - OpenSSH per-connection server daemon (10.0.0.1:58148). 
Mar 10 02:02:59.767941 sshd[5951]: Accepted publickey for core from 10.0.0.1 port 58148 ssh2: RSA SHA256:d2FUFdel+KP9pqsSrlp8nTsY/4RJTtu7ZDVkbTKQqjY Mar 10 02:02:59.767307 sshd-session[5951]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 10 02:02:59.784868 systemd-logind[1537]: New session 17 of user core. Mar 10 02:02:59.799885 systemd[1]: Started session-17.scope - Session 17 of User core. Mar 10 02:03:00.030334 sshd[5954]: Connection closed by 10.0.0.1 port 58148 Mar 10 02:03:00.030746 sshd-session[5951]: pam_unix(sshd:session): session closed for user core Mar 10 02:03:00.037033 systemd[1]: sshd@16-10.0.0.74:22-10.0.0.1:58148.service: Deactivated successfully. Mar 10 02:03:00.041041 systemd[1]: session-17.scope: Deactivated successfully. Mar 10 02:03:00.043698 systemd-logind[1537]: Session 17 logged out. Waiting for processes to exit. Mar 10 02:03:00.047098 systemd-logind[1537]: Removed session 17. Mar 10 02:03:02.328858 update_engine[1544]: I20260310 02:03:02.298372 1544 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Mar 10 02:03:02.360202 update_engine[1544]: I20260310 02:03:02.350414 1544 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Mar 10 02:03:02.365923 update_engine[1544]: I20260310 02:03:02.365860 1544 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Mar 10 02:03:02.385174 update_engine[1544]: E20260310 02:03:02.385005 1544 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Mar 10 02:03:02.385174 update_engine[1544]: I20260310 02:03:02.385126 1544 libcurl_http_fetcher.cc:283] No HTTP response, retry 2 Mar 10 02:03:05.109957 systemd[1]: Started sshd@17-10.0.0.74:22-10.0.0.1:33608.service - OpenSSH per-connection server daemon (10.0.0.1:33608). 
Mar 10 02:03:05.435137 sshd[5996]: Accepted publickey for core from 10.0.0.1 port 33608 ssh2: RSA SHA256:d2FUFdel+KP9pqsSrlp8nTsY/4RJTtu7ZDVkbTKQqjY Mar 10 02:03:05.437190 sshd-session[5996]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 10 02:03:05.456282 systemd-logind[1537]: New session 18 of user core. Mar 10 02:03:05.463234 systemd[1]: Started session-18.scope - Session 18 of User core. Mar 10 02:03:05.888223 sshd[5999]: Connection closed by 10.0.0.1 port 33608 Mar 10 02:03:05.889773 sshd-session[5996]: pam_unix(sshd:session): session closed for user core Mar 10 02:03:05.915966 systemd[1]: sshd@17-10.0.0.74:22-10.0.0.1:33608.service: Deactivated successfully. Mar 10 02:03:05.916834 systemd-logind[1537]: Session 18 logged out. Waiting for processes to exit. Mar 10 02:03:05.925075 systemd[1]: session-18.scope: Deactivated successfully. Mar 10 02:03:05.929774 systemd-logind[1537]: Removed session 18. Mar 10 02:03:10.938758 systemd[1]: Started sshd@18-10.0.0.74:22-10.0.0.1:51998.service - OpenSSH per-connection server daemon (10.0.0.1:51998). Mar 10 02:03:11.271882 sshd[6089]: Accepted publickey for core from 10.0.0.1 port 51998 ssh2: RSA SHA256:d2FUFdel+KP9pqsSrlp8nTsY/4RJTtu7ZDVkbTKQqjY Mar 10 02:03:11.292622 sshd-session[6089]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 10 02:03:11.308668 systemd-logind[1537]: New session 19 of user core. Mar 10 02:03:11.319187 systemd[1]: Started session-19.scope - Session 19 of User core. Mar 10 02:03:11.854211 sshd[6092]: Connection closed by 10.0.0.1 port 51998 Mar 10 02:03:11.865378 sshd-session[6089]: pam_unix(sshd:session): session closed for user core Mar 10 02:03:11.913221 systemd[1]: sshd@18-10.0.0.74:22-10.0.0.1:51998.service: Deactivated successfully. Mar 10 02:03:11.920172 systemd[1]: session-19.scope: Deactivated successfully. Mar 10 02:03:11.923858 systemd-logind[1537]: Session 19 logged out. Waiting for processes to exit. 
Mar 10 02:03:11.929214 systemd-logind[1537]: Removed session 19. Mar 10 02:03:12.277401 update_engine[1544]: I20260310 02:03:12.277168 1544 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Mar 10 02:03:12.300756 update_engine[1544]: I20260310 02:03:12.277829 1544 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Mar 10 02:03:12.300756 update_engine[1544]: I20260310 02:03:12.278786 1544 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Mar 10 02:03:12.308768 update_engine[1544]: E20260310 02:03:12.308290 1544 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Mar 10 02:03:12.308768 update_engine[1544]: I20260310 02:03:12.308441 1544 libcurl_http_fetcher.cc:283] No HTTP response, retry 3 Mar 10 02:03:17.096025 systemd[1]: Started sshd@19-10.0.0.74:22-10.0.0.1:52014.service - OpenSSH per-connection server daemon (10.0.0.1:52014). Mar 10 02:03:17.423557 sshd[6107]: Accepted publickey for core from 10.0.0.1 port 52014 ssh2: RSA SHA256:d2FUFdel+KP9pqsSrlp8nTsY/4RJTtu7ZDVkbTKQqjY Mar 10 02:03:17.435740 sshd-session[6107]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 10 02:03:17.501693 systemd-logind[1537]: New session 20 of user core. Mar 10 02:03:17.510050 systemd[1]: Started session-20.scope - Session 20 of User core. Mar 10 02:03:18.133272 sshd[6110]: Connection closed by 10.0.0.1 port 52014 Mar 10 02:03:18.130590 sshd-session[6107]: pam_unix(sshd:session): session closed for user core Mar 10 02:03:18.138301 systemd[1]: sshd@19-10.0.0.74:22-10.0.0.1:52014.service: Deactivated successfully. Mar 10 02:03:18.150648 systemd[1]: session-20.scope: Deactivated successfully. Mar 10 02:03:18.161250 systemd-logind[1537]: Session 20 logged out. Waiting for processes to exit. Mar 10 02:03:18.167347 systemd-logind[1537]: Removed session 20. 
Mar 10 02:03:22.285636 update_engine[1544]: I20260310 02:03:22.285528 1544 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Mar 10 02:03:22.286014 update_engine[1544]: I20260310 02:03:22.285652 1544 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Mar 10 02:03:22.286358 update_engine[1544]: I20260310 02:03:22.286217 1544 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Mar 10 02:03:22.304803 update_engine[1544]: E20260310 02:03:22.304685 1544 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Mar 10 02:03:22.304961 update_engine[1544]: I20260310 02:03:22.304872 1544 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded Mar 10 02:03:22.304961 update_engine[1544]: I20260310 02:03:22.304898 1544 omaha_request_action.cc:617] Omaha request response: Mar 10 02:03:22.306101 update_engine[1544]: E20260310 02:03:22.305032 1544 omaha_request_action.cc:636] Omaha request network transfer failed. Mar 10 02:03:22.310126 update_engine[1544]: I20260310 02:03:22.309841 1544 action_processor.cc:68] ActionProcessor::ActionComplete: OmahaRequestAction action failed. Aborting processing. Mar 10 02:03:22.310126 update_engine[1544]: I20260310 02:03:22.309922 1544 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Mar 10 02:03:22.310126 update_engine[1544]: I20260310 02:03:22.309938 1544 update_attempter.cc:306] Processing Done. Mar 10 02:03:22.313397 update_engine[1544]: E20260310 02:03:22.312731 1544 update_attempter.cc:619] Update failed. 
Mar 10 02:03:22.313397 update_engine[1544]: I20260310 02:03:22.312759 1544 utils.cc:600] Converting error code 2000 to kActionCodeOmahaErrorInHTTPResponse Mar 10 02:03:22.313397 update_engine[1544]: I20260310 02:03:22.312767 1544 payload_state.cc:97] Updating payload state for error code: 37 (kActionCodeOmahaErrorInHTTPResponse) Mar 10 02:03:22.313397 update_engine[1544]: I20260310 02:03:22.312774 1544 payload_state.cc:103] Ignoring failures until we get a valid Omaha response. Mar 10 02:03:22.313397 update_engine[1544]: I20260310 02:03:22.312867 1544 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction Mar 10 02:03:22.313397 update_engine[1544]: I20260310 02:03:22.312894 1544 omaha_request_action.cc:271] Posting an Omaha request to disabled Mar 10 02:03:22.313397 update_engine[1544]: I20260310 02:03:22.312901 1544 omaha_request_action.cc:272] Request: Mar 10 02:03:22.313397 update_engine[1544]: Mar 10 02:03:22.313397 update_engine[1544]: Mar 10 02:03:22.313397 update_engine[1544]: Mar 10 02:03:22.313397 update_engine[1544]: Mar 10 02:03:22.313397 update_engine[1544]: Mar 10 02:03:22.313397 update_engine[1544]: Mar 10 02:03:22.313397 update_engine[1544]: I20260310 02:03:22.312908 1544 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Mar 10 02:03:22.313397 update_engine[1544]: I20260310 02:03:22.312937 1544 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Mar 10 02:03:22.313397 update_engine[1544]: I20260310 02:03:22.313359 1544 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
Mar 10 02:03:22.313961 locksmithd[1585]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_REPORTING_ERROR_EVENT" NewVersion=0.0.0 NewSize=0 Mar 10 02:03:22.334908 update_engine[1544]: E20260310 02:03:22.334765 1544 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Mar 10 02:03:22.334908 update_engine[1544]: I20260310 02:03:22.334887 1544 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded Mar 10 02:03:22.334908 update_engine[1544]: I20260310 02:03:22.334904 1544 omaha_request_action.cc:617] Omaha request response: Mar 10 02:03:22.334908 update_engine[1544]: I20260310 02:03:22.334918 1544 action_processor.cc:65] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Mar 10 02:03:22.334908 update_engine[1544]: I20260310 02:03:22.334928 1544 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Mar 10 02:03:22.335270 update_engine[1544]: I20260310 02:03:22.334937 1544 update_attempter.cc:306] Processing Done. Mar 10 02:03:22.335270 update_engine[1544]: I20260310 02:03:22.334949 1544 update_attempter.cc:310] Error event sent. Mar 10 02:03:22.335270 update_engine[1544]: I20260310 02:03:22.334961 1544 update_check_scheduler.cc:74] Next update check in 46m57s Mar 10 02:03:22.340348 locksmithd[1585]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_IDLE" NewVersion=0.0.0 NewSize=0 Mar 10 02:03:23.166417 systemd[1]: Started sshd@20-10.0.0.74:22-10.0.0.1:58040.service - OpenSSH per-connection server daemon (10.0.0.1:58040). Mar 10 02:03:23.469720 sshd[6149]: Accepted publickey for core from 10.0.0.1 port 58040 ssh2: RSA SHA256:d2FUFdel+KP9pqsSrlp8nTsY/4RJTtu7ZDVkbTKQqjY Mar 10 02:03:23.467959 sshd-session[6149]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 10 02:03:23.508275 systemd-logind[1537]: New session 21 of user core. 
Mar 10 02:03:23.526175 systemd[1]: Started session-21.scope - Session 21 of User core. Mar 10 02:03:23.885424 sshd[6152]: Connection closed by 10.0.0.1 port 58040 Mar 10 02:03:23.886988 sshd-session[6149]: pam_unix(sshd:session): session closed for user core Mar 10 02:03:23.902010 systemd[1]: sshd@20-10.0.0.74:22-10.0.0.1:58040.service: Deactivated successfully. Mar 10 02:03:23.907122 systemd[1]: session-21.scope: Deactivated successfully. Mar 10 02:03:23.910889 systemd-logind[1537]: Session 21 logged out. Waiting for processes to exit. Mar 10 02:03:23.912951 systemd-logind[1537]: Removed session 21. Mar 10 02:03:28.936092 systemd[1]: Started sshd@21-10.0.0.74:22-10.0.0.1:58048.service - OpenSSH per-connection server daemon (10.0.0.1:58048). Mar 10 02:03:29.061559 sshd[6169]: Accepted publickey for core from 10.0.0.1 port 58048 ssh2: RSA SHA256:d2FUFdel+KP9pqsSrlp8nTsY/4RJTtu7ZDVkbTKQqjY Mar 10 02:03:29.066388 sshd-session[6169]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 10 02:03:29.087851 systemd-logind[1537]: New session 22 of user core. Mar 10 02:03:29.098742 systemd[1]: Started session-22.scope - Session 22 of User core. Mar 10 02:03:29.490814 sshd[6172]: Connection closed by 10.0.0.1 port 58048 Mar 10 02:03:29.489253 sshd-session[6169]: pam_unix(sshd:session): session closed for user core Mar 10 02:03:29.523804 systemd[1]: sshd@21-10.0.0.74:22-10.0.0.1:58048.service: Deactivated successfully. Mar 10 02:03:29.527826 systemd[1]: session-22.scope: Deactivated successfully. Mar 10 02:03:29.530220 systemd-logind[1537]: Session 22 logged out. Waiting for processes to exit. Mar 10 02:03:29.532399 systemd-logind[1537]: Removed session 22. Mar 10 02:03:34.519816 systemd[1]: Started sshd@22-10.0.0.74:22-10.0.0.1:58318.service - OpenSSH per-connection server daemon (10.0.0.1:58318). 
Mar 10 02:03:34.758354 sshd[6215]: Accepted publickey for core from 10.0.0.1 port 58318 ssh2: RSA SHA256:d2FUFdel+KP9pqsSrlp8nTsY/4RJTtu7ZDVkbTKQqjY Mar 10 02:03:34.759345 sshd-session[6215]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 10 02:03:34.787847 systemd-logind[1537]: New session 23 of user core. Mar 10 02:03:34.803178 systemd[1]: Started session-23.scope - Session 23 of User core. Mar 10 02:03:35.088136 sshd[6218]: Connection closed by 10.0.0.1 port 58318 Mar 10 02:03:35.089103 sshd-session[6215]: pam_unix(sshd:session): session closed for user core Mar 10 02:03:35.097918 systemd[1]: sshd@22-10.0.0.74:22-10.0.0.1:58318.service: Deactivated successfully. Mar 10 02:03:35.103073 systemd[1]: session-23.scope: Deactivated successfully. Mar 10 02:03:35.108391 systemd-logind[1537]: Session 23 logged out. Waiting for processes to exit. Mar 10 02:03:35.113107 systemd-logind[1537]: Removed session 23. Mar 10 02:03:40.132041 systemd[1]: Started sshd@23-10.0.0.74:22-10.0.0.1:37354.service - OpenSSH per-connection server daemon (10.0.0.1:37354). Mar 10 02:03:40.306729 sshd[6232]: Accepted publickey for core from 10.0.0.1 port 37354 ssh2: RSA SHA256:d2FUFdel+KP9pqsSrlp8nTsY/4RJTtu7ZDVkbTKQqjY Mar 10 02:03:40.311358 sshd-session[6232]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 10 02:03:40.333953 systemd-logind[1537]: New session 24 of user core. Mar 10 02:03:40.344719 systemd[1]: Started session-24.scope - Session 24 of User core. Mar 10 02:03:40.947828 sshd[6236]: Connection closed by 10.0.0.1 port 37354 Mar 10 02:03:40.949329 sshd-session[6232]: pam_unix(sshd:session): session closed for user core Mar 10 02:03:40.964709 systemd[1]: Started sshd@24-10.0.0.74:22-10.0.0.1:37358.service - OpenSSH per-connection server daemon (10.0.0.1:37358). Mar 10 02:03:41.003010 systemd[1]: sshd@23-10.0.0.74:22-10.0.0.1:37354.service: Deactivated successfully. 
Mar 10 02:03:41.008519 systemd[1]: session-24.scope: Deactivated successfully. Mar 10 02:03:41.018954 systemd-logind[1537]: Session 24 logged out. Waiting for processes to exit. Mar 10 02:03:41.031812 systemd-logind[1537]: Removed session 24. Mar 10 02:03:41.237725 sshd[6280]: Accepted publickey for core from 10.0.0.1 port 37358 ssh2: RSA SHA256:d2FUFdel+KP9pqsSrlp8nTsY/4RJTtu7ZDVkbTKQqjY Mar 10 02:03:41.244753 sshd-session[6280]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 10 02:03:41.258166 systemd-logind[1537]: New session 25 of user core. Mar 10 02:03:41.278074 systemd[1]: Started session-25.scope - Session 25 of User core. Mar 10 02:03:41.741954 sshd[6287]: Connection closed by 10.0.0.1 port 37358 Mar 10 02:03:41.746766 sshd-session[6280]: pam_unix(sshd:session): session closed for user core Mar 10 02:03:41.762141 systemd[1]: Started sshd@25-10.0.0.74:22-10.0.0.1:37370.service - OpenSSH per-connection server daemon (10.0.0.1:37370). Mar 10 02:03:41.763004 systemd[1]: sshd@24-10.0.0.74:22-10.0.0.1:37358.service: Deactivated successfully. Mar 10 02:03:41.776644 systemd[1]: session-25.scope: Deactivated successfully. Mar 10 02:03:41.778105 systemd-logind[1537]: Session 25 logged out. Waiting for processes to exit. Mar 10 02:03:41.789977 systemd-logind[1537]: Removed session 25. Mar 10 02:03:41.925393 sshd[6304]: Accepted publickey for core from 10.0.0.1 port 37370 ssh2: RSA SHA256:d2FUFdel+KP9pqsSrlp8nTsY/4RJTtu7ZDVkbTKQqjY Mar 10 02:03:41.932991 sshd-session[6304]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 10 02:03:41.954912 systemd-logind[1537]: New session 26 of user core. Mar 10 02:03:41.972824 systemd[1]: Started session-26.scope - Session 26 of User core. 
Mar 10 02:03:42.185151 sshd[6311]: Connection closed by 10.0.0.1 port 37370 Mar 10 02:03:42.186121 sshd-session[6304]: pam_unix(sshd:session): session closed for user core Mar 10 02:03:42.196581 systemd[1]: sshd@25-10.0.0.74:22-10.0.0.1:37370.service: Deactivated successfully. Mar 10 02:03:42.200121 systemd[1]: session-26.scope: Deactivated successfully. Mar 10 02:03:42.203331 systemd-logind[1537]: Session 26 logged out. Waiting for processes to exit. Mar 10 02:03:42.207575 systemd-logind[1537]: Removed session 26. Mar 10 02:03:47.245196 systemd[1]: Started sshd@26-10.0.0.74:22-10.0.0.1:37372.service - OpenSSH per-connection server daemon (10.0.0.1:37372). Mar 10 02:03:47.376182 sshd[6325]: Accepted publickey for core from 10.0.0.1 port 37372 ssh2: RSA SHA256:d2FUFdel+KP9pqsSrlp8nTsY/4RJTtu7ZDVkbTKQqjY Mar 10 02:03:47.386882 sshd-session[6325]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 10 02:03:47.445620 systemd-logind[1537]: New session 27 of user core. Mar 10 02:03:47.464686 systemd[1]: Started session-27.scope - Session 27 of User core. Mar 10 02:03:47.796335 sshd[6328]: Connection closed by 10.0.0.1 port 37372 Mar 10 02:03:47.797004 sshd-session[6325]: pam_unix(sshd:session): session closed for user core Mar 10 02:03:47.809363 systemd[1]: sshd@26-10.0.0.74:22-10.0.0.1:37372.service: Deactivated successfully. Mar 10 02:03:47.814802 systemd[1]: session-27.scope: Deactivated successfully. Mar 10 02:03:47.820866 systemd-logind[1537]: Session 27 logged out. Waiting for processes to exit. Mar 10 02:03:47.824241 systemd-logind[1537]: Removed session 27. Mar 10 02:03:52.824408 systemd[1]: Started sshd@27-10.0.0.74:22-10.0.0.1:40250.service - OpenSSH per-connection server daemon (10.0.0.1:40250). 
Mar 10 02:03:52.901218 sshd[6366]: Accepted publickey for core from 10.0.0.1 port 40250 ssh2: RSA SHA256:d2FUFdel+KP9pqsSrlp8nTsY/4RJTtu7ZDVkbTKQqjY Mar 10 02:03:52.907346 sshd-session[6366]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 10 02:03:52.920538 systemd-logind[1537]: New session 28 of user core. Mar 10 02:03:52.926756 systemd[1]: Started session-28.scope - Session 28 of User core. Mar 10 02:03:53.088508 sshd[6369]: Connection closed by 10.0.0.1 port 40250 Mar 10 02:03:53.087986 sshd-session[6366]: pam_unix(sshd:session): session closed for user core Mar 10 02:03:53.101158 systemd[1]: sshd@27-10.0.0.74:22-10.0.0.1:40250.service: Deactivated successfully. Mar 10 02:03:53.106647 systemd[1]: session-28.scope: Deactivated successfully. Mar 10 02:03:53.110358 systemd-logind[1537]: Session 28 logged out. Waiting for processes to exit. Mar 10 02:03:53.113207 systemd-logind[1537]: Removed session 28. Mar 10 02:03:55.329434 containerd[1555]: time="2026-03-10T02:03:55.315407028Z" level=warning msg="container event discarded" container=564936535e353ff643864d3414f9936851d3088f2b11c01dac01bb7e838e4525 type=CONTAINER_CREATED_EVENT Mar 10 02:03:55.366239 containerd[1555]: time="2026-03-10T02:03:55.366162880Z" level=warning msg="container event discarded" container=564936535e353ff643864d3414f9936851d3088f2b11c01dac01bb7e838e4525 type=CONTAINER_STARTED_EVENT Mar 10 02:03:55.395569 containerd[1555]: time="2026-03-10T02:03:55.395412247Z" level=warning msg="container event discarded" container=cf7f0c0955503763fe527a42a9d3b435c706951bfe0958401972d71ab517db59 type=CONTAINER_CREATED_EVENT Mar 10 02:03:55.395569 containerd[1555]: time="2026-03-10T02:03:55.395530252Z" level=warning msg="container event discarded" container=cf7f0c0955503763fe527a42a9d3b435c706951bfe0958401972d71ab517db59 type=CONTAINER_STARTED_EVENT Mar 10 02:03:55.413971 containerd[1555]: time="2026-03-10T02:03:55.413880193Z" level=warning msg="container event discarded" 
container=6265293fefd8ec4ede52e937f569c2634da75b6da0e5fe4d79c006d2f1de913b type=CONTAINER_CREATED_EVENT Mar 10 02:03:55.413971 containerd[1555]: time="2026-03-10T02:03:55.413938163Z" level=warning msg="container event discarded" container=6265293fefd8ec4ede52e937f569c2634da75b6da0e5fe4d79c006d2f1de913b type=CONTAINER_STARTED_EVENT Mar 10 02:03:55.566849 containerd[1555]: time="2026-03-10T02:03:55.566760210Z" level=warning msg="container event discarded" container=2a1199f80d2215f98eef4b67fa3141b339186fb05a99f71c090340200953b2b7 type=CONTAINER_CREATED_EVENT Mar 10 02:03:55.612549 containerd[1555]: time="2026-03-10T02:03:55.612252755Z" level=warning msg="container event discarded" container=82a679beba372a44d5940cce62414106e12d476304ab09f74dfd29de7c7d608d type=CONTAINER_CREATED_EVENT Mar 10 02:03:55.612549 containerd[1555]: time="2026-03-10T02:03:55.612315284Z" level=warning msg="container event discarded" container=d8677cf96fc993f059a6de7516d0bb68d2573036f338097d5e50f2ffb88bee1d type=CONTAINER_CREATED_EVENT Mar 10 02:03:55.902889 containerd[1555]: time="2026-03-10T02:03:55.902439778Z" level=warning msg="container event discarded" container=2a1199f80d2215f98eef4b67fa3141b339186fb05a99f71c090340200953b2b7 type=CONTAINER_STARTED_EVENT Mar 10 02:03:55.929499 containerd[1555]: time="2026-03-10T02:03:55.928436493Z" level=warning msg="container event discarded" container=d8677cf96fc993f059a6de7516d0bb68d2573036f338097d5e50f2ffb88bee1d type=CONTAINER_STARTED_EVENT Mar 10 02:03:55.974037 containerd[1555]: time="2026-03-10T02:03:55.973941516Z" level=warning msg="container event discarded" container=82a679beba372a44d5940cce62414106e12d476304ab09f74dfd29de7c7d608d type=CONTAINER_STARTED_EVENT Mar 10 02:03:58.111219 systemd[1]: Started sshd@28-10.0.0.74:22-10.0.0.1:40252.service - OpenSSH per-connection server daemon (10.0.0.1:40252). 
Mar 10 02:03:58.218662 sshd[6383]: Accepted publickey for core from 10.0.0.1 port 40252 ssh2: RSA SHA256:d2FUFdel+KP9pqsSrlp8nTsY/4RJTtu7ZDVkbTKQqjY Mar 10 02:03:58.221246 sshd-session[6383]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 10 02:03:58.236710 systemd-logind[1537]: New session 29 of user core. Mar 10 02:03:58.260595 systemd[1]: Started session-29.scope - Session 29 of User core. Mar 10 02:03:58.638683 sshd[6386]: Connection closed by 10.0.0.1 port 40252 Mar 10 02:03:58.639996 sshd-session[6383]: pam_unix(sshd:session): session closed for user core Mar 10 02:03:58.651073 systemd[1]: sshd@28-10.0.0.74:22-10.0.0.1:40252.service: Deactivated successfully. Mar 10 02:03:58.658337 systemd[1]: session-29.scope: Deactivated successfully. Mar 10 02:03:58.663546 systemd-logind[1537]: Session 29 logged out. Waiting for processes to exit. Mar 10 02:03:58.667590 systemd-logind[1537]: Removed session 29. Mar 10 02:04:03.687865 systemd[1]: Started sshd@29-10.0.0.74:22-10.0.0.1:60286.service - OpenSSH per-connection server daemon (10.0.0.1:60286). Mar 10 02:04:03.895976 sshd[6436]: Accepted publickey for core from 10.0.0.1 port 60286 ssh2: RSA SHA256:d2FUFdel+KP9pqsSrlp8nTsY/4RJTtu7ZDVkbTKQqjY Mar 10 02:04:03.903441 sshd-session[6436]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 10 02:04:03.933530 systemd-logind[1537]: New session 30 of user core. Mar 10 02:04:03.948986 systemd[1]: Started session-30.scope - Session 30 of User core. Mar 10 02:04:04.356040 sshd[6439]: Connection closed by 10.0.0.1 port 60286 Mar 10 02:04:04.360180 sshd-session[6436]: pam_unix(sshd:session): session closed for user core Mar 10 02:04:04.376545 systemd[1]: sshd@29-10.0.0.74:22-10.0.0.1:60286.service: Deactivated successfully. Mar 10 02:04:04.388861 systemd[1]: session-30.scope: Deactivated successfully. Mar 10 02:04:04.406827 systemd-logind[1537]: Session 30 logged out. Waiting for processes to exit. 
Mar 10 02:04:04.424006 systemd[1]: Started sshd@30-10.0.0.74:22-10.0.0.1:60302.service - OpenSSH per-connection server daemon (10.0.0.1:60302). Mar 10 02:04:04.432403 systemd-logind[1537]: Removed session 30. Mar 10 02:04:04.629884 sshd[6453]: Accepted publickey for core from 10.0.0.1 port 60302 ssh2: RSA SHA256:d2FUFdel+KP9pqsSrlp8nTsY/4RJTtu7ZDVkbTKQqjY Mar 10 02:04:04.646224 sshd-session[6453]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 10 02:04:04.711116 systemd-logind[1537]: New session 31 of user core. Mar 10 02:04:04.739416 systemd[1]: Started session-31.scope - Session 31 of User core. Mar 10 02:04:05.894673 sshd[6456]: Connection closed by 10.0.0.1 port 60302 Mar 10 02:04:05.897732 sshd-session[6453]: pam_unix(sshd:session): session closed for user core Mar 10 02:04:05.934314 systemd[1]: sshd@30-10.0.0.74:22-10.0.0.1:60302.service: Deactivated successfully. Mar 10 02:04:05.939945 systemd[1]: session-31.scope: Deactivated successfully. Mar 10 02:04:05.943362 systemd-logind[1537]: Session 31 logged out. Waiting for processes to exit. Mar 10 02:04:05.960552 systemd[1]: Started sshd@31-10.0.0.74:22-10.0.0.1:60308.service - OpenSSH per-connection server daemon (10.0.0.1:60308). Mar 10 02:04:05.963981 systemd-logind[1537]: Removed session 31. Mar 10 02:04:06.266157 sshd[6467]: Accepted publickey for core from 10.0.0.1 port 60308 ssh2: RSA SHA256:d2FUFdel+KP9pqsSrlp8nTsY/4RJTtu7ZDVkbTKQqjY Mar 10 02:04:06.268651 sshd-session[6467]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 10 02:04:06.285403 systemd-logind[1537]: New session 32 of user core. Mar 10 02:04:06.298301 systemd[1]: Started session-32.scope - Session 32 of User core. 
Mar 10 02:04:08.204425 sshd[6470]: Connection closed by 10.0.0.1 port 60308
Mar 10 02:04:08.200112 sshd-session[6467]: pam_unix(sshd:session): session closed for user core
Mar 10 02:04:08.253303 systemd[1]: sshd@31-10.0.0.74:22-10.0.0.1:60308.service: Deactivated successfully.
Mar 10 02:04:08.266959 systemd[1]: session-32.scope: Deactivated successfully.
Mar 10 02:04:08.275531 systemd-logind[1537]: Session 32 logged out. Waiting for processes to exit.
Mar 10 02:04:08.291510 systemd[1]: Started sshd@32-10.0.0.74:22-10.0.0.1:60320.service - OpenSSH per-connection server daemon (10.0.0.1:60320).
Mar 10 02:04:08.304080 systemd-logind[1537]: Removed session 32.
Mar 10 02:04:08.563148 sshd[6501]: Accepted publickey for core from 10.0.0.1 port 60320 ssh2: RSA SHA256:d2FUFdel+KP9pqsSrlp8nTsY/4RJTtu7ZDVkbTKQqjY
Mar 10 02:04:08.561294 sshd-session[6501]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 10 02:04:08.601636 systemd-logind[1537]: New session 33 of user core.
Mar 10 02:04:08.605653 systemd[1]: Started session-33.scope - Session 33 of User core.
Mar 10 02:04:10.128553 sshd[6504]: Connection closed by 10.0.0.1 port 60320
Mar 10 02:04:10.127224 sshd-session[6501]: pam_unix(sshd:session): session closed for user core
Mar 10 02:04:10.168360 systemd[1]: sshd@32-10.0.0.74:22-10.0.0.1:60320.service: Deactivated successfully.
Mar 10 02:04:10.185666 systemd[1]: session-33.scope: Deactivated successfully.
Mar 10 02:04:10.196009 systemd-logind[1537]: Session 33 logged out. Waiting for processes to exit.
Mar 10 02:04:10.223952 systemd[1]: Started sshd@33-10.0.0.74:22-10.0.0.1:54554.service - OpenSSH per-connection server daemon (10.0.0.1:54554).
Mar 10 02:04:10.231647 systemd-logind[1537]: Removed session 33.
Mar 10 02:04:10.468235 sshd[6564]: Accepted publickey for core from 10.0.0.1 port 54554 ssh2: RSA SHA256:d2FUFdel+KP9pqsSrlp8nTsY/4RJTtu7ZDVkbTKQqjY
Mar 10 02:04:10.464794 sshd-session[6564]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 10 02:04:10.507986 systemd-logind[1537]: New session 34 of user core.
Mar 10 02:04:10.519362 systemd[1]: Started session-34.scope - Session 34 of User core.
Mar 10 02:04:11.036081 sshd[6567]: Connection closed by 10.0.0.1 port 54554
Mar 10 02:04:11.038304 sshd-session[6564]: pam_unix(sshd:session): session closed for user core
Mar 10 02:04:11.070850 systemd[1]: sshd@33-10.0.0.74:22-10.0.0.1:54554.service: Deactivated successfully.
Mar 10 02:04:11.088444 systemd[1]: session-34.scope: Deactivated successfully.
Mar 10 02:04:11.103300 systemd-logind[1537]: Session 34 logged out. Waiting for processes to exit.
Mar 10 02:04:11.116579 systemd-logind[1537]: Removed session 34.
Mar 10 02:04:16.089732 systemd[1]: Started sshd@34-10.0.0.74:22-10.0.0.1:54558.service - OpenSSH per-connection server daemon (10.0.0.1:54558).
Mar 10 02:04:16.415678 sshd[6606]: Accepted publickey for core from 10.0.0.1 port 54558 ssh2: RSA SHA256:d2FUFdel+KP9pqsSrlp8nTsY/4RJTtu7ZDVkbTKQqjY
Mar 10 02:04:16.419444 sshd-session[6606]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 10 02:04:16.446741 systemd-logind[1537]: New session 35 of user core.
Mar 10 02:04:16.473132 systemd[1]: Started session-35.scope - Session 35 of User core.
Mar 10 02:04:16.870862 sshd[6609]: Connection closed by 10.0.0.1 port 54558
Mar 10 02:04:16.872849 sshd-session[6606]: pam_unix(sshd:session): session closed for user core
Mar 10 02:04:16.888691 systemd[1]: sshd@34-10.0.0.74:22-10.0.0.1:54558.service: Deactivated successfully.
Mar 10 02:04:16.897668 systemd[1]: session-35.scope: Deactivated successfully.
Mar 10 02:04:16.907941 systemd-logind[1537]: Session 35 logged out. Waiting for processes to exit.
Mar 10 02:04:16.915851 systemd-logind[1537]: Removed session 35.
Mar 10 02:04:21.913018 systemd[1]: Started sshd@35-10.0.0.74:22-10.0.0.1:39888.service - OpenSSH per-connection server daemon (10.0.0.1:39888).
Mar 10 02:04:22.086096 sshd[6661]: Accepted publickey for core from 10.0.0.1 port 39888 ssh2: RSA SHA256:d2FUFdel+KP9pqsSrlp8nTsY/4RJTtu7ZDVkbTKQqjY
Mar 10 02:04:22.085054 sshd-session[6661]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 10 02:04:22.103609 systemd-logind[1537]: New session 36 of user core.
Mar 10 02:04:22.144372 systemd[1]: Started session-36.scope - Session 36 of User core.
Mar 10 02:04:22.480701 sshd[6664]: Connection closed by 10.0.0.1 port 39888
Mar 10 02:04:22.477709 sshd-session[6661]: pam_unix(sshd:session): session closed for user core
Mar 10 02:04:22.519368 systemd[1]: sshd@35-10.0.0.74:22-10.0.0.1:39888.service: Deactivated successfully.
Mar 10 02:04:22.526264 systemd-logind[1537]: Session 36 logged out. Waiting for processes to exit.
Mar 10 02:04:22.532200 systemd[1]: session-36.scope: Deactivated successfully.
Mar 10 02:04:22.540185 systemd-logind[1537]: Removed session 36.
Mar 10 02:04:27.523826 systemd[1]: Started sshd@36-10.0.0.74:22-10.0.0.1:39892.service - OpenSSH per-connection server daemon (10.0.0.1:39892).
Mar 10 02:04:27.693744 sshd[6695]: Accepted publickey for core from 10.0.0.1 port 39892 ssh2: RSA SHA256:d2FUFdel+KP9pqsSrlp8nTsY/4RJTtu7ZDVkbTKQqjY
Mar 10 02:04:27.699148 sshd-session[6695]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 10 02:04:27.718020 systemd-logind[1537]: New session 37 of user core.
Mar 10 02:04:27.742158 systemd[1]: Started session-37.scope - Session 37 of User core.
Mar 10 02:04:28.144046 sshd[6698]: Connection closed by 10.0.0.1 port 39892
Mar 10 02:04:28.149926 sshd-session[6695]: pam_unix(sshd:session): session closed for user core
Mar 10 02:04:28.171952 systemd[1]: sshd@36-10.0.0.74:22-10.0.0.1:39892.service: Deactivated successfully.
Mar 10 02:04:28.180397 systemd[1]: session-37.scope: Deactivated successfully.
Mar 10 02:04:28.195998 systemd-logind[1537]: Session 37 logged out. Waiting for processes to exit.
Mar 10 02:04:28.206678 systemd-logind[1537]: Removed session 37.
Mar 10 02:04:30.205956 containerd[1555]: time="2026-03-10T02:04:30.205667986Z" level=warning msg="container event discarded" container=b5351794f7922dad00c59a201d42fb99c9cc7c588d1e54490f74f20233332f0d type=CONTAINER_CREATED_EVENT
Mar 10 02:04:30.205956 containerd[1555]: time="2026-03-10T02:04:30.205836717Z" level=warning msg="container event discarded" container=b5351794f7922dad00c59a201d42fb99c9cc7c588d1e54490f74f20233332f0d type=CONTAINER_STARTED_EVENT
Mar 10 02:04:31.115672 containerd[1555]: time="2026-03-10T02:04:31.115520307Z" level=warning msg="container event discarded" container=b9bf8657e4302dca28021ef5dcf078a54db24ffe8555590cab115730cc07f763 type=CONTAINER_CREATED_EVENT
Mar 10 02:04:32.270343 containerd[1555]: time="2026-03-10T02:04:32.270145087Z" level=warning msg="container event discarded" container=b9bf8657e4302dca28021ef5dcf078a54db24ffe8555590cab115730cc07f763 type=CONTAINER_STARTED_EVENT
Mar 10 02:04:33.179144 systemd[1]: Started sshd@37-10.0.0.74:22-10.0.0.1:52064.service - OpenSSH per-connection server daemon (10.0.0.1:52064).
Mar 10 02:04:33.320974 sshd[6738]: Accepted publickey for core from 10.0.0.1 port 52064 ssh2: RSA SHA256:d2FUFdel+KP9pqsSrlp8nTsY/4RJTtu7ZDVkbTKQqjY
Mar 10 02:04:33.326058 sshd-session[6738]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 10 02:04:33.346131 systemd-logind[1537]: New session 38 of user core.
Mar 10 02:04:33.357834 systemd[1]: Started session-38.scope - Session 38 of User core.
Mar 10 02:04:33.459602 containerd[1555]: time="2026-03-10T02:04:33.459361708Z" level=warning msg="container event discarded" container=19f257c85cd2e21b18347a0434c9360cc1edbecba0fa88ebb5cf7a0ac3e4352a type=CONTAINER_CREATED_EVENT
Mar 10 02:04:33.459602 containerd[1555]: time="2026-03-10T02:04:33.459546269Z" level=warning msg="container event discarded" container=19f257c85cd2e21b18347a0434c9360cc1edbecba0fa88ebb5cf7a0ac3e4352a type=CONTAINER_STARTED_EVENT
Mar 10 02:04:33.571558 sshd[6741]: Connection closed by 10.0.0.1 port 52064
Mar 10 02:04:33.573375 sshd-session[6738]: pam_unix(sshd:session): session closed for user core
Mar 10 02:04:33.589562 systemd[1]: sshd@37-10.0.0.74:22-10.0.0.1:52064.service: Deactivated successfully.
Mar 10 02:04:33.596116 systemd[1]: session-38.scope: Deactivated successfully.
Mar 10 02:04:33.597985 systemd-logind[1537]: Session 38 logged out. Waiting for processes to exit.
Mar 10 02:04:33.601014 systemd-logind[1537]: Removed session 38.
Mar 10 02:04:38.618594 systemd[1]: Started sshd@38-10.0.0.74:22-10.0.0.1:52074.service - OpenSSH per-connection server daemon (10.0.0.1:52074).
Mar 10 02:04:38.781592 sshd[6756]: Accepted publickey for core from 10.0.0.1 port 52074 ssh2: RSA SHA256:d2FUFdel+KP9pqsSrlp8nTsY/4RJTtu7ZDVkbTKQqjY
Mar 10 02:04:38.785896 sshd-session[6756]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 10 02:04:38.809557 systemd-logind[1537]: New session 39 of user core.
Mar 10 02:04:38.815895 systemd[1]: Started session-39.scope - Session 39 of User core.
Mar 10 02:04:39.069799 sshd[6759]: Connection closed by 10.0.0.1 port 52074
Mar 10 02:04:39.069343 sshd-session[6756]: pam_unix(sshd:session): session closed for user core
Mar 10 02:04:39.078612 systemd[1]: sshd@38-10.0.0.74:22-10.0.0.1:52074.service: Deactivated successfully.
Mar 10 02:04:39.082889 systemd[1]: session-39.scope: Deactivated successfully.
Mar 10 02:04:39.087155 systemd-logind[1537]: Session 39 logged out. Waiting for processes to exit.
Mar 10 02:04:39.096102 systemd-logind[1537]: Removed session 39.
Mar 10 02:04:44.104830 systemd[1]: Started sshd@39-10.0.0.74:22-10.0.0.1:50670.service - OpenSSH per-connection server daemon (10.0.0.1:50670).
Mar 10 02:04:44.274136 sshd[6801]: Accepted publickey for core from 10.0.0.1 port 50670 ssh2: RSA SHA256:d2FUFdel+KP9pqsSrlp8nTsY/4RJTtu7ZDVkbTKQqjY
Mar 10 02:04:44.273078 sshd-session[6801]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 10 02:04:44.300332 systemd-logind[1537]: New session 40 of user core.
Mar 10 02:04:44.329801 systemd[1]: Started session-40.scope - Session 40 of User core.
Mar 10 02:04:44.714158 sshd[6804]: Connection closed by 10.0.0.1 port 50670
Mar 10 02:04:44.715392 sshd-session[6801]: pam_unix(sshd:session): session closed for user core
Mar 10 02:04:44.734003 systemd[1]: sshd@39-10.0.0.74:22-10.0.0.1:50670.service: Deactivated successfully.
Mar 10 02:04:44.742255 systemd[1]: session-40.scope: Deactivated successfully.
Mar 10 02:04:44.745165 systemd-logind[1537]: Session 40 logged out. Waiting for processes to exit.
Mar 10 02:04:44.752553 systemd-logind[1537]: Removed session 40.
Mar 10 02:04:49.343103 containerd[1555]: time="2026-03-10T02:04:49.342879767Z" level=warning msg="container event discarded" container=85e8362e42561029c9609bad251c667d40a02182eeabc5fbe736273649598a58 type=CONTAINER_CREATED_EVENT
Mar 10 02:04:49.762417 systemd[1]: Started sshd@40-10.0.0.74:22-10.0.0.1:50686.service - OpenSSH per-connection server daemon (10.0.0.1:50686).
Mar 10 02:04:49.941096 containerd[1555]: time="2026-03-10T02:04:49.941010546Z" level=warning msg="container event discarded" container=85e8362e42561029c9609bad251c667d40a02182eeabc5fbe736273649598a58 type=CONTAINER_STARTED_EVENT
Mar 10 02:04:50.024023 sshd[6840]: Accepted publickey for core from 10.0.0.1 port 50686 ssh2: RSA SHA256:d2FUFdel+KP9pqsSrlp8nTsY/4RJTtu7ZDVkbTKQqjY
Mar 10 02:04:50.046138 sshd-session[6840]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 10 02:04:50.094847 systemd-logind[1537]: New session 41 of user core.
Mar 10 02:04:50.117010 systemd[1]: Started session-41.scope - Session 41 of User core.
Mar 10 02:04:50.571729 sshd[6843]: Connection closed by 10.0.0.1 port 50686
Mar 10 02:04:50.571365 sshd-session[6840]: pam_unix(sshd:session): session closed for user core
Mar 10 02:04:50.595828 systemd[1]: sshd@40-10.0.0.74:22-10.0.0.1:50686.service: Deactivated successfully.
Mar 10 02:04:50.607601 systemd[1]: session-41.scope: Deactivated successfully.
Mar 10 02:04:50.612914 systemd-logind[1537]: Session 41 logged out. Waiting for processes to exit.
Mar 10 02:04:50.621649 systemd-logind[1537]: Removed session 41.
Mar 10 02:04:55.637068 systemd[1]: Started sshd@41-10.0.0.74:22-10.0.0.1:59456.service - OpenSSH per-connection server daemon (10.0.0.1:59456).
Mar 10 02:04:55.839199 sshd[6857]: Accepted publickey for core from 10.0.0.1 port 59456 ssh2: RSA SHA256:d2FUFdel+KP9pqsSrlp8nTsY/4RJTtu7ZDVkbTKQqjY
Mar 10 02:04:55.847864 sshd-session[6857]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 10 02:04:55.905116 systemd-logind[1537]: New session 42 of user core.
Mar 10 02:04:55.922200 systemd[1]: Started session-42.scope - Session 42 of User core.
Mar 10 02:04:56.349079 sshd[6860]: Connection closed by 10.0.0.1 port 59456
Mar 10 02:04:56.351158 sshd-session[6857]: pam_unix(sshd:session): session closed for user core
Mar 10 02:04:56.377156 systemd[1]: sshd@41-10.0.0.74:22-10.0.0.1:59456.service: Deactivated successfully.
Mar 10 02:04:56.396922 systemd[1]: session-42.scope: Deactivated successfully.
Mar 10 02:04:56.407425 systemd-logind[1537]: Session 42 logged out. Waiting for processes to exit.
Mar 10 02:04:56.428214 systemd-logind[1537]: Removed session 42.