Apr 17 02:50:18.280661 kernel: Linux version 6.12.81-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.0 p8) 14.3.0, GNU ld (Gentoo 2.44 p4) 2.44.0) #1 SMP PREEMPT_DYNAMIC Thu Apr 16 22:00:21 -00 2026
Apr 17 02:50:18.280681 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected verity.usrhash=f73cf1d40ab12c6181d739932b2133dbe986804f7665fccb580a411e6eed38d9
Apr 17 02:50:18.280688 kernel: BIOS-provided physical RAM map:
Apr 17 02:50:18.280694 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009ffff] usable
Apr 17 02:50:18.280699 kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000007fffff] usable
Apr 17 02:50:18.280703 kernel: BIOS-e820: [mem 0x0000000000800000-0x0000000000807fff] ACPI NVS
Apr 17 02:50:18.280708 kernel: BIOS-e820: [mem 0x0000000000808000-0x000000000080afff] usable
Apr 17 02:50:18.280713 kernel: BIOS-e820: [mem 0x000000000080b000-0x000000000080bfff] ACPI NVS
Apr 17 02:50:18.280717 kernel: BIOS-e820: [mem 0x000000000080c000-0x0000000000810fff] usable
Apr 17 02:50:18.280723 kernel: BIOS-e820: [mem 0x0000000000811000-0x00000000008fffff] ACPI NVS
Apr 17 02:50:18.280730 kernel: BIOS-e820: [mem 0x0000000000900000-0x000000009bd3efff] usable
Apr 17 02:50:18.280737 kernel: BIOS-e820: [mem 0x000000009bd3f000-0x000000009bdfffff] reserved
Apr 17 02:50:18.280747 kernel: BIOS-e820: [mem 0x000000009be00000-0x000000009c8ecfff] usable
Apr 17 02:50:18.280754 kernel: BIOS-e820: [mem 0x000000009c8ed000-0x000000009cb6cfff] reserved
Apr 17 02:50:18.280790 kernel: BIOS-e820: [mem 0x000000009cb6d000-0x000000009cb7efff] ACPI data
Apr 17 02:50:18.280800 kernel: BIOS-e820: [mem 0x000000009cb7f000-0x000000009cbfefff] ACPI NVS
Apr 17 02:50:18.280807 kernel: BIOS-e820: [mem 0x000000009cbff000-0x000000009ce90fff] usable
Apr 17 02:50:18.280848 kernel: BIOS-e820: [mem 0x000000009ce91000-0x000000009ce94fff] reserved
Apr 17 02:50:18.280857 kernel: BIOS-e820: [mem 0x000000009ce95000-0x000000009ce96fff] ACPI NVS
Apr 17 02:50:18.280865 kernel: BIOS-e820: [mem 0x000000009ce97000-0x000000009cedbfff] usable
Apr 17 02:50:18.280874 kernel: BIOS-e820: [mem 0x000000009cedc000-0x000000009cf5ffff] reserved
Apr 17 02:50:18.280883 kernel: BIOS-e820: [mem 0x000000009cf60000-0x000000009cffffff] ACPI NVS
Apr 17 02:50:18.280891 kernel: BIOS-e820: [mem 0x00000000e0000000-0x00000000efffffff] reserved
Apr 17 02:50:18.280899 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Apr 17 02:50:18.280907 kernel: BIOS-e820: [mem 0x00000000ffc00000-0x00000000ffffffff] reserved
Apr 17 02:50:18.280915 kernel: NX (Execute Disable) protection: active
Apr 17 02:50:18.280924 kernel: APIC: Static calls initialized
Apr 17 02:50:18.280933 kernel: e820: update [mem 0x9b31e018-0x9b327c57] usable ==> usable
Apr 17 02:50:18.280945 kernel: e820: update [mem 0x9b2e1018-0x9b31de57] usable ==> usable
Apr 17 02:50:18.280954 kernel: extended physical RAM map:
Apr 17 02:50:18.280962 kernel: reserve setup_data: [mem 0x0000000000000000-0x000000000009ffff] usable
Apr 17 02:50:18.280971 kernel: reserve setup_data: [mem 0x0000000000100000-0x00000000007fffff] usable
Apr 17 02:50:18.280979 kernel: reserve setup_data: [mem 0x0000000000800000-0x0000000000807fff] ACPI NVS
Apr 17 02:50:18.280987 kernel: reserve setup_data: [mem 0x0000000000808000-0x000000000080afff] usable
Apr 17 02:50:18.280996 kernel: reserve setup_data: [mem 0x000000000080b000-0x000000000080bfff] ACPI NVS
Apr 17 02:50:18.281004 kernel: reserve setup_data: [mem 0x000000000080c000-0x0000000000810fff] usable
Apr 17 02:50:18.281013 kernel: reserve setup_data: [mem 0x0000000000811000-0x00000000008fffff] ACPI NVS
Apr 17 02:50:18.281023 kernel: reserve setup_data: [mem 0x0000000000900000-0x000000009b2e1017] usable
Apr 17 02:50:18.281032 kernel: reserve setup_data: [mem 0x000000009b2e1018-0x000000009b31de57] usable
Apr 17 02:50:18.281042 kernel: reserve setup_data: [mem 0x000000009b31de58-0x000000009b31e017] usable
Apr 17 02:50:18.281054 kernel: reserve setup_data: [mem 0x000000009b31e018-0x000000009b327c57] usable
Apr 17 02:50:18.281062 kernel: reserve setup_data: [mem 0x000000009b327c58-0x000000009bd3efff] usable
Apr 17 02:50:18.281071 kernel: reserve setup_data: [mem 0x000000009bd3f000-0x000000009bdfffff] reserved
Apr 17 02:50:18.281079 kernel: reserve setup_data: [mem 0x000000009be00000-0x000000009c8ecfff] usable
Apr 17 02:50:18.281091 kernel: reserve setup_data: [mem 0x000000009c8ed000-0x000000009cb6cfff] reserved
Apr 17 02:50:18.281101 kernel: reserve setup_data: [mem 0x000000009cb6d000-0x000000009cb7efff] ACPI data
Apr 17 02:50:18.281110 kernel: reserve setup_data: [mem 0x000000009cb7f000-0x000000009cbfefff] ACPI NVS
Apr 17 02:50:18.281118 kernel: reserve setup_data: [mem 0x000000009cbff000-0x000000009ce90fff] usable
Apr 17 02:50:18.281127 kernel: reserve setup_data: [mem 0x000000009ce91000-0x000000009ce94fff] reserved
Apr 17 02:50:18.281136 kernel: reserve setup_data: [mem 0x000000009ce95000-0x000000009ce96fff] ACPI NVS
Apr 17 02:50:18.281144 kernel: reserve setup_data: [mem 0x000000009ce97000-0x000000009cedbfff] usable
Apr 17 02:50:18.281153 kernel: reserve setup_data: [mem 0x000000009cedc000-0x000000009cf5ffff] reserved
Apr 17 02:50:18.281161 kernel: reserve setup_data: [mem 0x000000009cf60000-0x000000009cffffff] ACPI NVS
Apr 17 02:50:18.281170 kernel: reserve setup_data: [mem 0x00000000e0000000-0x00000000efffffff] reserved
Apr 17 02:50:18.281178 kernel: reserve setup_data: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Apr 17 02:50:18.281189 kernel: reserve setup_data: [mem 0x00000000ffc00000-0x00000000ffffffff] reserved
Apr 17 02:50:18.281198 kernel: efi: EFI v2.7 by EDK II
Apr 17 02:50:18.281207 kernel: efi: SMBIOS=0x9c988000 ACPI=0x9cb7e000 ACPI 2.0=0x9cb7e014 MEMATTR=0x9b9e4198 RNG=0x9cb73018
Apr 17 02:50:18.281216 kernel: random: crng init done
Apr 17 02:50:18.281225 kernel: efi: Remove mem151: MMIO range=[0xffc00000-0xffffffff] (4MB) from e820 map
Apr 17 02:50:18.281234 kernel: e820: remove [mem 0xffc00000-0xffffffff] reserved
Apr 17 02:50:18.281243 kernel: secureboot: Secure boot disabled
Apr 17 02:50:18.281252 kernel: SMBIOS 2.8 present.
Apr 17 02:50:18.281262 kernel: DMI: QEMU Standard PC (Q35 + ICH9, 2009), BIOS unknown 02/02/2022
Apr 17 02:50:18.281272 kernel: DMI: Memory slots populated: 1/1
Apr 17 02:50:18.281280 kernel: Hypervisor detected: KVM
Apr 17 02:50:18.281289 kernel: last_pfn = 0x9cedc max_arch_pfn = 0x10000000000
Apr 17 02:50:18.281299 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Apr 17 02:50:18.281309 kernel: kvm-clock: using sched offset of 7344772465 cycles
Apr 17 02:50:18.281319 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Apr 17 02:50:18.281328 kernel: tsc: Detected 2793.438 MHz processor
Apr 17 02:50:18.281333 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Apr 17 02:50:18.281339 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Apr 17 02:50:18.281344 kernel: last_pfn = 0x9cedc max_arch_pfn = 0x10000000000
Apr 17 02:50:18.281349 kernel: MTRR map: 4 entries (2 fixed + 2 variable; max 18), built from 8 variable MTRRs
Apr 17 02:50:18.281355 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Apr 17 02:50:18.281361 kernel: Using GB pages for direct mapping
Apr 17 02:50:18.281367 kernel: ACPI: Early table checksum verification disabled
Apr 17 02:50:18.281372 kernel: ACPI: RSDP 0x000000009CB7E014 000024 (v02 BOCHS )
Apr 17 02:50:18.281377 kernel: ACPI: XSDT 0x000000009CB7D0E8 000054 (v01 BOCHS BXPC 00000001 01000013)
Apr 17 02:50:18.281382 kernel: ACPI: FACP 0x000000009CB79000 0000F4 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Apr 17 02:50:18.281387 kernel: ACPI: DSDT 0x000000009CB7A000 0021BA (v01 BOCHS BXPC 00000001 BXPC 00000001)
Apr 17 02:50:18.281392 kernel: ACPI: FACS 0x000000009CBDD000 000040
Apr 17 02:50:18.281398 kernel: ACPI: APIC 0x000000009CB78000 000090 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Apr 17 02:50:18.281403 kernel: ACPI: HPET 0x000000009CB77000 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Apr 17 02:50:18.281409 kernel: ACPI: MCFG 0x000000009CB76000 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Apr 17 02:50:18.281415 kernel: ACPI: WAET 0x000000009CB75000 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Apr 17 02:50:18.281420 kernel: ACPI: BGRT 0x000000009CB74000 000038 (v01 INTEL EDK2 00000002 01000013)
Apr 17 02:50:18.281425 kernel: ACPI: Reserving FACP table memory at [mem 0x9cb79000-0x9cb790f3]
Apr 17 02:50:18.281431 kernel: ACPI: Reserving DSDT table memory at [mem 0x9cb7a000-0x9cb7c1b9]
Apr 17 02:50:18.281436 kernel: ACPI: Reserving FACS table memory at [mem 0x9cbdd000-0x9cbdd03f]
Apr 17 02:50:18.281441 kernel: ACPI: Reserving APIC table memory at [mem 0x9cb78000-0x9cb7808f]
Apr 17 02:50:18.281446 kernel: ACPI: Reserving HPET table memory at [mem 0x9cb77000-0x9cb77037]
Apr 17 02:50:18.281453 kernel: ACPI: Reserving MCFG table memory at [mem 0x9cb76000-0x9cb7603b]
Apr 17 02:50:18.281458 kernel: ACPI: Reserving WAET table memory at [mem 0x9cb75000-0x9cb75027]
Apr 17 02:50:18.281463 kernel: ACPI: Reserving BGRT table memory at [mem 0x9cb74000-0x9cb74037]
Apr 17 02:50:18.281471 kernel: No NUMA configuration found
Apr 17 02:50:18.281480 kernel: Faking a node at [mem 0x0000000000000000-0x000000009cedbfff]
Apr 17 02:50:18.281488 kernel: NODE_DATA(0) allocated [mem 0x9ce36dc0-0x9ce3dfff]
Apr 17 02:50:18.281497 kernel: Zone ranges:
Apr 17 02:50:18.281506 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Apr 17 02:50:18.281513 kernel: DMA32 [mem 0x0000000001000000-0x000000009cedbfff]
Apr 17 02:50:18.281522 kernel: Normal empty
Apr 17 02:50:18.281533 kernel: Device empty
Apr 17 02:50:18.281542 kernel: Movable zone start for each node
Apr 17 02:50:18.281551 kernel: Early memory node ranges
Apr 17 02:50:18.281560 kernel: node 0: [mem 0x0000000000001000-0x000000000009ffff]
Apr 17 02:50:18.281569 kernel: node 0: [mem 0x0000000000100000-0x00000000007fffff]
Apr 17 02:50:18.281576 kernel: node 0: [mem 0x0000000000808000-0x000000000080afff]
Apr 17 02:50:18.281581 kernel: node 0: [mem 0x000000000080c000-0x0000000000810fff]
Apr 17 02:50:18.281586 kernel: node 0: [mem 0x0000000000900000-0x000000009bd3efff]
Apr 17 02:50:18.281591 kernel: node 0: [mem 0x000000009be00000-0x000000009c8ecfff]
Apr 17 02:50:18.281598 kernel: node 0: [mem 0x000000009cbff000-0x000000009ce90fff]
Apr 17 02:50:18.281603 kernel: node 0: [mem 0x000000009ce97000-0x000000009cedbfff]
Apr 17 02:50:18.281608 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000009cedbfff]
Apr 17 02:50:18.281614 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Apr 17 02:50:18.281619 kernel: On node 0, zone DMA: 96 pages in unavailable ranges
Apr 17 02:50:18.281624 kernel: On node 0, zone DMA: 8 pages in unavailable ranges
Apr 17 02:50:18.281634 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Apr 17 02:50:18.281643 kernel: On node 0, zone DMA: 239 pages in unavailable ranges
Apr 17 02:50:18.281653 kernel: On node 0, zone DMA32: 193 pages in unavailable ranges
Apr 17 02:50:18.281662 kernel: On node 0, zone DMA32: 786 pages in unavailable ranges
Apr 17 02:50:18.281671 kernel: On node 0, zone DMA32: 6 pages in unavailable ranges
Apr 17 02:50:18.281680 kernel: On node 0, zone DMA32: 12580 pages in unavailable ranges
Apr 17 02:50:18.281691 kernel: ACPI: PM-Timer IO Port: 0x608
Apr 17 02:50:18.281700 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Apr 17 02:50:18.281710 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Apr 17 02:50:18.281780 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Apr 17 02:50:18.281791 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Apr 17 02:50:18.281803 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Apr 17 02:50:18.281812 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Apr 17 02:50:18.281853 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Apr 17 02:50:18.281864 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Apr 17 02:50:18.281873 kernel: ACPI: HPET id: 0x8086a201 base: 0xfed00000
Apr 17 02:50:18.281992 kernel: TSC deadline timer available
Apr 17 02:50:18.282004 kernel: CPU topo: Max. logical packages: 1
Apr 17 02:50:18.282014 kernel: CPU topo: Max. logical dies: 1
Apr 17 02:50:18.282024 kernel: CPU topo: Max. dies per package: 1
Apr 17 02:50:18.282037 kernel: CPU topo: Max. threads per core: 1
Apr 17 02:50:18.282046 kernel: CPU topo: Num. cores per package: 4
Apr 17 02:50:18.282056 kernel: CPU topo: Num. threads per package: 4
Apr 17 02:50:18.282065 kernel: CPU topo: Allowing 4 present CPUs plus 0 hotplug CPUs
Apr 17 02:50:18.282075 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Apr 17 02:50:18.282084 kernel: kvm-guest: KVM setup pv remote TLB flush
Apr 17 02:50:18.282094 kernel: kvm-guest: setup PV sched yield
Apr 17 02:50:18.282103 kernel: [mem 0x9d000000-0xdfffffff] available for PCI devices
Apr 17 02:50:18.282115 kernel: Booting paravirtualized kernel on KVM
Apr 17 02:50:18.282123 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Apr 17 02:50:18.282129 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:4 nr_cpu_ids:4 nr_node_ids:1
Apr 17 02:50:18.282135 kernel: percpu: Embedded 60 pages/cpu s207448 r8192 d30120 u524288
Apr 17 02:50:18.282141 kernel: pcpu-alloc: s207448 r8192 d30120 u524288 alloc=1*2097152
Apr 17 02:50:18.282147 kernel: pcpu-alloc: [0] 0 1 2 3
Apr 17 02:50:18.282152 kernel: kvm-guest: PV spinlocks enabled
Apr 17 02:50:18.282158 kernel: PV qspinlock hash table entries: 256 (order: 0, 4096 bytes, linear)
Apr 17 02:50:18.282165 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected verity.usrhash=f73cf1d40ab12c6181d739932b2133dbe986804f7665fccb580a411e6eed38d9
Apr 17 02:50:18.282173 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Apr 17 02:50:18.282179 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Apr 17 02:50:18.282184 kernel: Fallback order for Node 0: 0
Apr 17 02:50:18.282190 kernel: Built 1 zonelists, mobility grouping on. Total pages: 641450
Apr 17 02:50:18.282196 kernel: Policy zone: DMA32
Apr 17 02:50:18.282202 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Apr 17 02:50:18.282207 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=4, Nodes=1
Apr 17 02:50:18.282213 kernel: ftrace: allocating 40126 entries in 157 pages
Apr 17 02:50:18.282219 kernel: ftrace: allocated 157 pages with 5 groups
Apr 17 02:50:18.282226 kernel: Dynamic Preempt: voluntary
Apr 17 02:50:18.282231 kernel: rcu: Preemptible hierarchical RCU implementation.
Apr 17 02:50:18.282238 kernel: rcu: RCU event tracing is enabled.
Apr 17 02:50:18.282244 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=4.
Apr 17 02:50:18.282250 kernel: Trampoline variant of Tasks RCU enabled.
Apr 17 02:50:18.282256 kernel: Rude variant of Tasks RCU enabled.
Apr 17 02:50:18.282261 kernel: Tracing variant of Tasks RCU enabled.
Apr 17 02:50:18.282267 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Apr 17 02:50:18.282273 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=4
Apr 17 02:50:18.282280 kernel: RCU Tasks: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
Apr 17 02:50:18.282286 kernel: RCU Tasks Rude: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
Apr 17 02:50:18.282292 kernel: RCU Tasks Trace: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
Apr 17 02:50:18.282298 kernel: NR_IRQS: 33024, nr_irqs: 456, preallocated irqs: 16
Apr 17 02:50:18.282303 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Apr 17 02:50:18.282309 kernel: Console: colour dummy device 80x25
Apr 17 02:50:18.282315 kernel: printk: legacy console [ttyS0] enabled
Apr 17 02:50:18.282321 kernel: ACPI: Core revision 20240827
Apr 17 02:50:18.282327 kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 19112604467 ns
Apr 17 02:50:18.282334 kernel: APIC: Switch to symmetric I/O mode setup
Apr 17 02:50:18.282340 kernel: x2apic enabled
Apr 17 02:50:18.282346 kernel: APIC: Switched APIC routing to: physical x2apic
Apr 17 02:50:18.282351 kernel: kvm-guest: APIC: send_IPI_mask() replaced with kvm_send_ipi_mask()
Apr 17 02:50:18.282357 kernel: kvm-guest: APIC: send_IPI_mask_allbutself() replaced with kvm_send_ipi_mask_allbutself()
Apr 17 02:50:18.282363 kernel: kvm-guest: setup PV IPIs
Apr 17 02:50:18.282369 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1
Apr 17 02:50:18.282374 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x284409db922, max_idle_ns: 440795228871 ns
Apr 17 02:50:18.282380 kernel: Calibrating delay loop (skipped) preset value.. 5586.87 BogoMIPS (lpj=2793438)
Apr 17 02:50:18.282387 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Apr 17 02:50:18.282393 kernel: Last level iTLB entries: 4KB 0, 2MB 0, 4MB 0
Apr 17 02:50:18.282399 kernel: Last level dTLB entries: 4KB 0, 2MB 0, 4MB 0, 1GB 0
Apr 17 02:50:18.282404 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Apr 17 02:50:18.282410 kernel: Spectre V2 : Mitigation: Retpolines
Apr 17 02:50:18.282416 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Apr 17 02:50:18.282421 kernel: RETBleed: WARNING: Spectre v2 mitigation leaves CPU vulnerable to RETBleed attacks, data leaks possible!
Apr 17 02:50:18.282427 kernel: RETBleed: Vulnerable
Apr 17 02:50:18.282433 kernel: Speculative Store Bypass: Vulnerable
Apr 17 02:50:18.282440 kernel: MMIO Stale Data: Vulnerable: Clear CPU buffers attempted, no microcode
Apr 17 02:50:18.282446 kernel: GDS: Unknown: Dependent on hypervisor status
Apr 17 02:50:18.282452 kernel: active return thunk: its_return_thunk
Apr 17 02:50:18.282457 kernel: ITS: Mitigation: Aligned branch/return thunks
Apr 17 02:50:18.282463 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Apr 17 02:50:18.282469 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Apr 17 02:50:18.282474 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Apr 17 02:50:18.282480 kernel: x86/fpu: Supporting XSAVE feature 0x020: 'AVX-512 opmask'
Apr 17 02:50:18.282486 kernel: x86/fpu: Supporting XSAVE feature 0x040: 'AVX-512 Hi256'
Apr 17 02:50:18.282493 kernel: x86/fpu: Supporting XSAVE feature 0x080: 'AVX-512 ZMM_Hi256'
Apr 17 02:50:18.282498 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
Apr 17 02:50:18.282504 kernel: x86/fpu: xstate_offset[5]: 832, xstate_sizes[5]: 64
Apr 17 02:50:18.282510 kernel: x86/fpu: xstate_offset[6]: 896, xstate_sizes[6]: 512
Apr 17 02:50:18.282515 kernel: x86/fpu: xstate_offset[7]: 1408, xstate_sizes[7]: 1024
Apr 17 02:50:18.282521 kernel: x86/fpu: Enabled xstate features 0xe7, context size is 2432 bytes, using 'compacted' format.
Apr 17 02:50:18.282527 kernel: Freeing SMP alternatives memory: 32K
Apr 17 02:50:18.282533 kernel: pid_max: default: 32768 minimum: 301
Apr 17 02:50:18.282538 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
Apr 17 02:50:18.282545 kernel: landlock: Up and running.
Apr 17 02:50:18.282551 kernel: SELinux: Initializing.
Apr 17 02:50:18.282557 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Apr 17 02:50:18.282562 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Apr 17 02:50:18.282568 kernel: smpboot: CPU0: Intel(R) Xeon(R) Platinum 8370C CPU @ 2.80GHz (family: 0x6, model: 0x6a, stepping: 0x6)
Apr 17 02:50:18.282574 kernel: Performance Events: unsupported p6 CPU model 106 no PMU driver, software events only.
Apr 17 02:50:18.282579 kernel: signal: max sigframe size: 3632
Apr 17 02:50:18.282585 kernel: rcu: Hierarchical SRCU implementation.
Apr 17 02:50:18.282591 kernel: rcu: Max phase no-delay instances is 400.
Apr 17 02:50:18.282598 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level
Apr 17 02:50:18.282604 kernel: NMI watchdog: Perf NMI watchdog permanently disabled
Apr 17 02:50:18.282609 kernel: smp: Bringing up secondary CPUs ...
Apr 17 02:50:18.282615 kernel: smpboot: x86: Booting SMP configuration:
Apr 17 02:50:18.282621 kernel: .... node #0, CPUs: #1 #2 #3
Apr 17 02:50:18.282627 kernel: smp: Brought up 1 node, 4 CPUs
Apr 17 02:50:18.282632 kernel: smpboot: Total of 4 processors activated (22347.50 BogoMIPS)
Apr 17 02:50:18.282639 kernel: Memory: 2374696K/2565800K available (14336K kernel code, 2453K rwdata, 26076K rodata, 46216K init, 2532K bss, 185212K reserved, 0K cma-reserved)
Apr 17 02:50:18.282644 kernel: devtmpfs: initialized
Apr 17 02:50:18.282651 kernel: x86/mm: Memory block size: 128MB
Apr 17 02:50:18.282657 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x00800000-0x00807fff] (32768 bytes)
Apr 17 02:50:18.282663 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x0080b000-0x0080bfff] (4096 bytes)
Apr 17 02:50:18.282668 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x00811000-0x008fffff] (978944 bytes)
Apr 17 02:50:18.282674 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x9cb7f000-0x9cbfefff] (524288 bytes)
Apr 17 02:50:18.282680 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x9ce95000-0x9ce96fff] (8192 bytes)
Apr 17 02:50:18.282685 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x9cf60000-0x9cffffff] (655360 bytes)
Apr 17 02:50:18.282691 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Apr 17 02:50:18.282698 kernel: futex hash table entries: 1024 (order: 4, 65536 bytes, linear)
Apr 17 02:50:18.282704 kernel: pinctrl core: initialized pinctrl subsystem
Apr 17 02:50:18.282710 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Apr 17 02:50:18.282716 kernel: audit: initializing netlink subsys (disabled)
Apr 17 02:50:18.282722 kernel: audit: type=2000 audit(1776394213.071:1): state=initialized audit_enabled=0 res=1
Apr 17 02:50:18.282731 kernel: thermal_sys: Registered thermal governor 'step_wise'
Apr 17 02:50:18.282741 kernel: thermal_sys: Registered thermal governor 'user_space'
Apr 17 02:50:18.282750 kernel: cpuidle: using governor menu
Apr 17 02:50:18.282758 kernel: efi: Freeing EFI boot services memory: 38812K
Apr 17 02:50:18.282959 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Apr 17 02:50:18.282970 kernel: dca service started, version 1.12.1
Apr 17 02:50:18.282978 kernel: PCI: ECAM [mem 0xe0000000-0xefffffff] (base 0xe0000000) for domain 0000 [bus 00-ff]
Apr 17 02:50:18.282986 kernel: PCI: Using configuration type 1 for base access
Apr 17 02:50:18.282995 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Apr 17 02:50:18.283005 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Apr 17 02:50:18.283015 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Apr 17 02:50:18.283026 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Apr 17 02:50:18.283036 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Apr 17 02:50:18.283048 kernel: ACPI: Added _OSI(Module Device)
Apr 17 02:50:18.283058 kernel: ACPI: Added _OSI(Processor Device)
Apr 17 02:50:18.283067 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Apr 17 02:50:18.283090 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Apr 17 02:50:18.283101 kernel: ACPI: Interpreter enabled
Apr 17 02:50:18.283114 kernel: ACPI: PM: (supports S0 S3 S5)
Apr 17 02:50:18.283123 kernel: ACPI: Using IOAPIC for interrupt routing
Apr 17 02:50:18.283133 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Apr 17 02:50:18.283143 kernel: PCI: Using E820 reservations for host bridge windows
Apr 17 02:50:18.283156 kernel: ACPI: Enabled 2 GPEs in block 00 to 3F
Apr 17 02:50:18.283167 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Apr 17 02:50:18.283329 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Apr 17 02:50:18.283413 kernel: acpi PNP0A08:00: _OSC: platform does not support [PCIeHotplug LTR]
Apr 17 02:50:18.283487 kernel: acpi PNP0A08:00: _OSC: OS now controls [PME AER PCIeCapability]
Apr 17 02:50:18.283497 kernel: PCI host bridge to bus 0000:00
Apr 17 02:50:18.283578 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
Apr 17 02:50:18.283650 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window]
Apr 17 02:50:18.283717 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Apr 17 02:50:18.284023 kernel: pci_bus 0000:00: root bus resource [mem 0x9d000000-0xdfffffff window]
Apr 17 02:50:18.284099 kernel: pci_bus 0000:00: root bus resource [mem 0xf0000000-0xfebfffff window]
Apr 17 02:50:18.284166 kernel: pci_bus 0000:00: root bus resource [mem 0x380000000000-0x3807ffffffff window]
Apr 17 02:50:18.284231 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Apr 17 02:50:18.284332 kernel: pci 0000:00:00.0: [8086:29c0] type 00 class 0x060000 conventional PCI endpoint
Apr 17 02:50:18.284483 kernel: pci 0000:00:01.0: [1234:1111] type 00 class 0x030000 conventional PCI endpoint
Apr 17 02:50:18.284566 kernel: pci 0000:00:01.0: BAR 0 [mem 0xc0000000-0xc0ffffff pref]
Apr 17 02:50:18.284641 kernel: pci 0000:00:01.0: BAR 2 [mem 0xc1044000-0xc1044fff]
Apr 17 02:50:18.284718 kernel: pci 0000:00:01.0: ROM [mem 0xffff0000-0xffffffff pref]
Apr 17 02:50:18.284945 kernel: pci 0000:00:01.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Apr 17 02:50:18.285061 kernel: pci 0000:00:02.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Apr 17 02:50:18.285148 kernel: pci 0000:00:02.0: BAR 0 [io 0x6100-0x611f]
Apr 17 02:50:18.285226 kernel: pci 0000:00:02.0: BAR 1 [mem 0xc1043000-0xc1043fff]
Apr 17 02:50:18.285305 kernel: pci 0000:00:02.0: BAR 4 [mem 0x380000000000-0x380000003fff 64bit pref]
Apr 17 02:50:18.285393 kernel: pci 0000:00:03.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
Apr 17 02:50:18.285470 kernel: pci 0000:00:03.0: BAR 0 [io 0x6000-0x607f]
Apr 17 02:50:18.285543 kernel: pci 0000:00:03.0: BAR 1 [mem 0xc1042000-0xc1042fff]
Apr 17 02:50:18.285618 kernel: pci 0000:00:03.0: BAR 4 [mem 0x380000004000-0x380000007fff 64bit pref]
Apr 17 02:50:18.285716 kernel: pci 0000:00:04.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Apr 17 02:50:18.286292 kernel: pci 0000:00:04.0: BAR 0 [io 0x60e0-0x60ff]
Apr 17 02:50:18.286384 kernel: pci 0000:00:04.0: BAR 1 [mem 0xc1041000-0xc1041fff]
Apr 17 02:50:18.286465 kernel: pci 0000:00:04.0: BAR 4 [mem 0x380000008000-0x38000000bfff 64bit pref]
Apr 17 02:50:18.286545 kernel: pci 0000:00:04.0: ROM [mem 0xfffc0000-0xffffffff pref]
Apr 17 02:50:18.286631 kernel: pci 0000:00:1f.0: [8086:2918] type 00 class 0x060100 conventional PCI endpoint
Apr 17 02:50:18.286709 kernel: pci 0000:00:1f.0: quirk: [io 0x0600-0x067f] claimed by ICH6 ACPI/GPIO/TCO
Apr 17 02:50:18.286869 kernel: pci 0000:00:1f.2: [8086:2922] type 00 class 0x010601 conventional PCI endpoint
Apr 17 02:50:18.286952 kernel: pci 0000:00:1f.2: BAR 4 [io 0x60c0-0x60df]
Apr 17 02:50:18.287029 kernel: pci 0000:00:1f.2: BAR 5 [mem 0xc1040000-0xc1040fff]
Apr 17 02:50:18.287120 kernel: pci 0000:00:1f.3: [8086:2930] type 00 class 0x0c0500 conventional PCI endpoint
Apr 17 02:50:18.287199 kernel: pci 0000:00:1f.3: BAR 4 [io 0x6080-0x60bf]
Apr 17 02:50:18.287212 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Apr 17 02:50:18.287223 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Apr 17 02:50:18.287240 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Apr 17 02:50:18.287250 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Apr 17 02:50:18.287261 kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 10
Apr 17 02:50:18.287271 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 10
Apr 17 02:50:18.287281 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 11
Apr 17 02:50:18.287293 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 11
Apr 17 02:50:18.287302 kernel: ACPI: PCI: Interrupt link GSIA configured for IRQ 16
Apr 17 02:50:18.287313 kernel: ACPI: PCI: Interrupt link GSIB configured for IRQ 17
Apr 17 02:50:18.287324 kernel: ACPI: PCI: Interrupt link GSIC configured for IRQ 18
Apr 17 02:50:18.287336 kernel: ACPI: PCI: Interrupt link GSID configured for IRQ 19
Apr 17 02:50:18.287346 kernel: ACPI: PCI: Interrupt link GSIE configured for IRQ 20
Apr 17 02:50:18.287357 kernel: ACPI: PCI: Interrupt link GSIF configured for IRQ 21
Apr 17 02:50:18.287368 kernel: ACPI: PCI: Interrupt link GSIG configured for IRQ 22
Apr 17 02:50:18.287378 kernel: ACPI: PCI: Interrupt link GSIH configured for IRQ 23
Apr 17 02:50:18.287389 kernel: iommu: Default domain type: Translated
Apr 17 02:50:18.287398 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Apr 17 02:50:18.287408 kernel: efivars: Registered efivars operations
Apr 17 02:50:18.287417 kernel: PCI: Using ACPI for IRQ routing
Apr 17 02:50:18.287429 kernel: PCI: pci_cache_line_size set to 64 bytes
Apr 17 02:50:18.287438 kernel: e820: reserve RAM buffer [mem 0x0080b000-0x008fffff]
Apr 17 02:50:18.287448 kernel: e820: reserve RAM buffer [mem 0x00811000-0x008fffff]
Apr 17 02:50:18.287458 kernel: e820: reserve RAM buffer [mem 0x9b2e1018-0x9bffffff]
Apr 17 02:50:18.287467 kernel: e820: reserve RAM buffer [mem 0x9b31e018-0x9bffffff]
Apr 17 02:50:18.287477 kernel: e820: reserve RAM buffer [mem 0x9bd3f000-0x9bffffff]
Apr 17 02:50:18.287487 kernel: e820: reserve RAM buffer [mem 0x9c8ed000-0x9fffffff]
Apr 17 02:50:18.287497 kernel: e820: reserve RAM buffer [mem 0x9ce91000-0x9fffffff]
Apr 17 02:50:18.287508 kernel: e820: reserve RAM buffer [mem 0x9cedc000-0x9fffffff]
Apr 17 02:50:18.287591 kernel: pci 0000:00:01.0: vgaarb: setting as boot VGA device
Apr 17 02:50:18.287671 kernel: pci 0000:00:01.0: vgaarb: bridge control possible
Apr 17 02:50:18.287751 kernel: pci 0000:00:01.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Apr 17 02:50:18.288003 kernel: vgaarb: loaded
Apr 17 02:50:18.288020 kernel: hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0
Apr 17 02:50:18.288031 kernel: hpet0: 3 comparators, 64-bit 100.000000 MHz counter
Apr 17 02:50:18.288041 kernel: clocksource: Switched to clocksource kvm-clock
Apr 17 02:50:18.288051 kernel: VFS: Disk quotas dquot_6.6.0
Apr 17 02:50:18.288065 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Apr 17 02:50:18.288075 kernel: pnp: PnP ACPI init
Apr 17 02:50:18.288194 kernel: system 00:05: [mem 0xe0000000-0xefffffff window] has been reserved
Apr 17 02:50:18.288208 kernel: pnp: PnP ACPI: found 6 devices
Apr 17 02:50:18.288219 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Apr 17 02:50:18.288246 kernel: NET: Registered PF_INET protocol family
Apr 17 02:50:18.288257 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear)
Apr 17 02:50:18.288266 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear)
Apr 17 02:50:18.289984 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Apr 17 02:50:18.289999 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear)
Apr 17 02:50:18.290009 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear)
Apr 17 02:50:18.290018 kernel: TCP: Hash tables configured (established 32768 bind 32768)
Apr 17 02:50:18.290029 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear)
Apr 17 02:50:18.290042 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear)
Apr 17 02:50:18.290053 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Apr 17 02:50:18.290065 kernel: NET: Registered PF_XDP protocol family
Apr 17 02:50:18.290177 kernel: pci 0000:00:04.0: ROM [mem 0xfffc0000-0xffffffff pref]: can't claim; no compatible bridge window
Apr 17 02:50:18.290293 kernel: pci 0000:00:04.0: ROM [mem 0x9d000000-0x9d03ffff pref]: assigned
Apr 17 02:50:18.290372 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window]
Apr 17 02:50:18.290442 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window]
Apr 17 02:50:18.290511 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Apr 17 02:50:18.290578 kernel: pci_bus 0000:00: resource 7 [mem 0x9d000000-0xdfffffff window]
Apr 17 02:50:18.290641 kernel: pci_bus 0000:00: resource 8 [mem 0xf0000000-0xfebfffff window]
Apr 17 02:50:18.290757 kernel: pci_bus 0000:00: resource 9 [mem 0x380000000000-0x3807ffffffff window]
Apr 17 02:50:18.290795 kernel: PCI: CLS 0 bytes, default 64
Apr 17 02:50:18.290806 kernel: RAPL PMU: API unit is 2^-32 Joules, 0 fixed counters, 10737418240 ms ovfl timer
Apr 17 02:50:18.290845 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x284409db922, max_idle_ns: 440795228871 ns
Apr 17 02:50:18.290872 kernel: Initialise system trusted keyrings
Apr 17 02:50:18.290884 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0
Apr 17 02:50:18.290896 kernel: Key type asymmetric registered
Apr 17 02:50:18.290904 kernel: Asymmetric key parser 'x509' registered
Apr 17 02:50:18.290913 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250)
Apr 17 02:50:18.290922 kernel: io scheduler mq-deadline registered
Apr 17 02:50:18.290931 kernel: io scheduler kyber registered
Apr 17 02:50:18.290940 kernel: io scheduler bfq registered
Apr 17 02:50:18.290949 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00
Apr 17 02:50:18.290959 kernel: ACPI: \_SB_.GSIG: Enabled at IRQ 22
Apr 17 02:50:18.290969 kernel: ACPI: \_SB_.GSIH: Enabled at IRQ 23
Apr 17 02:50:18.290979 kernel: ACPI: \_SB_.GSIE: Enabled at IRQ 20
Apr 17 02:50:18.290990 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Apr 17 02:50:18.291000 kernel: 00:03: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Apr 17 02:50:18.291010 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Apr 17 02:50:18.291019 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Apr 17 02:50:18.291029 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Apr 17 02:50:18.291121 kernel: rtc_cmos 00:04: RTC can wake from S4
Apr 17 02:50:18.291135 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0
Apr 17 02:50:18.291236 kernel: rtc_cmos 00:04: registered as rtc0
Apr 17 02:50:18.291314 kernel: rtc_cmos 00:04: setting system clock to 2026-04-17T02:50:17 UTC (1776394217)
Apr 17 02:50:18.291378 kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Apr 17 02:50:18.291391 kernel: intel_pstate: CPU model not supported
Apr 17 02:50:18.291401 kernel: efifb: probing for efifb
Apr 17 02:50:18.291411 kernel: efifb: framebuffer at 0xc0000000, using 4000k, total 4000k
Apr 17 02:50:18.291422 kernel: efifb: mode is 1280x800x32, linelength=5120, pages=1
Apr 17 02:50:18.291432 kernel: efifb: scrolling: redraw
Apr 17 02:50:18.291442 kernel: efifb: Truecolor: size=8:8:8:8, shift=24:16:8:0
Apr 17 02:50:18.291452 kernel: Console: switching to colour frame buffer device 160x50
Apr 17 02:50:18.291465 kernel: fb0: EFI VGA frame buffer device
Apr 17 02:50:18.291476 kernel: pstore: Using crash dump compression: deflate
Apr 17 02:50:18.291486 kernel: pstore: Registered efi_pstore as persistent store backend
Apr 17 02:50:18.291497 kernel: NET: Registered PF_INET6 protocol family
Apr 17 02:50:18.291508 kernel: Segment Routing with IPv6
Apr 17 02:50:18.291518 kernel: In-situ OAM (IOAM) with IPv6
Apr 17 02:50:18.291529 kernel: NET: Registered PF_PACKET protocol family
Apr 17 02:50:18.291540 kernel: Key type dns_resolver registered
Apr 17 02:50:18.291550 kernel: IPI shorthand broadcast: enabled
Apr 17 02:50:18.291563 kernel: sched_clock: Marking stable (4395074824, 1067949899)->(5846941741, -383917018)
Apr 17 02:50:18.291574 kernel: registered taskstats version 1
Apr 17 02:50:18.291584 kernel: Loading compiled-in X.509 certificates
Apr 17 02:50:18.291594 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.81-flatcar: 92f69eed5a22c94634d5240e5e65306547d4ba83'
17 02:50:18.291604 kernel: Demotion targets for Node 0: null Apr 17 02:50:18.291614 kernel: Key type .fscrypt registered Apr 17 02:50:18.291623 kernel: Key type fscrypt-provisioning registered Apr 17 02:50:18.291633 kernel: ima: No TPM chip found, activating TPM-bypass! Apr 17 02:50:18.291644 kernel: ima: Allocated hash algorithm: sha1 Apr 17 02:50:18.291656 kernel: ima: No architecture policies found Apr 17 02:50:18.291667 kernel: clk: Disabling unused clocks Apr 17 02:50:18.291679 kernel: Warning: unable to open an initial console. Apr 17 02:50:18.291689 kernel: Freeing unused kernel image (initmem) memory: 46216K Apr 17 02:50:18.292886 kernel: Write protecting the kernel read-only data: 40960k Apr 17 02:50:18.292909 kernel: Freeing unused kernel image (rodata/data gap) memory: 548K Apr 17 02:50:18.292921 kernel: Run /init as init process Apr 17 02:50:18.292931 kernel: with arguments: Apr 17 02:50:18.292942 kernel: /init Apr 17 02:50:18.292956 kernel: with environment: Apr 17 02:50:18.292965 kernel: HOME=/ Apr 17 02:50:18.292976 kernel: TERM=linux Apr 17 02:50:18.292989 systemd[1]: Successfully made /usr/ read-only. Apr 17 02:50:18.293004 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Apr 17 02:50:18.293016 systemd[1]: Detected virtualization kvm. Apr 17 02:50:18.293027 systemd[1]: Detected architecture x86-64. Apr 17 02:50:18.293037 systemd[1]: Running in initrd. Apr 17 02:50:18.293051 systemd[1]: No hostname configured, using default hostname. Apr 17 02:50:18.293063 systemd[1]: Hostname set to . Apr 17 02:50:18.293075 systemd[1]: Initializing machine ID from VM UUID. Apr 17 02:50:18.293087 systemd[1]: Queued start job for default target initrd.target. 
Apr 17 02:50:18.293099 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Apr 17 02:50:18.293111 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Apr 17 02:50:18.293124 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Apr 17 02:50:18.293136 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Apr 17 02:50:18.293154 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Apr 17 02:50:18.293167 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Apr 17 02:50:18.293181 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Apr 17 02:50:18.293193 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Apr 17 02:50:18.293206 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Apr 17 02:50:18.293216 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Apr 17 02:50:18.293230 systemd[1]: Reached target paths.target - Path Units. Apr 17 02:50:18.293242 systemd[1]: Reached target slices.target - Slice Units. Apr 17 02:50:18.293253 systemd[1]: Reached target swap.target - Swaps. Apr 17 02:50:18.293265 systemd[1]: Reached target timers.target - Timer Units. Apr 17 02:50:18.293277 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Apr 17 02:50:18.293288 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Apr 17 02:50:18.293300 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Apr 17 02:50:18.293312 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. 
Apr 17 02:50:18.293324 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Apr 17 02:50:18.293337 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Apr 17 02:50:18.293349 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Apr 17 02:50:18.293361 systemd[1]: Reached target sockets.target - Socket Units. Apr 17 02:50:18.293373 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Apr 17 02:50:18.293385 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Apr 17 02:50:18.293397 systemd[1]: Finished network-cleanup.service - Network Cleanup. Apr 17 02:50:18.293409 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). Apr 17 02:50:18.293421 systemd[1]: Starting systemd-fsck-usr.service... Apr 17 02:50:18.293435 systemd[1]: Starting systemd-journald.service - Journal Service... Apr 17 02:50:18.293446 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Apr 17 02:50:18.293458 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Apr 17 02:50:18.293470 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Apr 17 02:50:18.293482 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Apr 17 02:50:18.293496 systemd[1]: Finished systemd-fsck-usr.service. Apr 17 02:50:18.293534 systemd-journald[201]: Collecting audit messages is disabled. Apr 17 02:50:18.293563 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Apr 17 02:50:18.293576 systemd-journald[201]: Journal started Apr 17 02:50:18.293608 systemd-journald[201]: Runtime Journal (/run/log/journal/9b38c2ade916417f9dff4a1fd2635677) is 6M, max 48.1M, 42.1M free. 
Apr 17 02:50:18.276032 systemd-modules-load[204]: Inserted module 'overlay' Apr 17 02:50:18.299283 systemd[1]: Started systemd-journald.service - Journal Service. Apr 17 02:50:18.300104 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Apr 17 02:50:18.311593 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Apr 17 02:50:18.347743 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Apr 17 02:50:18.356956 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Apr 17 02:50:18.364387 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Apr 17 02:50:18.364665 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Apr 17 02:50:18.365613 systemd-modules-load[204]: Inserted module 'br_netfilter' Apr 17 02:50:18.366419 kernel: Bridge firewalling registered Apr 17 02:50:18.373204 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Apr 17 02:50:18.376749 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Apr 17 02:50:18.387129 systemd-tmpfiles[223]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. Apr 17 02:50:18.391432 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Apr 17 02:50:18.393136 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Apr 17 02:50:18.395482 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Apr 17 02:50:18.401242 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Apr 17 02:50:18.404471 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. 
Apr 17 02:50:18.412299 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Apr 17 02:50:18.523284 dracut-cmdline[241]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected verity.usrhash=f73cf1d40ab12c6181d739932b2133dbe986804f7665fccb580a411e6eed38d9 Apr 17 02:50:18.533628 systemd-resolved[243]: Positive Trust Anchors: Apr 17 02:50:18.533638 systemd-resolved[243]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Apr 17 02:50:18.533670 systemd-resolved[243]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Apr 17 02:50:18.536449 systemd-resolved[243]: Defaulting to hostname 'linux'. Apr 17 02:50:18.537329 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Apr 17 02:50:18.540453 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Apr 17 02:50:18.775947 kernel: SCSI subsystem initialized Apr 17 02:50:18.788936 kernel: Loading iSCSI transport class v2.0-870. 
Apr 17 02:50:18.838590 kernel: iscsi: registered transport (tcp) Apr 17 02:50:18.918502 kernel: iscsi: registered transport (qla4xxx) Apr 17 02:50:18.918596 kernel: QLogic iSCSI HBA Driver Apr 17 02:50:18.960155 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Apr 17 02:50:18.990811 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Apr 17 02:50:19.001562 systemd[1]: Reached target network-pre.target - Preparation for Network. Apr 17 02:50:19.168886 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Apr 17 02:50:19.174604 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Apr 17 02:50:19.348570 kernel: raid6: avx512x4 gen() 35216 MB/s Apr 17 02:50:19.365019 kernel: raid6: avx512x2 gen() 35715 MB/s Apr 17 02:50:19.384029 kernel: raid6: avx512x1 gen() 30080 MB/s Apr 17 02:50:19.404543 kernel: raid6: avx2x4 gen() 29277 MB/s Apr 17 02:50:19.480800 kernel: raid6: avx2x2 gen() 18969 MB/s Apr 17 02:50:19.485625 kernel: raid6: avx2x1 gen() 48 MB/s Apr 17 02:50:19.485710 kernel: raid6: using algorithm avx512x2 gen() 35715 MB/s Apr 17 02:50:19.504657 kernel: raid6: .... xor() 18807 MB/s, rmw enabled Apr 17 02:50:19.504802 kernel: raid6: using avx512x2 recovery algorithm Apr 17 02:50:19.536905 kernel: xor: automatically using best checksumming function avx Apr 17 02:50:19.804920 kernel: Btrfs loaded, zoned=no, fsverity=no Apr 17 02:50:19.869319 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Apr 17 02:50:19.874174 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Apr 17 02:50:19.914764 systemd-udevd[453]: Using default interface naming scheme 'v255'. Apr 17 02:50:19.943477 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Apr 17 02:50:19.951210 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... 
Apr 17 02:50:19.984309 dracut-pre-trigger[462]: rd.md=0: removing MD RAID activation Apr 17 02:50:20.034452 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Apr 17 02:50:20.036239 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Apr 17 02:50:20.089743 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Apr 17 02:50:20.092250 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Apr 17 02:50:20.183123 kernel: virtio_blk virtio1: 4/0/0 default/read/poll queues Apr 17 02:50:20.187812 kernel: virtio_blk virtio1: [vda] 19775488 512-byte logical blocks (10.1 GB/9.43 GiB) Apr 17 02:50:20.189851 kernel: cryptd: max_cpu_qlen set to 1000 Apr 17 02:50:20.195210 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Apr 17 02:50:20.195250 kernel: GPT:9289727 != 19775487 Apr 17 02:50:20.195260 kernel: GPT:Alternate GPT header not at the end of the disk. Apr 17 02:50:20.197844 kernel: GPT:9289727 != 19775487 Apr 17 02:50:20.197890 kernel: GPT: Use GNU Parted to correct GPT errors. Apr 17 02:50:20.200911 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Apr 17 02:50:20.224743 kernel: libata version 3.00 loaded. Apr 17 02:50:20.231597 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Apr 17 02:50:20.231726 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Apr 17 02:50:20.252220 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Apr 17 02:50:20.260311 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... 
Apr 17 02:50:20.268930 kernel: ahci 0000:00:1f.2: version 3.0 Apr 17 02:50:20.269360 kernel: ACPI: \_SB_.GSIA: Enabled at IRQ 16 Apr 17 02:50:20.271257 kernel: AES CTR mode by8 optimization enabled Apr 17 02:50:20.271295 kernel: ahci 0000:00:1f.2: AHCI vers 0001.0000, 32 command slots, 1.5 Gbps, SATA mode Apr 17 02:50:20.271445 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input2 Apr 17 02:50:20.269400 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Apr 17 02:50:20.282448 kernel: ahci 0000:00:1f.2: 6/6 ports implemented (port mask 0x3f) Apr 17 02:50:20.282611 kernel: ahci 0000:00:1f.2: flags: 64bit ncq only Apr 17 02:50:20.282900 kernel: scsi host0: ahci Apr 17 02:50:20.282193 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Apr 17 02:50:20.291103 kernel: scsi host1: ahci Apr 17 02:50:20.291260 kernel: scsi host2: ahci Apr 17 02:50:20.291367 kernel: scsi host3: ahci Apr 17 02:50:20.282303 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Apr 17 02:50:20.299902 kernel: scsi host4: ahci Apr 17 02:50:20.300078 kernel: scsi host5: ahci Apr 17 02:50:20.300179 kernel: ata1: SATA max UDMA/133 abar m4096@0xc1040000 port 0xc1040100 irq 34 lpm-pol 1 Apr 17 02:50:20.291647 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... 
Apr 17 02:50:20.312446 kernel: ata2: SATA max UDMA/133 abar m4096@0xc1040000 port 0xc1040180 irq 34 lpm-pol 1 Apr 17 02:50:20.312479 kernel: ata3: SATA max UDMA/133 abar m4096@0xc1040000 port 0xc1040200 irq 34 lpm-pol 1 Apr 17 02:50:20.312487 kernel: ata4: SATA max UDMA/133 abar m4096@0xc1040000 port 0xc1040280 irq 34 lpm-pol 1 Apr 17 02:50:20.312494 kernel: ata5: SATA max UDMA/133 abar m4096@0xc1040000 port 0xc1040300 irq 34 lpm-pol 1 Apr 17 02:50:20.312501 kernel: ata6: SATA max UDMA/133 abar m4096@0xc1040000 port 0xc1040380 irq 34 lpm-pol 1 Apr 17 02:50:20.385933 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Apr 17 02:50:20.399382 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT. Apr 17 02:50:20.413077 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Apr 17 02:50:20.421561 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM. Apr 17 02:50:20.441743 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A. Apr 17 02:50:20.449984 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132. Apr 17 02:50:20.466489 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Apr 17 02:50:20.501178 disk-uuid[647]: Primary Header is updated. Apr 17 02:50:20.501178 disk-uuid[647]: Secondary Entries is updated. Apr 17 02:50:20.501178 disk-uuid[647]: Secondary Header is updated. 
Apr 17 02:50:20.508995 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Apr 17 02:50:20.622207 kernel: ata1: SATA link down (SStatus 0 SControl 300) Apr 17 02:50:20.625864 kernel: ata2: SATA link down (SStatus 0 SControl 300) Apr 17 02:50:20.679221 kernel: ata3: SATA link up 1.5 Gbps (SStatus 113 SControl 300) Apr 17 02:50:20.681893 kernel: ata6: SATA link down (SStatus 0 SControl 300) Apr 17 02:50:20.683926 kernel: ata5: SATA link down (SStatus 0 SControl 300) Apr 17 02:50:20.685906 kernel: ata4: SATA link down (SStatus 0 SControl 300) Apr 17 02:50:20.685957 kernel: ata3.00: LPM support broken, forcing max_power Apr 17 02:50:20.688840 kernel: ata3.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100 Apr 17 02:50:20.688875 kernel: ata3.00: applying bridge limits Apr 17 02:50:20.692115 kernel: ata3.00: LPM support broken, forcing max_power Apr 17 02:50:20.692160 kernel: ata3.00: configured for UDMA/100 Apr 17 02:50:20.693892 kernel: scsi 2:0:0:0: CD-ROM QEMU QEMU DVD-ROM 2.5+ PQ: 0 ANSI: 5 Apr 17 02:50:20.797688 kernel: sr 2:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray Apr 17 02:50:20.798242 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20 Apr 17 02:50:20.813021 kernel: sr 2:0:0:0: Attached scsi CD-ROM sr0 Apr 17 02:50:21.202929 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Apr 17 02:50:21.209338 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Apr 17 02:50:21.217520 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Apr 17 02:50:21.221083 systemd[1]: Reached target remote-fs.target - Remote File Systems. Apr 17 02:50:21.237399 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Apr 17 02:50:21.283658 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Apr 17 02:50:21.556870 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Apr 17 02:50:21.557164 disk-uuid[648]: The operation has completed successfully. 
Apr 17 02:50:21.609122 systemd[1]: disk-uuid.service: Deactivated successfully. Apr 17 02:50:21.610134 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Apr 17 02:50:21.680850 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Apr 17 02:50:21.700971 sh[677]: Success Apr 17 02:50:21.761631 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Apr 17 02:50:21.761714 kernel: device-mapper: uevent: version 1.0.3 Apr 17 02:50:21.763647 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Apr 17 02:50:21.776844 kernel: device-mapper: verity: sha256 using shash "sha256-ni" Apr 17 02:50:21.819474 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Apr 17 02:50:21.837312 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Apr 17 02:50:21.842608 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. Apr 17 02:50:21.861775 kernel: BTRFS: device fsid d1542dca-1171-4bcf-9aae-d85dd05fe503 devid 1 transid 32 /dev/mapper/usr (253:0) scanned by mount (689) Apr 17 02:50:21.861872 kernel: BTRFS info (device dm-0): first mount of filesystem d1542dca-1171-4bcf-9aae-d85dd05fe503 Apr 17 02:50:21.861886 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Apr 17 02:50:21.902851 kernel: BTRFS info (device dm-0 state E): disabling log replay at mount time Apr 17 02:50:21.902967 kernel: BTRFS info (device dm-0 state E): enabling free space tree Apr 17 02:50:21.906599 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Apr 17 02:50:21.907524 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Apr 17 02:50:21.947169 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. 
Apr 17 02:50:21.948900 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Apr 17 02:50:21.953596 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Apr 17 02:50:22.000907 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (722) Apr 17 02:50:22.009031 kernel: BTRFS info (device vda6): first mount of filesystem aa52e89c-0ed3-4175-9a87-dc7b421a671a Apr 17 02:50:22.009106 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Apr 17 02:50:22.019429 kernel: BTRFS info (device vda6): turning on async discard Apr 17 02:50:22.019500 kernel: BTRFS info (device vda6): enabling free space tree Apr 17 02:50:22.031885 kernel: BTRFS info (device vda6): last unmount of filesystem aa52e89c-0ed3-4175-9a87-dc7b421a671a Apr 17 02:50:22.035998 systemd[1]: Finished ignition-setup.service - Ignition (setup). Apr 17 02:50:22.040669 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Apr 17 02:50:22.206429 ignition[775]: Ignition 2.22.0 Apr 17 02:50:22.206442 ignition[775]: Stage: fetch-offline Apr 17 02:50:22.206477 ignition[775]: no configs at "/usr/lib/ignition/base.d" Apr 17 02:50:22.209541 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Apr 17 02:50:22.206485 ignition[775]: no config dir at "/usr/lib/ignition/base.platform.d/qemu" Apr 17 02:50:22.206592 ignition[775]: parsed url from cmdline: "" Apr 17 02:50:22.206595 ignition[775]: no config URL provided Apr 17 02:50:22.236256 systemd[1]: Starting systemd-networkd.service - Network Configuration... 
Apr 17 02:50:22.206600 ignition[775]: reading system config file "/usr/lib/ignition/user.ign" Apr 17 02:50:22.206606 ignition[775]: no config at "/usr/lib/ignition/user.ign" Apr 17 02:50:22.206629 ignition[775]: op(1): [started] loading QEMU firmware config module Apr 17 02:50:22.206633 ignition[775]: op(1): executing: "modprobe" "qemu_fw_cfg" Apr 17 02:50:22.234013 ignition[775]: op(1): [finished] loading QEMU firmware config module Apr 17 02:50:22.297665 systemd-networkd[866]: lo: Link UP Apr 17 02:50:22.298421 systemd-networkd[866]: lo: Gained carrier Apr 17 02:50:22.299937 systemd-networkd[866]: Enumeration completed Apr 17 02:50:22.300294 systemd[1]: Started systemd-networkd.service - Network Configuration. Apr 17 02:50:22.300318 systemd-networkd[866]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Apr 17 02:50:22.300322 systemd-networkd[866]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Apr 17 02:50:22.300671 systemd-networkd[866]: eth0: Link UP Apr 17 02:50:22.301504 systemd-networkd[866]: eth0: Gained carrier Apr 17 02:50:22.301512 systemd-networkd[866]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Apr 17 02:50:22.304282 systemd[1]: Reached target network.target - Network. 
Apr 17 02:50:22.358439 systemd-networkd[866]: eth0: DHCPv4 address 10.0.0.21/16, gateway 10.0.0.1 acquired from 10.0.0.1 Apr 17 02:50:22.455116 ignition[775]: parsing config with SHA512: 9d44bb665ec6f43cbc61e3aa11db277620298461eeba5a96854a073c4c6ef81a5492dbc6c5205a88733944896835a5572004244907fadcfecc3ba85fa4d54464 Apr 17 02:50:22.482617 unknown[775]: fetched base config from "system" Apr 17 02:50:22.483443 unknown[775]: fetched user config from "qemu" Apr 17 02:50:22.484526 ignition[775]: fetch-offline: fetch-offline passed Apr 17 02:50:22.484607 ignition[775]: Ignition finished successfully Apr 17 02:50:22.493725 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Apr 17 02:50:22.494397 systemd[1]: ignition-fetch.service - Ignition (fetch) was skipped because of an unmet condition check (ConditionPathExists=!/run/ignition.json). Apr 17 02:50:22.496132 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Apr 17 02:50:22.595207 ignition[871]: Ignition 2.22.0 Apr 17 02:50:22.595231 ignition[871]: Stage: kargs Apr 17 02:50:22.595352 ignition[871]: no configs at "/usr/lib/ignition/base.d" Apr 17 02:50:22.595358 ignition[871]: no config dir at "/usr/lib/ignition/base.platform.d/qemu" Apr 17 02:50:22.596539 ignition[871]: kargs: kargs passed Apr 17 02:50:22.596580 ignition[871]: Ignition finished successfully Apr 17 02:50:22.605314 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Apr 17 02:50:22.611078 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Apr 17 02:50:22.676050 ignition[879]: Ignition 2.22.0 Apr 17 02:50:22.676436 ignition[879]: Stage: disks Apr 17 02:50:22.678081 ignition[879]: no configs at "/usr/lib/ignition/base.d" Apr 17 02:50:22.678093 ignition[879]: no config dir at "/usr/lib/ignition/base.platform.d/qemu" Apr 17 02:50:22.682356 systemd[1]: Finished ignition-disks.service - Ignition (disks). 
Apr 17 02:50:22.679184 ignition[879]: disks: disks passed Apr 17 02:50:22.683650 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Apr 17 02:50:22.679253 ignition[879]: Ignition finished successfully Apr 17 02:50:22.691082 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Apr 17 02:50:22.691446 systemd[1]: Reached target local-fs.target - Local File Systems. Apr 17 02:50:22.697573 systemd[1]: Reached target sysinit.target - System Initialization. Apr 17 02:50:22.701633 systemd[1]: Reached target basic.target - Basic System. Apr 17 02:50:22.705905 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Apr 17 02:50:22.770737 systemd-fsck[889]: ROOT: clean, 15/553520 files, 52789/553472 blocks Apr 17 02:50:22.778459 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Apr 17 02:50:22.788940 systemd[1]: Mounting sysroot.mount - /sysroot... Apr 17 02:50:22.983875 kernel: EXT4-fs (vda9): mounted filesystem ee420a69-62b9-42f4-84c7-ea3f2d87c569 r/w with ordered data mode. Quota mode: none. Apr 17 02:50:22.984717 systemd[1]: Mounted sysroot.mount - /sysroot. Apr 17 02:50:22.985330 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Apr 17 02:50:22.990736 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Apr 17 02:50:23.014240 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Apr 17 02:50:23.025963 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (897) Apr 17 02:50:23.016906 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. 
Apr 17 02:50:23.033484 kernel: BTRFS info (device vda6): first mount of filesystem aa52e89c-0ed3-4175-9a87-dc7b421a671a Apr 17 02:50:23.033515 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Apr 17 02:50:23.016970 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Apr 17 02:50:23.017000 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Apr 17 02:50:23.046614 kernel: BTRFS info (device vda6): turning on async discard Apr 17 02:50:23.046642 kernel: BTRFS info (device vda6): enabling free space tree Apr 17 02:50:23.028082 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Apr 17 02:50:23.038175 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Apr 17 02:50:23.064033 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Apr 17 02:50:23.156309 initrd-setup-root[921]: cut: /sysroot/etc/passwd: No such file or directory Apr 17 02:50:23.164899 initrd-setup-root[928]: cut: /sysroot/etc/group: No such file or directory Apr 17 02:50:23.168076 initrd-setup-root[935]: cut: /sysroot/etc/shadow: No such file or directory Apr 17 02:50:23.173763 initrd-setup-root[942]: cut: /sysroot/etc/gshadow: No such file or directory Apr 17 02:50:23.384371 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Apr 17 02:50:23.391440 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Apr 17 02:50:23.406623 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Apr 17 02:50:23.452304 systemd[1]: sysroot-oem.mount: Deactivated successfully. Apr 17 02:50:23.455046 kernel: BTRFS info (device vda6): last unmount of filesystem aa52e89c-0ed3-4175-9a87-dc7b421a671a Apr 17 02:50:23.484142 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. 
Apr 17 02:50:23.497766 ignition[1010]: INFO : Ignition 2.22.0 Apr 17 02:50:23.497766 ignition[1010]: INFO : Stage: mount Apr 17 02:50:23.501704 ignition[1010]: INFO : no configs at "/usr/lib/ignition/base.d" Apr 17 02:50:23.501704 ignition[1010]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu" Apr 17 02:50:23.508915 ignition[1010]: INFO : mount: mount passed Apr 17 02:50:23.508915 ignition[1010]: INFO : Ignition finished successfully Apr 17 02:50:23.513779 systemd[1]: Finished ignition-mount.service - Ignition (mount). Apr 17 02:50:23.556449 systemd[1]: Starting ignition-files.service - Ignition (files)... Apr 17 02:50:23.987990 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Apr 17 02:50:24.008875 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (1024) Apr 17 02:50:24.008923 kernel: BTRFS info (device vda6): first mount of filesystem aa52e89c-0ed3-4175-9a87-dc7b421a671a Apr 17 02:50:24.013207 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Apr 17 02:50:24.022307 kernel: BTRFS info (device vda6): turning on async discard Apr 17 02:50:24.022519 kernel: BTRFS info (device vda6): enabling free space tree Apr 17 02:50:24.025114 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
Apr 17 02:50:24.063876 ignition[1041]: INFO : Ignition 2.22.0 Apr 17 02:50:24.063876 ignition[1041]: INFO : Stage: files Apr 17 02:50:24.068865 ignition[1041]: INFO : no configs at "/usr/lib/ignition/base.d" Apr 17 02:50:24.068865 ignition[1041]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu" Apr 17 02:50:24.068865 ignition[1041]: DEBUG : files: compiled without relabeling support, skipping Apr 17 02:50:24.068865 ignition[1041]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Apr 17 02:50:24.068865 ignition[1041]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Apr 17 02:50:24.090896 ignition[1041]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Apr 17 02:50:24.090896 ignition[1041]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Apr 17 02:50:24.090896 ignition[1041]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Apr 17 02:50:24.090896 ignition[1041]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.3-linux-amd64.tar.gz" Apr 17 02:50:24.090896 ignition[1041]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.3-linux-amd64.tar.gz: attempt #1 Apr 17 02:50:24.083484 unknown[1041]: wrote ssh authorized keys file for user: core Apr 17 02:50:24.167312 systemd-networkd[866]: eth0: Gained IPv6LL Apr 17 02:50:24.247124 ignition[1041]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Apr 17 02:50:24.462863 ignition[1041]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.3-linux-amd64.tar.gz" Apr 17 02:50:24.468313 ignition[1041]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Apr 17 02:50:24.468313 ignition[1041]: INFO : files: createFilesystemsFiles: 
createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Apr 17 02:50:24.468313 ignition[1041]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Apr 17 02:50:24.468313 ignition[1041]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Apr 17 02:50:24.468313 ignition[1041]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Apr 17 02:50:24.468313 ignition[1041]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Apr 17 02:50:24.468313 ignition[1041]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Apr 17 02:50:24.468313 ignition[1041]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Apr 17 02:50:24.468313 ignition[1041]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Apr 17 02:50:24.468313 ignition[1041]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Apr 17 02:50:24.468313 ignition[1041]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.34.4-x86-64.raw" Apr 17 02:50:24.468313 ignition[1041]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.34.4-x86-64.raw" Apr 17 02:50:24.468313 ignition[1041]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.34.4-x86-64.raw" Apr 17 02:50:24.468313 ignition[1041]: INFO : files: 
createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.34.4-x86-64.raw: attempt #1 Apr 17 02:50:24.698887 ignition[1041]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Apr 17 02:50:25.238788 ignition[1041]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.34.4-x86-64.raw" Apr 17 02:50:25.238788 ignition[1041]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Apr 17 02:50:25.247869 ignition[1041]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Apr 17 02:50:25.247869 ignition[1041]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Apr 17 02:50:25.247869 ignition[1041]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Apr 17 02:50:25.247869 ignition[1041]: INFO : files: op(d): [started] processing unit "coreos-metadata.service" Apr 17 02:50:25.247869 ignition[1041]: INFO : files: op(d): op(e): [started] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service" Apr 17 02:50:25.247869 ignition[1041]: INFO : files: op(d): op(e): [finished] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service" Apr 17 02:50:25.247869 ignition[1041]: INFO : files: op(d): [finished] processing unit "coreos-metadata.service" Apr 17 02:50:25.247869 ignition[1041]: INFO : files: op(f): [started] setting preset to disabled for "coreos-metadata.service" Apr 17 02:50:25.352753 ignition[1041]: INFO : files: op(f): op(10): [started] removing enablement symlink(s) for "coreos-metadata.service" Apr 17 02:50:25.358025 ignition[1041]: INFO : files: op(f): op(10): [finished] removing enablement symlink(s) for "coreos-metadata.service" Apr 17 02:50:25.360742 
ignition[1041]: INFO : files: op(f): [finished] setting preset to disabled for "coreos-metadata.service" Apr 17 02:50:25.360742 ignition[1041]: INFO : files: op(11): [started] setting preset to enabled for "prepare-helm.service" Apr 17 02:50:25.360742 ignition[1041]: INFO : files: op(11): [finished] setting preset to enabled for "prepare-helm.service" Apr 17 02:50:25.360742 ignition[1041]: INFO : files: createResultFile: createFiles: op(12): [started] writing file "/sysroot/etc/.ignition-result.json" Apr 17 02:50:25.360742 ignition[1041]: INFO : files: createResultFile: createFiles: op(12): [finished] writing file "/sysroot/etc/.ignition-result.json" Apr 17 02:50:25.360742 ignition[1041]: INFO : files: files passed Apr 17 02:50:25.360742 ignition[1041]: INFO : Ignition finished successfully Apr 17 02:50:25.380156 systemd[1]: Finished ignition-files.service - Ignition (files). Apr 17 02:50:25.386759 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Apr 17 02:50:25.393595 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Apr 17 02:50:25.440985 systemd[1]: ignition-quench.service: Deactivated successfully. Apr 17 02:50:25.441120 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Apr 17 02:50:25.453915 initrd-setup-root-after-ignition[1070]: grep: /sysroot/oem/oem-release: No such file or directory Apr 17 02:50:25.457101 initrd-setup-root-after-ignition[1072]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Apr 17 02:50:25.457101 initrd-setup-root-after-ignition[1072]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Apr 17 02:50:25.466681 initrd-setup-root-after-ignition[1076]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Apr 17 02:50:25.457202 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. 
Apr 17 02:50:25.475272 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Apr 17 02:50:25.480448 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Apr 17 02:50:25.600055 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Apr 17 02:50:25.600349 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Apr 17 02:50:25.604559 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Apr 17 02:50:25.649404 systemd[1]: Reached target initrd.target - Initrd Default Target. Apr 17 02:50:25.652149 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Apr 17 02:50:25.653420 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Apr 17 02:50:25.688572 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Apr 17 02:50:25.690847 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Apr 17 02:50:25.711565 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Apr 17 02:50:25.744348 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Apr 17 02:50:25.747539 systemd[1]: Stopped target timers.target - Timer Units. Apr 17 02:50:25.755023 kernel: hrtimer: interrupt took 11821060 ns Apr 17 02:50:25.753366 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Apr 17 02:50:25.753628 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Apr 17 02:50:25.765554 systemd[1]: Stopped target initrd.target - Initrd Default Target. Apr 17 02:50:25.768503 systemd[1]: Stopped target basic.target - Basic System. Apr 17 02:50:25.770751 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Apr 17 02:50:25.784335 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. 
Apr 17 02:50:25.787177 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Apr 17 02:50:25.789999 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. Apr 17 02:50:25.794462 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Apr 17 02:50:25.802682 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Apr 17 02:50:25.808612 systemd[1]: Stopped target sysinit.target - System Initialization. Apr 17 02:50:25.848991 systemd[1]: Stopped target local-fs.target - Local File Systems. Apr 17 02:50:25.856178 systemd[1]: Stopped target swap.target - Swaps. Apr 17 02:50:25.858709 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Apr 17 02:50:25.858959 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Apr 17 02:50:25.865294 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Apr 17 02:50:25.870095 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Apr 17 02:50:25.873656 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Apr 17 02:50:25.873794 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Apr 17 02:50:25.880279 systemd[1]: dracut-initqueue.service: Deactivated successfully. Apr 17 02:50:25.880425 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Apr 17 02:50:25.891674 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Apr 17 02:50:25.892140 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Apr 17 02:50:25.898249 systemd[1]: Stopped target paths.target - Path Units. Apr 17 02:50:25.900243 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Apr 17 02:50:25.904117 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Apr 17 02:50:25.905150 systemd[1]: Stopped target slices.target - Slice Units. 
Apr 17 02:50:25.909572 systemd[1]: Stopped target sockets.target - Socket Units. Apr 17 02:50:25.945556 systemd[1]: iscsid.socket: Deactivated successfully. Apr 17 02:50:25.945666 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Apr 17 02:50:25.952993 systemd[1]: iscsiuio.socket: Deactivated successfully. Apr 17 02:50:25.953091 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Apr 17 02:50:25.956592 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Apr 17 02:50:25.956761 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Apr 17 02:50:25.969240 systemd[1]: ignition-files.service: Deactivated successfully. Apr 17 02:50:25.969381 systemd[1]: Stopped ignition-files.service - Ignition (files). Apr 17 02:50:25.973975 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Apr 17 02:50:25.979965 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Apr 17 02:50:25.985520 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Apr 17 02:50:25.987899 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Apr 17 02:50:25.990604 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Apr 17 02:50:25.990703 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Apr 17 02:50:26.008240 systemd[1]: initrd-cleanup.service: Deactivated successfully. Apr 17 02:50:26.008331 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Apr 17 02:50:26.014883 systemd[1]: sysroot-boot.mount: Deactivated successfully. 
Apr 17 02:50:26.054434 ignition[1097]: INFO : Ignition 2.22.0 Apr 17 02:50:26.054434 ignition[1097]: INFO : Stage: umount Apr 17 02:50:26.058230 ignition[1097]: INFO : no configs at "/usr/lib/ignition/base.d" Apr 17 02:50:26.058230 ignition[1097]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu" Apr 17 02:50:26.058230 ignition[1097]: INFO : umount: umount passed Apr 17 02:50:26.058230 ignition[1097]: INFO : Ignition finished successfully Apr 17 02:50:26.068172 systemd[1]: ignition-mount.service: Deactivated successfully. Apr 17 02:50:26.068270 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Apr 17 02:50:26.074640 systemd[1]: Stopped target network.target - Network. Apr 17 02:50:26.078500 systemd[1]: ignition-disks.service: Deactivated successfully. Apr 17 02:50:26.078572 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Apr 17 02:50:26.086536 systemd[1]: ignition-kargs.service: Deactivated successfully. Apr 17 02:50:26.086641 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Apr 17 02:50:26.091514 systemd[1]: ignition-setup.service: Deactivated successfully. Apr 17 02:50:26.091589 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Apr 17 02:50:26.094189 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Apr 17 02:50:26.094253 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Apr 17 02:50:26.101115 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Apr 17 02:50:26.102795 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Apr 17 02:50:26.112068 systemd[1]: systemd-resolved.service: Deactivated successfully. Apr 17 02:50:26.112187 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Apr 17 02:50:26.158983 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully. Apr 17 02:50:26.159310 systemd[1]: systemd-networkd.service: Deactivated successfully. 
Apr 17 02:50:26.159434 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Apr 17 02:50:26.167470 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully. Apr 17 02:50:26.167731 systemd[1]: sysroot-boot.service: Deactivated successfully. Apr 17 02:50:26.168000 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Apr 17 02:50:26.174300 systemd[1]: Stopped target network-pre.target - Preparation for Network. Apr 17 02:50:26.177150 systemd[1]: systemd-networkd.socket: Deactivated successfully. Apr 17 02:50:26.177195 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Apr 17 02:50:26.182374 systemd[1]: initrd-setup-root.service: Deactivated successfully. Apr 17 02:50:26.182438 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Apr 17 02:50:26.200982 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Apr 17 02:50:26.206248 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Apr 17 02:50:26.206355 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Apr 17 02:50:26.209337 systemd[1]: systemd-sysctl.service: Deactivated successfully. Apr 17 02:50:26.209393 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Apr 17 02:50:26.214354 systemd[1]: systemd-modules-load.service: Deactivated successfully. Apr 17 02:50:26.214415 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Apr 17 02:50:26.215911 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Apr 17 02:50:26.215958 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Apr 17 02:50:26.224706 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Apr 17 02:50:26.229709 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. 
Apr 17 02:50:26.229791 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully. Apr 17 02:50:26.250802 systemd[1]: network-cleanup.service: Deactivated successfully. Apr 17 02:50:26.251126 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Apr 17 02:50:26.257286 systemd[1]: systemd-udevd.service: Deactivated successfully. Apr 17 02:50:26.257465 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Apr 17 02:50:26.277957 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Apr 17 02:50:26.278091 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Apr 17 02:50:26.280272 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Apr 17 02:50:26.280316 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Apr 17 02:50:26.288182 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Apr 17 02:50:26.288253 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Apr 17 02:50:26.298251 systemd[1]: dracut-cmdline.service: Deactivated successfully. Apr 17 02:50:26.298321 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Apr 17 02:50:26.299297 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Apr 17 02:50:26.299360 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Apr 17 02:50:26.314604 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Apr 17 02:50:26.341407 systemd[1]: systemd-network-generator.service: Deactivated successfully. Apr 17 02:50:26.341496 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line. Apr 17 02:50:26.348564 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Apr 17 02:50:26.348636 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. 
Apr 17 02:50:26.356401 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully. Apr 17 02:50:26.356471 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Apr 17 02:50:26.364280 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Apr 17 02:50:26.364344 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Apr 17 02:50:26.369274 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Apr 17 02:50:26.369341 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Apr 17 02:50:26.378671 systemd[1]: run-credentials-systemd\x2dnetwork\x2dgenerator.service.mount: Deactivated successfully. Apr 17 02:50:26.378746 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev\x2dearly.service.mount: Deactivated successfully. Apr 17 02:50:26.378779 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully. Apr 17 02:50:26.378899 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Apr 17 02:50:26.379402 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Apr 17 02:50:26.379715 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Apr 17 02:50:26.386108 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Apr 17 02:50:26.400494 systemd[1]: Starting initrd-switch-root.service - Switch Root... Apr 17 02:50:26.461588 systemd[1]: Switching root. Apr 17 02:50:26.496757 systemd-journald[201]: Journal stopped Apr 17 02:50:28.289443 systemd-journald[201]: Received SIGTERM from PID 1 (systemd). 
Apr 17 02:50:28.289515 kernel: SELinux: policy capability network_peer_controls=1 Apr 17 02:50:28.289533 kernel: SELinux: policy capability open_perms=1 Apr 17 02:50:28.289546 kernel: SELinux: policy capability extended_socket_class=1 Apr 17 02:50:28.289558 kernel: SELinux: policy capability always_check_network=0 Apr 17 02:50:28.289571 kernel: SELinux: policy capability cgroup_seclabel=1 Apr 17 02:50:28.289588 kernel: SELinux: policy capability nnp_nosuid_transition=1 Apr 17 02:50:28.289603 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Apr 17 02:50:28.289616 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Apr 17 02:50:28.289632 kernel: SELinux: policy capability userspace_initial_context=0 Apr 17 02:50:28.289645 kernel: audit: type=1403 audit(1776394226.777:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 Apr 17 02:50:28.289660 systemd[1]: Successfully loaded SELinux policy in 90.816ms. Apr 17 02:50:28.289683 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 11.136ms. Apr 17 02:50:28.289696 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Apr 17 02:50:28.289709 systemd[1]: Detected virtualization kvm. Apr 17 02:50:28.289722 systemd[1]: Detected architecture x86-64. Apr 17 02:50:28.289735 systemd[1]: Detected first boot. Apr 17 02:50:28.289749 systemd[1]: Initializing machine ID from VM UUID. Apr 17 02:50:28.289763 zram_generator::config[1143]: No configuration found. 
Apr 17 02:50:28.289777 kernel: Guest personality initialized and is inactive Apr 17 02:50:28.289789 kernel: VMCI host device registered (name=vmci, major=10, minor=258) Apr 17 02:50:28.289802 kernel: Initialized host personality Apr 17 02:50:28.289965 kernel: NET: Registered PF_VSOCK protocol family Apr 17 02:50:28.289991 systemd[1]: Populated /etc with preset unit settings. Apr 17 02:50:28.290007 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully. Apr 17 02:50:28.290019 systemd[1]: initrd-switch-root.service: Deactivated successfully. Apr 17 02:50:28.290031 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Apr 17 02:50:28.290045 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Apr 17 02:50:28.290059 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Apr 17 02:50:28.290073 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Apr 17 02:50:28.290085 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Apr 17 02:50:28.290098 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Apr 17 02:50:28.290113 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Apr 17 02:50:28.290128 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Apr 17 02:50:28.290141 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Apr 17 02:50:28.290157 systemd[1]: Created slice user.slice - User and Session Slice. Apr 17 02:50:28.290170 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Apr 17 02:50:28.290183 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Apr 17 02:50:28.290195 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. 
Apr 17 02:50:28.290208 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Apr 17 02:50:28.290222 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Apr 17 02:50:28.290239 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Apr 17 02:50:28.290252 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0... Apr 17 02:50:28.290265 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Apr 17 02:50:28.290278 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Apr 17 02:50:28.290290 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Apr 17 02:50:28.290302 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Apr 17 02:50:28.290314 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Apr 17 02:50:28.290328 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Apr 17 02:50:28.290342 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Apr 17 02:50:28.290355 systemd[1]: Reached target remote-fs.target - Remote File Systems. Apr 17 02:50:28.290369 systemd[1]: Reached target slices.target - Slice Units. Apr 17 02:50:28.290383 systemd[1]: Reached target swap.target - Swaps. Apr 17 02:50:28.290397 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Apr 17 02:50:28.290410 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Apr 17 02:50:28.290423 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. Apr 17 02:50:28.290434 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Apr 17 02:50:28.290447 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Apr 17 02:50:28.290458 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. 
Apr 17 02:50:28.290472 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Apr 17 02:50:28.290483 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Apr 17 02:50:28.290495 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Apr 17 02:50:28.290507 systemd[1]: Mounting media.mount - External Media Directory... Apr 17 02:50:28.290519 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Apr 17 02:50:28.290532 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Apr 17 02:50:28.290544 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Apr 17 02:50:28.290556 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Apr 17 02:50:28.290573 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Apr 17 02:50:28.290585 systemd[1]: Reached target machines.target - Containers. Apr 17 02:50:28.290598 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Apr 17 02:50:28.290611 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Apr 17 02:50:28.290624 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Apr 17 02:50:28.290636 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Apr 17 02:50:28.290651 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Apr 17 02:50:28.290663 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Apr 17 02:50:28.290677 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Apr 17 02:50:28.290692 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... 
Apr 17 02:50:28.290705 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Apr 17 02:50:28.290719 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Apr 17 02:50:28.290730 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Apr 17 02:50:28.290742 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Apr 17 02:50:28.290753 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Apr 17 02:50:28.290765 systemd[1]: Stopped systemd-fsck-usr.service. Apr 17 02:50:28.290779 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Apr 17 02:50:28.290793 kernel: loop: module loaded Apr 17 02:50:28.290805 systemd[1]: Starting systemd-journald.service - Journal Service... Apr 17 02:50:28.290932 kernel: fuse: init (API version 7.41) Apr 17 02:50:28.290957 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Apr 17 02:50:28.290971 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Apr 17 02:50:28.290983 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Apr 17 02:50:28.290997 kernel: ACPI: bus type drm_connector registered Apr 17 02:50:28.291009 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... Apr 17 02:50:28.291055 systemd-journald[1228]: Collecting audit messages is disabled. Apr 17 02:50:28.291085 systemd-journald[1228]: Journal started Apr 17 02:50:28.291113 systemd-journald[1228]: Runtime Journal (/run/log/journal/9b38c2ade916417f9dff4a1fd2635677) is 6M, max 48.1M, 42.1M free. Apr 17 02:50:27.681473 systemd[1]: Queued start job for default target multi-user.target. 
Apr 17 02:50:27.712571 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6.
Apr 17 02:50:27.714591 systemd[1]: systemd-journald.service: Deactivated successfully.
Apr 17 02:50:28.299546 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Apr 17 02:50:28.305878 systemd[1]: verity-setup.service: Deactivated successfully.
Apr 17 02:50:28.306034 systemd[1]: Stopped verity-setup.service.
Apr 17 02:50:28.313275 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Apr 17 02:50:28.366890 systemd[1]: Started systemd-journald.service - Journal Service.
Apr 17 02:50:28.373996 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Apr 17 02:50:28.377303 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Apr 17 02:50:28.381126 systemd[1]: Mounted media.mount - External Media Directory.
Apr 17 02:50:28.383632 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Apr 17 02:50:28.386420 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Apr 17 02:50:28.389066 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Apr 17 02:50:28.391513 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Apr 17 02:50:28.394888 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Apr 17 02:50:28.398185 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Apr 17 02:50:28.398632 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Apr 17 02:50:28.408745 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Apr 17 02:50:28.409002 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Apr 17 02:50:28.411607 systemd[1]: modprobe@drm.service: Deactivated successfully.
Apr 17 02:50:28.411871 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Apr 17 02:50:28.415594 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Apr 17 02:50:28.416794 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Apr 17 02:50:28.423095 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Apr 17 02:50:28.423474 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Apr 17 02:50:28.426464 systemd[1]: modprobe@loop.service: Deactivated successfully.
Apr 17 02:50:28.426706 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Apr 17 02:50:28.429435 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Apr 17 02:50:28.432439 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Apr 17 02:50:28.436184 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Apr 17 02:50:28.439733 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials.
Apr 17 02:50:28.443172 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Apr 17 02:50:28.456319 systemd[1]: Reached target network-pre.target - Preparation for Network.
Apr 17 02:50:28.460364 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Apr 17 02:50:28.463650 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Apr 17 02:50:28.466161 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Apr 17 02:50:28.466257 systemd[1]: Reached target local-fs.target - Local File Systems.
Apr 17 02:50:28.471187 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management.
Apr 17 02:50:28.482280 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Apr 17 02:50:28.485135 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Apr 17 02:50:28.488298 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Apr 17 02:50:28.492581 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Apr 17 02:50:28.500991 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Apr 17 02:50:28.506263 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Apr 17 02:50:28.508978 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Apr 17 02:50:28.556891 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Apr 17 02:50:28.567608 systemd-journald[1228]: Time spent on flushing to /var/log/journal/9b38c2ade916417f9dff4a1fd2635677 is 18.738ms for 1073 entries.
Apr 17 02:50:28.567608 systemd-journald[1228]: System Journal (/var/log/journal/9b38c2ade916417f9dff4a1fd2635677) is 8M, max 195.6M, 187.6M free.
Apr 17 02:50:28.621564 systemd-journald[1228]: Received client request to flush runtime journal.
Apr 17 02:50:28.621596 kernel: loop0: detected capacity change from 0 to 219192
Apr 17 02:50:28.561265 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Apr 17 02:50:28.570512 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Apr 17 02:50:28.583690 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Apr 17 02:50:28.589241 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Apr 17 02:50:28.592886 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Apr 17 02:50:28.602684 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Apr 17 02:50:28.608022 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk...
Apr 17 02:50:28.617302 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Apr 17 02:50:28.624099 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Apr 17 02:50:28.643215 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Apr 17 02:50:28.645565 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk.
Apr 17 02:50:28.653853 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
Apr 17 02:50:28.654932 systemd-tmpfiles[1264]: ACLs are not supported, ignoring.
Apr 17 02:50:28.654947 systemd-tmpfiles[1264]: ACLs are not supported, ignoring.
Apr 17 02:50:28.659864 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Apr 17 02:50:28.664624 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Apr 17 02:50:28.686093 kernel: loop1: detected capacity change from 0 to 128560
Apr 17 02:50:28.788467 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Apr 17 02:50:28.793244 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Apr 17 02:50:28.806892 kernel: loop2: detected capacity change from 0 to 110984
Apr 17 02:50:28.830654 systemd-tmpfiles[1286]: ACLs are not supported, ignoring.
Apr 17 02:50:28.830794 systemd-tmpfiles[1286]: ACLs are not supported, ignoring.
Apr 17 02:50:28.838413 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Apr 17 02:50:28.870687 kernel: loop3: detected capacity change from 0 to 219192
Apr 17 02:50:28.890510 kernel: loop4: detected capacity change from 0 to 128560
Apr 17 02:50:28.911277 kernel: loop5: detected capacity change from 0 to 110984
Apr 17 02:50:28.981542 (sd-merge)[1290]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes'.
Apr 17 02:50:28.982060 (sd-merge)[1290]: Merged extensions into '/usr'.
Apr 17 02:50:28.987472 systemd[1]: Reload requested from client PID 1263 ('systemd-sysext') (unit systemd-sysext.service)...
Apr 17 02:50:28.987505 systemd[1]: Reloading...
Apr 17 02:50:29.048888 zram_generator::config[1313]: No configuration found.
Apr 17 02:50:29.395407 ldconfig[1258]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
Apr 17 02:50:29.440635 systemd[1]: Reloading finished in 452 ms.
Apr 17 02:50:29.465170 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
Apr 17 02:50:29.468807 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Apr 17 02:50:29.487149 systemd[1]: Starting ensure-sysext.service...
Apr 17 02:50:29.491645 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Apr 17 02:50:29.543145 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Apr 17 02:50:29.548702 systemd-tmpfiles[1354]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring.
Apr 17 02:50:29.548758 systemd-tmpfiles[1354]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring.
Apr 17 02:50:29.549039 systemd-tmpfiles[1354]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Apr 17 02:50:29.549266 systemd-tmpfiles[1354]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Apr 17 02:50:29.551115 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Apr 17 02:50:29.551934 systemd-tmpfiles[1354]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Apr 17 02:50:29.552248 systemd-tmpfiles[1354]: ACLs are not supported, ignoring.
Apr 17 02:50:29.552295 systemd-tmpfiles[1354]: ACLs are not supported, ignoring.
Apr 17 02:50:29.554854 systemd[1]: Reload requested from client PID 1353 ('systemctl') (unit ensure-sysext.service)...
Apr 17 02:50:29.554868 systemd[1]: Reloading...
Apr 17 02:50:29.555695 systemd-tmpfiles[1354]: Detected autofs mount point /boot during canonicalization of boot.
Apr 17 02:50:29.555705 systemd-tmpfiles[1354]: Skipping /boot
Apr 17 02:50:29.560715 systemd-tmpfiles[1354]: Detected autofs mount point /boot during canonicalization of boot.
Apr 17 02:50:29.560741 systemd-tmpfiles[1354]: Skipping /boot
Apr 17 02:50:29.605162 systemd-udevd[1357]: Using default interface naming scheme 'v255'.
Apr 17 02:50:29.611899 zram_generator::config[1381]: No configuration found.
Apr 17 02:50:29.873420 kernel: mousedev: PS/2 mouse device common for all mice
Apr 17 02:50:29.873515 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input3
Apr 17 02:50:29.903246 kernel: i801_smbus 0000:00:1f.3: Enabling SMBus device
Apr 17 02:50:29.910043 kernel: i801_smbus 0000:00:1f.3: SMBus using PCI interrupt
Apr 17 02:50:29.910254 kernel: ACPI: button: Power Button [PWRF]
Apr 17 02:50:29.910276 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Apr 17 02:50:30.020774 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped.
Apr 17 02:50:30.021229 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM.
Apr 17 02:50:30.024285 systemd[1]: Reloading finished in 469 ms.
Apr 17 02:50:30.036894 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Apr 17 02:50:30.040135 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Apr 17 02:50:30.103351 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Apr 17 02:50:30.177931 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Apr 17 02:50:30.203285 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM...
Apr 17 02:50:30.208036 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Apr 17 02:50:30.214380 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Apr 17 02:50:30.221341 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Apr 17 02:50:30.222788 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Apr 17 02:50:30.226473 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Apr 17 02:50:30.238490 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Apr 17 02:50:30.238719 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Apr 17 02:50:30.241792 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Apr 17 02:50:30.243692 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Apr 17 02:50:30.251341 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Apr 17 02:50:30.253404 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Apr 17 02:50:30.253767 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Apr 17 02:50:30.273206 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Apr 17 02:50:30.275221 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Apr 17 02:50:30.276896 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM.
Apr 17 02:50:30.279514 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Apr 17 02:50:30.279963 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Apr 17 02:50:30.285723 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Apr 17 02:50:30.287241 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Apr 17 02:50:30.291661 systemd[1]: modprobe@loop.service: Deactivated successfully.
Apr 17 02:50:30.292146 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Apr 17 02:50:30.313448 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
Apr 17 02:50:30.334943 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Apr 17 02:50:30.337791 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Apr 17 02:50:30.345572 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Apr 17 02:50:30.349909 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Apr 17 02:50:30.352157 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Apr 17 02:50:30.360221 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Apr 17 02:50:30.360532 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Apr 17 02:50:30.366612 systemd[1]: Starting systemd-update-done.service - Update is Completed...
Apr 17 02:50:30.366732 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Apr 17 02:50:30.368254 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
Apr 17 02:50:30.369485 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Apr 17 02:50:30.369651 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Apr 17 02:50:30.371202 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Apr 17 02:50:30.371353 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Apr 17 02:50:30.372227 systemd[1]: modprobe@loop.service: Deactivated successfully.
Apr 17 02:50:30.372374 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Apr 17 02:50:30.377561 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Apr 17 02:50:30.383573 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Apr 17 02:50:30.388787 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Apr 17 02:50:30.389096 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Apr 17 02:50:30.396539 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Apr 17 02:50:30.398360 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Apr 17 02:50:30.401180 augenrules[1521]: No rules
Apr 17 02:50:30.401763 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Apr 17 02:50:30.408204 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Apr 17 02:50:30.408630 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Apr 17 02:50:30.408774 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Apr 17 02:50:30.409232 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Apr 17 02:50:30.410644 systemd[1]: audit-rules.service: Deactivated successfully.
Apr 17 02:50:30.410955 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Apr 17 02:50:30.436593 systemd[1]: Finished ensure-sysext.service.
Apr 17 02:50:30.437283 systemd[1]: Started systemd-userdbd.service - User Database Manager.
Apr 17 02:50:30.441038 systemd[1]: modprobe@drm.service: Deactivated successfully.
Apr 17 02:50:30.441240 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Apr 17 02:50:30.442113 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Apr 17 02:50:30.442392 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Apr 17 02:50:30.446721 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Apr 17 02:50:30.449172 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization...
Apr 17 02:50:30.454456 systemd[1]: Finished systemd-update-done.service - Update is Completed.
Apr 17 02:50:30.458500 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Apr 17 02:50:30.458680 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Apr 17 02:50:30.461506 systemd[1]: modprobe@loop.service: Deactivated successfully.
Apr 17 02:50:30.461715 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Apr 17 02:50:30.465445 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Apr 17 02:50:30.468952 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
Apr 17 02:50:30.475989 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Apr 17 02:50:30.490942 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Apr 17 02:50:30.551799 systemd-networkd[1476]: lo: Link UP
Apr 17 02:50:30.551981 systemd-networkd[1476]: lo: Gained carrier
Apr 17 02:50:30.552259 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization.
Apr 17 02:50:30.553456 systemd-networkd[1476]: Enumeration completed
Apr 17 02:50:30.554258 systemd-networkd[1476]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Apr 17 02:50:30.554262 systemd-networkd[1476]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Apr 17 02:50:30.554905 systemd-networkd[1476]: eth0: Link UP
Apr 17 02:50:30.555225 systemd[1]: Started systemd-networkd.service - Network Configuration.
Apr 17 02:50:30.555296 systemd-networkd[1476]: eth0: Gained carrier
Apr 17 02:50:30.555311 systemd-networkd[1476]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Apr 17 02:50:30.555347 systemd-resolved[1478]: Positive Trust Anchors:
Apr 17 02:50:30.555357 systemd-resolved[1478]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Apr 17 02:50:30.555383 systemd-resolved[1478]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Apr 17 02:50:30.557504 systemd[1]: Reached target time-set.target - System Time Set.
Apr 17 02:50:30.560801 systemd-resolved[1478]: Defaulting to hostname 'linux'.
Apr 17 02:50:30.561352 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd...
Apr 17 02:50:30.566330 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
Apr 17 02:50:30.569432 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Apr 17 02:50:30.569610 systemd[1]: Reached target network.target - Network.
Apr 17 02:50:30.570072 systemd-networkd[1476]: eth0: DHCPv4 address 10.0.0.21/16, gateway 10.0.0.1 acquired from 10.0.0.1
Apr 17 02:50:30.571985 systemd-timesyncd[1538]: Network configuration changed, trying to establish connection.
Apr 17 02:50:31.658044 systemd-resolved[1478]: Clock change detected. Flushing caches.
Apr 17 02:50:31.658071 systemd-timesyncd[1538]: Contacted time server 10.0.0.1:123 (10.0.0.1).
Apr 17 02:50:31.658127 systemd-timesyncd[1538]: Initial clock synchronization to Fri 2026-04-17 02:50:31.657984 UTC.
Apr 17 02:50:31.660195 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Apr 17 02:50:31.662593 systemd[1]: Reached target sysinit.target - System Initialization.
Apr 17 02:50:31.665164 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
Apr 17 02:50:31.667858 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
Apr 17 02:50:31.671285 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer.
Apr 17 02:50:31.675444 systemd[1]: Started logrotate.timer - Daily rotation of log files.
Apr 17 02:50:31.679144 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
Apr 17 02:50:31.683493 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
Apr 17 02:50:31.687512 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
Apr 17 02:50:31.689496 systemd[1]: Reached target paths.target - Path Units.
Apr 17 02:50:31.730330 systemd[1]: Reached target timers.target - Timer Units.
Apr 17 02:50:31.741674 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket.
Apr 17 02:50:31.746906 systemd[1]: Starting docker.socket - Docker Socket for the API...
Apr 17 02:50:31.752216 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local).
Apr 17 02:50:31.755635 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK).
Apr 17 02:50:31.759187 systemd[1]: Reached target ssh-access.target - SSH Access Available.
Apr 17 02:50:31.768098 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket.
Apr 17 02:50:31.777449 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket.
Apr 17 02:50:31.793563 systemd[1]: Listening on docker.socket - Docker Socket for the API.
Apr 17 02:50:31.799666 systemd[1]: Reached target sockets.target - Socket Units.
Apr 17 02:50:31.811130 systemd[1]: Reached target basic.target - Basic System.
Apr 17 02:50:31.814251 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met.
Apr 17 02:50:31.814537 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met.
Apr 17 02:50:31.821179 systemd[1]: Starting containerd.service - containerd container runtime...
Apr 17 02:50:31.824330 systemd[1]: Starting dbus.service - D-Bus System Message Bus...
Apr 17 02:50:31.831077 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
Apr 17 02:50:31.838558 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit...
Apr 17 02:50:31.843268 systemd[1]: Starting extend-filesystems.service - Extend Filesystems...
Apr 17 02:50:31.846098 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment).
Apr 17 02:50:31.849299 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh...
Apr 17 02:50:31.850401 jq[1558]: false
Apr 17 02:50:31.853020 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd...
Apr 17 02:50:31.861063 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin...
Apr 17 02:50:31.863779 google_oslogin_nss_cache[1560]: oslogin_cache_refresh[1560]: Refreshing passwd entry cache
Apr 17 02:50:31.864044 extend-filesystems[1559]: Found /dev/vda6
Apr 17 02:50:31.861259 oslogin_cache_refresh[1560]: Refreshing passwd entry cache
Apr 17 02:50:31.867958 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline...
Apr 17 02:50:31.868618 extend-filesystems[1559]: Found /dev/vda9
Apr 17 02:50:31.875584 extend-filesystems[1559]: Checking size of /dev/vda9
Apr 17 02:50:31.875548 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys...
Apr 17 02:50:31.880068 oslogin_cache_refresh[1560]: Failure getting users, quitting
Apr 17 02:50:31.887302 google_oslogin_nss_cache[1560]: oslogin_cache_refresh[1560]: Failure getting users, quitting
Apr 17 02:50:31.887302 google_oslogin_nss_cache[1560]: oslogin_cache_refresh[1560]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak.
Apr 17 02:50:31.887302 google_oslogin_nss_cache[1560]: oslogin_cache_refresh[1560]: Refreshing group entry cache
Apr 17 02:50:31.880092 oslogin_cache_refresh[1560]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak.
Apr 17 02:50:31.880146 oslogin_cache_refresh[1560]: Refreshing group entry cache
Apr 17 02:50:31.888658 google_oslogin_nss_cache[1560]: oslogin_cache_refresh[1560]: Failure getting groups, quitting
Apr 17 02:50:31.888658 google_oslogin_nss_cache[1560]: oslogin_cache_refresh[1560]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak.
Apr 17 02:50:31.887676 oslogin_cache_refresh[1560]: Failure getting groups, quitting
Apr 17 02:50:31.888355 systemd[1]: Starting systemd-logind.service - User Login Management...
Apr 17 02:50:31.887686 oslogin_cache_refresh[1560]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak.
Apr 17 02:50:31.916277 extend-filesystems[1559]: Resized partition /dev/vda9
Apr 17 02:50:31.915294 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0).
Apr 17 02:50:31.916053 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details.
Apr 17 02:50:31.918835 systemd[1]: Starting update-engine.service - Update Engine...
Apr 17 02:50:31.921384 extend-filesystems[1581]: resize2fs 1.47.3 (8-Jul-2025)
Apr 17 02:50:31.927923 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition...
Apr 17 02:50:31.931771 kernel: EXT4-fs (vda9): resizing filesystem from 553472 to 1864699 blocks
Apr 17 02:50:31.945423 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd.
Apr 17 02:50:31.959774 jq[1582]: true
Apr 17 02:50:31.960505 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown.
Apr 17 02:50:31.966213 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'.
Apr 17 02:50:31.966457 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped.
Apr 17 02:50:31.967057 systemd[1]: google-oslogin-cache.service: Deactivated successfully.
Apr 17 02:50:31.967382 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh.
Apr 17 02:50:31.972297 systemd[1]: motdgen.service: Deactivated successfully.
Apr 17 02:50:31.973235 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd.
Apr 17 02:50:31.977617 update_engine[1580]: I20260417 02:50:31.977514 1580 main.cc:92] Flatcar Update Engine starting
Apr 17 02:50:31.981326 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully.
Apr 17 02:50:31.987425 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline.
Apr 17 02:50:31.996774 kernel: EXT4-fs (vda9): resized filesystem to 1864699
Apr 17 02:50:32.056236 (ntainerd)[1591]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR
Apr 17 02:50:32.077380 jq[1590]: true
Apr 17 02:50:32.078660 systemd-logind[1579]: Watching system buttons on /dev/input/event2 (Power Button)
Apr 17 02:50:32.078681 systemd-logind[1579]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard)
Apr 17 02:50:32.080640 systemd-logind[1579]: New seat seat0.
Apr 17 02:50:32.081862 extend-filesystems[1581]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required
Apr 17 02:50:32.081862 extend-filesystems[1581]: old_desc_blocks = 1, new_desc_blocks = 1
Apr 17 02:50:32.081862 extend-filesystems[1581]: The filesystem on /dev/vda9 is now 1864699 (4k) blocks long.
Apr 17 02:50:32.124080 extend-filesystems[1559]: Resized filesystem in /dev/vda9
Apr 17 02:50:32.126489 sshd_keygen[1588]: ssh-keygen: generating new host keys: RSA ECDSA ED25519
Apr 17 02:50:32.084640 systemd[1]: extend-filesystems.service: Deactivated successfully.
Apr 17 02:50:32.088420 systemd[1]: Finished extend-filesystems.service - Extend Filesystems.
Apr 17 02:50:32.100478 systemd[1]: Started systemd-logind.service - User Login Management.
Apr 17 02:50:32.131359 dbus-daemon[1556]: [system] SELinux support is enabled
Apr 17 02:50:32.133878 update_engine[1580]: I20260417 02:50:32.133782 1580 update_check_scheduler.cc:74] Next update check in 8m54s
Apr 17 02:50:32.134195 systemd[1]: Started dbus.service - D-Bus System Message Bus.
Apr 17 02:50:32.140837 tar[1589]: linux-amd64/LICENSE
Apr 17 02:50:32.141120 tar[1589]: linux-amd64/helm
Apr 17 02:50:32.141506 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml).
Apr 17 02:50:32.141667 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Apr 17 02:50:32.144549 dbus-daemon[1556]: [system] Successfully activated service 'org.freedesktop.systemd1' Apr 17 02:50:32.144926 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Apr 17 02:50:32.144970 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Apr 17 02:50:32.148057 systemd[1]: Started update-engine.service - Update Engine. Apr 17 02:50:32.157138 systemd[1]: Started locksmithd.service - Cluster reboot manager. Apr 17 02:50:32.171313 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Apr 17 02:50:32.178231 systemd[1]: Starting issuegen.service - Generate /run/issue... Apr 17 02:50:32.186685 bash[1627]: Updated "/home/core/.ssh/authorized_keys" Apr 17 02:50:32.187389 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Apr 17 02:50:32.192283 systemd[1]: sshkeys.service was skipped because no trigger condition checks were met. Apr 17 02:50:32.196667 systemd[1]: issuegen.service: Deactivated successfully. Apr 17 02:50:32.196947 systemd[1]: Finished issuegen.service - Generate /run/issue. Apr 17 02:50:32.205167 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Apr 17 02:50:32.220996 locksmithd[1626]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Apr 17 02:50:32.247199 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Apr 17 02:50:32.274115 systemd[1]: Started getty@tty1.service - Getty on tty1. Apr 17 02:50:32.279546 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. Apr 17 02:50:32.284404 systemd[1]: Reached target getty.target - Login Prompts. 
Apr 17 02:50:32.473948 containerd[1591]: time="2026-04-17T02:50:32Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Apr 17 02:50:32.480075 containerd[1591]: time="2026-04-17T02:50:32.479182403Z" level=info msg="starting containerd" revision=4ac6c20c7bbf8177f29e46bbdc658fec02ffb8ad version=v2.0.7 Apr 17 02:50:32.496947 containerd[1591]: time="2026-04-17T02:50:32.496302658Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="14.736µs" Apr 17 02:50:32.497156 containerd[1591]: time="2026-04-17T02:50:32.497085935Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Apr 17 02:50:32.497211 containerd[1591]: time="2026-04-17T02:50:32.497157737Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Apr 17 02:50:32.497643 containerd[1591]: time="2026-04-17T02:50:32.497342464Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Apr 17 02:50:32.497643 containerd[1591]: time="2026-04-17T02:50:32.497402718Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Apr 17 02:50:32.497643 containerd[1591]: time="2026-04-17T02:50:32.497437679Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Apr 17 02:50:32.497643 containerd[1591]: time="2026-04-17T02:50:32.497500998Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Apr 17 02:50:32.497643 containerd[1591]: time="2026-04-17T02:50:32.497520624Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Apr 17 
02:50:32.497955 containerd[1591]: time="2026-04-17T02:50:32.497870358Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Apr 17 02:50:32.498043 containerd[1591]: time="2026-04-17T02:50:32.498004530Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Apr 17 02:50:32.498067 containerd[1591]: time="2026-04-17T02:50:32.498051914Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Apr 17 02:50:32.498091 containerd[1591]: time="2026-04-17T02:50:32.498069234Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Apr 17 02:50:32.498246 containerd[1591]: time="2026-04-17T02:50:32.498201998Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Apr 17 02:50:32.498473 containerd[1591]: time="2026-04-17T02:50:32.498426199Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Apr 17 02:50:32.498512 containerd[1591]: time="2026-04-17T02:50:32.498481064Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Apr 17 02:50:32.498512 containerd[1591]: time="2026-04-17T02:50:32.498494626Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Apr 17 02:50:32.498568 containerd[1591]: time="2026-04-17T02:50:32.498537829Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Apr 17 02:50:32.506321 
containerd[1591]: time="2026-04-17T02:50:32.506264571Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Apr 17 02:50:32.506417 containerd[1591]: time="2026-04-17T02:50:32.506392332Z" level=info msg="metadata content store policy set" policy=shared Apr 17 02:50:32.522660 containerd[1591]: time="2026-04-17T02:50:32.522579864Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Apr 17 02:50:32.522878 containerd[1591]: time="2026-04-17T02:50:32.522698792Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Apr 17 02:50:32.522878 containerd[1591]: time="2026-04-17T02:50:32.522783098Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Apr 17 02:50:32.522878 containerd[1591]: time="2026-04-17T02:50:32.522800166Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Apr 17 02:50:32.522878 containerd[1591]: time="2026-04-17T02:50:32.522816134Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Apr 17 02:50:32.522878 containerd[1591]: time="2026-04-17T02:50:32.522828894Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Apr 17 02:50:32.522878 containerd[1591]: time="2026-04-17T02:50:32.522853110Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Apr 17 02:50:32.522878 containerd[1591]: time="2026-04-17T02:50:32.522867866Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Apr 17 02:50:32.523091 containerd[1591]: time="2026-04-17T02:50:32.522957066Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Apr 17 02:50:32.523091 containerd[1591]: 
time="2026-04-17T02:50:32.522982233Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Apr 17 02:50:32.523091 containerd[1591]: time="2026-04-17T02:50:32.522993998Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Apr 17 02:50:32.523091 containerd[1591]: time="2026-04-17T02:50:32.523009850Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Apr 17 02:50:32.523292 containerd[1591]: time="2026-04-17T02:50:32.523251795Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Apr 17 02:50:32.523347 containerd[1591]: time="2026-04-17T02:50:32.523327207Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Apr 17 02:50:32.523380 containerd[1591]: time="2026-04-17T02:50:32.523351700Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Apr 17 02:50:32.523380 containerd[1591]: time="2026-04-17T02:50:32.523363476Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Apr 17 02:50:32.523380 containerd[1591]: time="2026-04-17T02:50:32.523376645Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Apr 17 02:50:32.523457 containerd[1591]: time="2026-04-17T02:50:32.523428625Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Apr 17 02:50:32.523488 containerd[1591]: time="2026-04-17T02:50:32.523462409Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Apr 17 02:50:32.523488 containerd[1591]: time="2026-04-17T02:50:32.523482107Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Apr 17 02:50:32.523541 containerd[1591]: time="2026-04-17T02:50:32.523497695Z" level=info msg="loading plugin" 
id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Apr 17 02:50:32.523541 containerd[1591]: time="2026-04-17T02:50:32.523513410Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Apr 17 02:50:32.523541 containerd[1591]: time="2026-04-17T02:50:32.523524659Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Apr 17 02:50:32.523693 containerd[1591]: time="2026-04-17T02:50:32.523607971Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Apr 17 02:50:32.523863 containerd[1591]: time="2026-04-17T02:50:32.523764381Z" level=info msg="Start snapshots syncer" Apr 17 02:50:32.523863 containerd[1591]: time="2026-04-17T02:50:32.523811388Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Apr 17 02:50:32.524281 containerd[1591]: time="2026-04-17T02:50:32.524216417Z" level=info msg="starting cri plugin" 
config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Apr 17 02:50:32.524455 containerd[1591]: time="2026-04-17T02:50:32.524316660Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Apr 17 02:50:32.524455 containerd[1591]: time="2026-04-17T02:50:32.524422281Z" level=info 
msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Apr 17 02:50:32.524599 containerd[1591]: time="2026-04-17T02:50:32.524556050Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Apr 17 02:50:32.524638 containerd[1591]: time="2026-04-17T02:50:32.524608240Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Apr 17 02:50:32.524638 containerd[1591]: time="2026-04-17T02:50:32.524624472Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Apr 17 02:50:32.524683 containerd[1591]: time="2026-04-17T02:50:32.524636700Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Apr 17 02:50:32.524683 containerd[1591]: time="2026-04-17T02:50:32.524653999Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Apr 17 02:50:32.524683 containerd[1591]: time="2026-04-17T02:50:32.524666647Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Apr 17 02:50:32.524815 containerd[1591]: time="2026-04-17T02:50:32.524681707Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Apr 17 02:50:32.524815 containerd[1591]: time="2026-04-17T02:50:32.524759119Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Apr 17 02:50:32.524815 containerd[1591]: time="2026-04-17T02:50:32.524774239Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Apr 17 02:50:32.524815 containerd[1591]: time="2026-04-17T02:50:32.524788171Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Apr 17 02:50:32.524919 containerd[1591]: time="2026-04-17T02:50:32.524843331Z" level=info msg="loading plugin" 
id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Apr 17 02:50:32.524919 containerd[1591]: time="2026-04-17T02:50:32.524863844Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Apr 17 02:50:32.524919 containerd[1591]: time="2026-04-17T02:50:32.524875384Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Apr 17 02:50:32.524919 containerd[1591]: time="2026-04-17T02:50:32.524886200Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Apr 17 02:50:32.524919 containerd[1591]: time="2026-04-17T02:50:32.524894897Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Apr 17 02:50:32.524919 containerd[1591]: time="2026-04-17T02:50:32.524904078Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Apr 17 02:50:32.525097 containerd[1591]: time="2026-04-17T02:50:32.524920764Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Apr 17 02:50:32.525097 containerd[1591]: time="2026-04-17T02:50:32.524940626Z" level=info msg="runtime interface created" Apr 17 02:50:32.525097 containerd[1591]: time="2026-04-17T02:50:32.524945499Z" level=info msg="created NRI interface" Apr 17 02:50:32.525097 containerd[1591]: time="2026-04-17T02:50:32.524953565Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Apr 17 02:50:32.525097 containerd[1591]: time="2026-04-17T02:50:32.524971652Z" level=info msg="Connect containerd service" Apr 17 02:50:32.525097 containerd[1591]: time="2026-04-17T02:50:32.524997384Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Apr 17 02:50:32.526518 
containerd[1591]: time="2026-04-17T02:50:32.525997335Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Apr 17 02:50:32.668593 containerd[1591]: time="2026-04-17T02:50:32.668096447Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Apr 17 02:50:32.668593 containerd[1591]: time="2026-04-17T02:50:32.668185256Z" level=info msg=serving... address=/run/containerd/containerd.sock Apr 17 02:50:32.668593 containerd[1591]: time="2026-04-17T02:50:32.668231497Z" level=info msg="Start subscribing containerd event" Apr 17 02:50:32.668593 containerd[1591]: time="2026-04-17T02:50:32.668271532Z" level=info msg="Start recovering state" Apr 17 02:50:32.668593 containerd[1591]: time="2026-04-17T02:50:32.668398732Z" level=info msg="Start event monitor" Apr 17 02:50:32.668593 containerd[1591]: time="2026-04-17T02:50:32.668411315Z" level=info msg="Start cni network conf syncer for default" Apr 17 02:50:32.668593 containerd[1591]: time="2026-04-17T02:50:32.668425278Z" level=info msg="Start streaming server" Apr 17 02:50:32.668593 containerd[1591]: time="2026-04-17T02:50:32.668437631Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Apr 17 02:50:32.668593 containerd[1591]: time="2026-04-17T02:50:32.668445352Z" level=info msg="runtime interface starting up..." Apr 17 02:50:32.668593 containerd[1591]: time="2026-04-17T02:50:32.668451763Z" level=info msg="starting plugins..." Apr 17 02:50:32.668593 containerd[1591]: time="2026-04-17T02:50:32.668466870Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Apr 17 02:50:32.669004 systemd[1]: Started containerd.service - containerd container runtime. 
Apr 17 02:50:32.673070 containerd[1591]: time="2026-04-17T02:50:32.673022084Z" level=info msg="containerd successfully booted in 0.198746s" Apr 17 02:50:32.775505 tar[1589]: linux-amd64/README.md Apr 17 02:50:32.806903 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Apr 17 02:50:33.319681 systemd-networkd[1476]: eth0: Gained IPv6LL Apr 17 02:50:33.324194 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Apr 17 02:50:33.331131 systemd[1]: Reached target network-online.target - Network is Online. Apr 17 02:50:33.341139 systemd[1]: Starting coreos-metadata.service - QEMU metadata agent... Apr 17 02:50:33.347605 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Apr 17 02:50:33.368195 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Apr 17 02:50:33.393452 systemd[1]: coreos-metadata.service: Deactivated successfully. Apr 17 02:50:33.393884 systemd[1]: Finished coreos-metadata.service - QEMU metadata agent. Apr 17 02:50:33.441450 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Apr 17 02:50:33.488345 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Apr 17 02:50:33.889576 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Apr 17 02:50:33.894018 systemd[1]: Started sshd@0-10.0.0.21:22-10.0.0.1:52164.service - OpenSSH per-connection server daemon (10.0.0.1:52164). Apr 17 02:50:34.110204 sshd[1688]: Accepted publickey for core from 10.0.0.1 port 52164 ssh2: RSA SHA256:NqGHB9rSHtMUQkJqVjWjGfsIhZEnp/FTWIhF91T30gc Apr 17 02:50:34.112440 sshd-session[1688]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 17 02:50:34.121304 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Apr 17 02:50:34.125119 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... 
Apr 17 02:50:34.138952 systemd-logind[1579]: New session 1 of user core. Apr 17 02:50:34.152689 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Apr 17 02:50:34.159092 systemd[1]: Starting user@500.service - User Manager for UID 500... Apr 17 02:50:34.176930 (systemd)[1693]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Apr 17 02:50:34.187767 systemd-logind[1579]: New session c1 of user core. Apr 17 02:50:34.477311 systemd[1693]: Queued start job for default target default.target. Apr 17 02:50:34.489239 systemd[1693]: Created slice app.slice - User Application Slice. Apr 17 02:50:34.489279 systemd[1693]: Reached target paths.target - Paths. Apr 17 02:50:34.489320 systemd[1693]: Reached target timers.target - Timers. Apr 17 02:50:34.492506 systemd[1693]: Starting dbus.socket - D-Bus User Message Bus Socket... Apr 17 02:50:34.541933 systemd[1693]: Listening on dbus.socket - D-Bus User Message Bus Socket. Apr 17 02:50:34.542023 systemd[1693]: Reached target sockets.target - Sockets. Apr 17 02:50:34.542058 systemd[1693]: Reached target basic.target - Basic System. Apr 17 02:50:34.542081 systemd[1693]: Reached target default.target - Main User Target. Apr 17 02:50:34.542101 systemd[1693]: Startup finished in 293ms. Apr 17 02:50:34.542610 systemd[1]: Started user@500.service - User Manager for UID 500. Apr 17 02:50:34.562129 systemd[1]: Started session-1.scope - Session 1 of User core. Apr 17 02:50:34.635056 systemd[1]: Started sshd@1-10.0.0.21:22-10.0.0.1:52172.service - OpenSSH per-connection server daemon (10.0.0.1:52172). Apr 17 02:50:34.739591 sshd[1704]: Accepted publickey for core from 10.0.0.1 port 52172 ssh2: RSA SHA256:NqGHB9rSHtMUQkJqVjWjGfsIhZEnp/FTWIhF91T30gc Apr 17 02:50:34.741170 sshd-session[1704]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 17 02:50:34.764845 systemd-logind[1579]: New session 2 of user core. 
Apr 17 02:50:34.775529 systemd[1]: Started session-2.scope - Session 2 of User core. Apr 17 02:50:34.838127 sshd[1707]: Connection closed by 10.0.0.1 port 52172 Apr 17 02:50:34.837963 sshd-session[1704]: pam_unix(sshd:session): session closed for user core Apr 17 02:50:34.847633 systemd[1]: sshd@1-10.0.0.21:22-10.0.0.1:52172.service: Deactivated successfully. Apr 17 02:50:34.853482 systemd[1]: session-2.scope: Deactivated successfully. Apr 17 02:50:34.854682 systemd-logind[1579]: Session 2 logged out. Waiting for processes to exit. Apr 17 02:50:34.858326 systemd[1]: Started sshd@2-10.0.0.21:22-10.0.0.1:52180.service - OpenSSH per-connection server daemon (10.0.0.1:52180). Apr 17 02:50:34.863830 systemd-logind[1579]: Removed session 2. Apr 17 02:50:34.950779 sshd[1713]: Accepted publickey for core from 10.0.0.1 port 52180 ssh2: RSA SHA256:NqGHB9rSHtMUQkJqVjWjGfsIhZEnp/FTWIhF91T30gc Apr 17 02:50:34.955899 sshd-session[1713]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 17 02:50:34.969757 systemd-logind[1579]: New session 3 of user core. Apr 17 02:50:34.981908 systemd[1]: Started session-3.scope - Session 3 of User core. Apr 17 02:50:35.035807 sshd[1716]: Connection closed by 10.0.0.1 port 52180 Apr 17 02:50:35.036974 sshd-session[1713]: pam_unix(sshd:session): session closed for user core Apr 17 02:50:35.044059 systemd[1]: sshd@2-10.0.0.21:22-10.0.0.1:52180.service: Deactivated successfully. Apr 17 02:50:35.047136 systemd[1]: session-3.scope: Deactivated successfully. Apr 17 02:50:35.053588 systemd-logind[1579]: Session 3 logged out. Waiting for processes to exit. Apr 17 02:50:35.063350 systemd-logind[1579]: Removed session 3. Apr 17 02:50:35.264087 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Apr 17 02:50:35.267236 systemd[1]: Reached target multi-user.target - Multi-User System. Apr 17 02:50:35.273402 systemd[1]: Startup finished in 4.562s (kernel) + 8.954s (initrd) + 7.498s (userspace) = 21.015s. 
Apr 17 02:50:35.280374 (kubelet)[1726]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Apr 17 02:50:36.420989 kubelet[1726]: E0417 02:50:36.420605 1726 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Apr 17 02:50:36.424251 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Apr 17 02:50:36.424410 systemd[1]: kubelet.service: Failed with result 'exit-code'. Apr 17 02:50:36.424846 systemd[1]: kubelet.service: Consumed 1.228s CPU time, 258.1M memory peak. Apr 17 02:50:45.051413 systemd[1]: Started sshd@3-10.0.0.21:22-10.0.0.1:36332.service - OpenSSH per-connection server daemon (10.0.0.1:36332). Apr 17 02:50:45.162874 sshd[1740]: Accepted publickey for core from 10.0.0.1 port 36332 ssh2: RSA SHA256:NqGHB9rSHtMUQkJqVjWjGfsIhZEnp/FTWIhF91T30gc Apr 17 02:50:45.165416 sshd-session[1740]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 17 02:50:45.181508 systemd-logind[1579]: New session 4 of user core. Apr 17 02:50:45.194781 systemd[1]: Started session-4.scope - Session 4 of User core. Apr 17 02:50:45.241203 sshd[1743]: Connection closed by 10.0.0.1 port 36332 Apr 17 02:50:45.239398 sshd-session[1740]: pam_unix(sshd:session): session closed for user core Apr 17 02:50:45.265327 systemd[1]: sshd@3-10.0.0.21:22-10.0.0.1:36332.service: Deactivated successfully. Apr 17 02:50:45.271102 systemd[1]: session-4.scope: Deactivated successfully. Apr 17 02:50:45.274975 systemd-logind[1579]: Session 4 logged out. Waiting for processes to exit. 
Apr 17 02:50:45.279145 systemd[1]: Started sshd@4-10.0.0.21:22-10.0.0.1:36348.service - OpenSSH per-connection server daemon (10.0.0.1:36348). Apr 17 02:50:45.287135 systemd-logind[1579]: Removed session 4. Apr 17 02:50:45.393016 sshd[1749]: Accepted publickey for core from 10.0.0.1 port 36348 ssh2: RSA SHA256:NqGHB9rSHtMUQkJqVjWjGfsIhZEnp/FTWIhF91T30gc Apr 17 02:50:45.394924 sshd-session[1749]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 17 02:50:45.405299 systemd-logind[1579]: New session 5 of user core. Apr 17 02:50:45.418206 systemd[1]: Started session-5.scope - Session 5 of User core. Apr 17 02:50:45.453244 sshd[1752]: Connection closed by 10.0.0.1 port 36348 Apr 17 02:50:45.453640 sshd-session[1749]: pam_unix(sshd:session): session closed for user core Apr 17 02:50:45.488490 systemd[1]: sshd@4-10.0.0.21:22-10.0.0.1:36348.service: Deactivated successfully. Apr 17 02:50:45.495043 systemd[1]: session-5.scope: Deactivated successfully. Apr 17 02:50:45.495912 systemd-logind[1579]: Session 5 logged out. Waiting for processes to exit. Apr 17 02:50:45.500608 systemd[1]: Started sshd@5-10.0.0.21:22-10.0.0.1:36362.service - OpenSSH per-connection server daemon (10.0.0.1:36362). Apr 17 02:50:45.525937 systemd-logind[1579]: Removed session 5. Apr 17 02:50:45.692591 sshd[1758]: Accepted publickey for core from 10.0.0.1 port 36362 ssh2: RSA SHA256:NqGHB9rSHtMUQkJqVjWjGfsIhZEnp/FTWIhF91T30gc Apr 17 02:50:45.695371 sshd-session[1758]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 17 02:50:45.710916 systemd-logind[1579]: New session 6 of user core. Apr 17 02:50:45.721367 systemd[1]: Started session-6.scope - Session 6 of User core. Apr 17 02:50:45.745361 sshd[1761]: Connection closed by 10.0.0.1 port 36362 Apr 17 02:50:45.746678 sshd-session[1758]: pam_unix(sshd:session): session closed for user core Apr 17 02:50:45.755244 systemd[1]: sshd@5-10.0.0.21:22-10.0.0.1:36362.service: Deactivated successfully. 
Apr 17 02:50:45.757341 systemd[1]: session-6.scope: Deactivated successfully. Apr 17 02:50:45.758749 systemd-logind[1579]: Session 6 logged out. Waiting for processes to exit. Apr 17 02:50:45.761274 systemd[1]: Started sshd@6-10.0.0.21:22-10.0.0.1:36374.service - OpenSSH per-connection server daemon (10.0.0.1:36374). Apr 17 02:50:45.763224 systemd-logind[1579]: Removed session 6. Apr 17 02:50:45.914754 sshd[1767]: Accepted publickey for core from 10.0.0.1 port 36374 ssh2: RSA SHA256:NqGHB9rSHtMUQkJqVjWjGfsIhZEnp/FTWIhF91T30gc Apr 17 02:50:45.916360 sshd-session[1767]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 17 02:50:45.924416 systemd-logind[1579]: New session 7 of user core. Apr 17 02:50:45.935658 systemd[1]: Started session-7.scope - Session 7 of User core. Apr 17 02:50:45.965127 sudo[1771]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Apr 17 02:50:45.965419 sudo[1771]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Apr 17 02:50:46.049702 sudo[1771]: pam_unix(sudo:session): session closed for user root Apr 17 02:50:46.057026 sshd[1770]: Connection closed by 10.0.0.1 port 36374 Apr 17 02:50:46.056004 sshd-session[1767]: pam_unix(sshd:session): session closed for user core Apr 17 02:50:46.075372 systemd[1]: sshd@6-10.0.0.21:22-10.0.0.1:36374.service: Deactivated successfully. Apr 17 02:50:46.077031 systemd[1]: session-7.scope: Deactivated successfully. Apr 17 02:50:46.078112 systemd-logind[1579]: Session 7 logged out. Waiting for processes to exit. Apr 17 02:50:46.081868 systemd[1]: Started sshd@7-10.0.0.21:22-10.0.0.1:36380.service - OpenSSH per-connection server daemon (10.0.0.1:36380). Apr 17 02:50:46.086254 systemd-logind[1579]: Removed session 7. 
Apr 17 02:50:46.199223 sshd[1777]: Accepted publickey for core from 10.0.0.1 port 36380 ssh2: RSA SHA256:NqGHB9rSHtMUQkJqVjWjGfsIhZEnp/FTWIhF91T30gc Apr 17 02:50:46.203182 sshd-session[1777]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 17 02:50:46.220355 systemd-logind[1579]: New session 8 of user core. Apr 17 02:50:46.241110 systemd[1]: Started session-8.scope - Session 8 of User core. Apr 17 02:50:46.261443 sudo[1782]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Apr 17 02:50:46.261664 sudo[1782]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Apr 17 02:50:46.278628 sudo[1782]: pam_unix(sudo:session): session closed for user root Apr 17 02:50:46.295560 sudo[1781]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Apr 17 02:50:46.317402 sudo[1781]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Apr 17 02:50:46.342560 systemd[1]: Starting audit-rules.service - Load Audit Rules... Apr 17 02:50:46.455560 augenrules[1804]: No rules Apr 17 02:50:46.459400 systemd[1]: audit-rules.service: Deactivated successfully. Apr 17 02:50:46.459694 systemd[1]: Finished audit-rules.service - Load Audit Rules. Apr 17 02:50:46.461154 sudo[1781]: pam_unix(sudo:session): session closed for user root Apr 17 02:50:46.463965 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Apr 17 02:50:46.464239 sshd[1780]: Connection closed by 10.0.0.1 port 36380 Apr 17 02:50:46.466343 sshd-session[1777]: pam_unix(sshd:session): session closed for user core Apr 17 02:50:46.507611 systemd[1]: sshd@7-10.0.0.21:22-10.0.0.1:36380.service: Deactivated successfully. Apr 17 02:50:46.513348 systemd[1]: session-8.scope: Deactivated successfully. Apr 17 02:50:46.515444 systemd-logind[1579]: Session 8 logged out. Waiting for processes to exit. 
Apr 17 02:50:46.524254 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Apr 17 02:50:46.525684 systemd[1]: Started sshd@8-10.0.0.21:22-10.0.0.1:36394.service - OpenSSH per-connection server daemon (10.0.0.1:36394). Apr 17 02:50:46.527124 systemd-logind[1579]: Removed session 8. Apr 17 02:50:46.650320 sshd[1814]: Accepted publickey for core from 10.0.0.1 port 36394 ssh2: RSA SHA256:NqGHB9rSHtMUQkJqVjWjGfsIhZEnp/FTWIhF91T30gc Apr 17 02:50:46.652094 sshd-session[1814]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 17 02:50:46.665773 systemd-logind[1579]: New session 9 of user core. Apr 17 02:50:46.681551 systemd[1]: Started session-9.scope - Session 9 of User core. Apr 17 02:50:46.746265 sudo[1820]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Apr 17 02:50:46.746544 sudo[1820]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Apr 17 02:50:46.812229 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Apr 17 02:50:46.830309 (kubelet)[1830]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Apr 17 02:50:46.972248 kubelet[1830]: E0417 02:50:46.971539 1830 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Apr 17 02:50:46.988096 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Apr 17 02:50:46.988286 systemd[1]: kubelet.service: Failed with result 'exit-code'. Apr 17 02:50:46.990049 systemd[1]: kubelet.service: Consumed 301ms CPU time, 110.5M memory peak. Apr 17 02:50:47.589561 systemd[1]: Starting docker.service - Docker Application Container Engine... 
Apr 17 02:50:47.606217 (dockerd)[1854]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Apr 17 02:50:48.442146 dockerd[1854]: time="2026-04-17T02:50:48.441224711Z" level=info msg="Starting up" Apr 17 02:50:48.445020 dockerd[1854]: time="2026-04-17T02:50:48.444922685Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Apr 17 02:50:48.488008 dockerd[1854]: time="2026-04-17T02:50:48.487919862Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s Apr 17 02:50:48.769236 dockerd[1854]: time="2026-04-17T02:50:48.767414664Z" level=info msg="Loading containers: start." Apr 17 02:50:48.801701 kernel: Initializing XFRM netlink socket Apr 17 02:50:50.256451 systemd-networkd[1476]: docker0: Link UP Apr 17 02:50:50.271847 dockerd[1854]: time="2026-04-17T02:50:50.271364054Z" level=info msg="Loading containers: done." 
Apr 17 02:50:50.320268 dockerd[1854]: time="2026-04-17T02:50:50.319534908Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Apr 17 02:50:50.321442 dockerd[1854]: time="2026-04-17T02:50:50.321331714Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Apr 17 02:50:50.321636 dockerd[1854]: time="2026-04-17T02:50:50.321546970Z" level=info msg="Initializing buildkit" Apr 17 02:50:50.479548 dockerd[1854]: time="2026-04-17T02:50:50.479183892Z" level=info msg="Completed buildkit initialization" Apr 17 02:50:50.489161 dockerd[1854]: time="2026-04-17T02:50:50.488283208Z" level=info msg="Daemon has completed initialization" Apr 17 02:50:50.489161 dockerd[1854]: time="2026-04-17T02:50:50.488910888Z" level=info msg="API listen on /run/docker.sock" Apr 17 02:50:50.488952 systemd[1]: Started docker.service - Docker Application Container Engine. Apr 17 02:50:52.160495 containerd[1591]: time="2026-04-17T02:50:52.160414762Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.34.7\"" Apr 17 02:50:53.558424 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2612952079.mount: Deactivated successfully. 
Apr 17 02:50:55.483642 containerd[1591]: time="2026-04-17T02:50:55.482148922Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.34.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 17 02:50:55.483642 containerd[1591]: time="2026-04-17T02:50:55.483076897Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.34.7: active requests=0, bytes read=27099952" Apr 17 02:50:55.484662 containerd[1591]: time="2026-04-17T02:50:55.484255626Z" level=info msg="ImageCreate event name:\"sha256:c15709457ff55a861a7259eb631c447f9bf906267615f9d8dcc820635a0bfb95\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 17 02:50:55.492924 containerd[1591]: time="2026-04-17T02:50:55.492825517Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:b96b8464d152a24c81d7f0435fd2198f8486970cd26a9e0e9c20826c73d1441c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 17 02:50:55.498115 containerd[1591]: time="2026-04-17T02:50:55.497740318Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.34.7\" with image id \"sha256:c15709457ff55a861a7259eb631c447f9bf906267615f9d8dcc820635a0bfb95\", repo tag \"registry.k8s.io/kube-apiserver:v1.34.7\", repo digest \"registry.k8s.io/kube-apiserver@sha256:b96b8464d152a24c81d7f0435fd2198f8486970cd26a9e0e9c20826c73d1441c\", size \"27097113\" in 3.337236299s" Apr 17 02:50:55.498332 containerd[1591]: time="2026-04-17T02:50:55.498178005Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.34.7\" returns image reference \"sha256:c15709457ff55a861a7259eb631c447f9bf906267615f9d8dcc820635a0bfb95\"" Apr 17 02:50:55.499759 containerd[1591]: time="2026-04-17T02:50:55.499624056Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.34.7\"" Apr 17 02:50:56.904289 containerd[1591]: time="2026-04-17T02:50:56.904195182Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.34.7\" 
labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 17 02:50:56.905243 containerd[1591]: time="2026-04-17T02:50:56.905201247Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.34.7: active requests=0, bytes read=21252670" Apr 17 02:50:56.908003 containerd[1591]: time="2026-04-17T02:50:56.907945817Z" level=info msg="ImageCreate event name:\"sha256:23986a24c803336f2a2dfbcaaf0547ee8bcf6638f23bec8967e210909d00a97a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 17 02:50:56.912401 containerd[1591]: time="2026-04-17T02:50:56.912337584Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:7d759bdc4fef10a3fc1ad60ce9439d58e1a4df7ebb22751f7cc0201ce55f280b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 17 02:50:56.913992 containerd[1591]: time="2026-04-17T02:50:56.913936965Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.34.7\" with image id \"sha256:23986a24c803336f2a2dfbcaaf0547ee8bcf6638f23bec8967e210909d00a97a\", repo tag \"registry.k8s.io/kube-controller-manager:v1.34.7\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:7d759bdc4fef10a3fc1ad60ce9439d58e1a4df7ebb22751f7cc0201ce55f280b\", size \"22819085\" in 1.414273772s" Apr 17 02:50:56.914096 containerd[1591]: time="2026-04-17T02:50:56.913997761Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.34.7\" returns image reference \"sha256:23986a24c803336f2a2dfbcaaf0547ee8bcf6638f23bec8967e210909d00a97a\"" Apr 17 02:50:56.915175 containerd[1591]: time="2026-04-17T02:50:56.914907860Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.34.7\"" Apr 17 02:50:57.239470 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Apr 17 02:50:57.241817 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Apr 17 02:50:57.487839 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Apr 17 02:50:57.543120 (kubelet)[2146]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Apr 17 02:50:57.637753 kubelet[2146]: E0417 02:50:57.637621 2146 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Apr 17 02:50:57.640703 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Apr 17 02:50:57.640952 systemd[1]: kubelet.service: Failed with result 'exit-code'. Apr 17 02:50:57.641429 systemd[1]: kubelet.service: Consumed 256ms CPU time, 110.1M memory peak. Apr 17 02:50:58.054829 containerd[1591]: time="2026-04-17T02:50:58.054217657Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.34.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 17 02:50:58.059795 containerd[1591]: time="2026-04-17T02:50:58.059699340Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.34.7: active requests=0, bytes read=15810823" Apr 17 02:50:58.062412 containerd[1591]: time="2026-04-17T02:50:58.062329354Z" level=info msg="ImageCreate event name:\"sha256:568f1856b0e1c464b0b50ab2879ebd535623c1a620b1d2530ba5dd594237dc82\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 17 02:50:58.067046 containerd[1591]: time="2026-04-17T02:50:58.066972107Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:4ab32f707ff84beaac431797999707757b885196b0b9a52d29cb67f95efce7c1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 17 02:50:58.067751 containerd[1591]: time="2026-04-17T02:50:58.067695001Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.34.7\" with image id 
\"sha256:568f1856b0e1c464b0b50ab2879ebd535623c1a620b1d2530ba5dd594237dc82\", repo tag \"registry.k8s.io/kube-scheduler:v1.34.7\", repo digest \"registry.k8s.io/kube-scheduler@sha256:4ab32f707ff84beaac431797999707757b885196b0b9a52d29cb67f95efce7c1\", size \"17377256\" in 1.152757859s" Apr 17 02:50:58.067751 containerd[1591]: time="2026-04-17T02:50:58.067748006Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.34.7\" returns image reference \"sha256:568f1856b0e1c464b0b50ab2879ebd535623c1a620b1d2530ba5dd594237dc82\"" Apr 17 02:50:58.068395 containerd[1591]: time="2026-04-17T02:50:58.068361598Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.34.7\"" Apr 17 02:50:59.232675 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2523452804.mount: Deactivated successfully. Apr 17 02:50:59.683677 containerd[1591]: time="2026-04-17T02:50:59.683512047Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.34.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 17 02:50:59.685367 containerd[1591]: time="2026-04-17T02:50:59.685169490Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.34.7: active requests=0, bytes read=25972848" Apr 17 02:50:59.687494 containerd[1591]: time="2026-04-17T02:50:59.687319584Z" level=info msg="ImageCreate event name:\"sha256:345c2b8919907fbb425a843da24d86a16708ee53a49ad3fa2e6dc229c7b34643\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 17 02:50:59.692841 containerd[1591]: time="2026-04-17T02:50:59.692518555Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:062519bc0a14769e2f98c6bdff7816a17e6252de3f3c9cb102e6be33fe38d9e2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 17 02:50:59.693787 containerd[1591]: time="2026-04-17T02:50:59.693695908Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.34.7\" with image id \"sha256:345c2b8919907fbb425a843da24d86a16708ee53a49ad3fa2e6dc229c7b34643\", repo tag 
\"registry.k8s.io/kube-proxy:v1.34.7\", repo digest \"registry.k8s.io/kube-proxy@sha256:062519bc0a14769e2f98c6bdff7816a17e6252de3f3c9cb102e6be33fe38d9e2\", size \"25971973\" in 1.625286618s" Apr 17 02:50:59.693787 containerd[1591]: time="2026-04-17T02:50:59.693784015Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.34.7\" returns image reference \"sha256:345c2b8919907fbb425a843da24d86a16708ee53a49ad3fa2e6dc229c7b34643\"" Apr 17 02:50:59.694379 containerd[1591]: time="2026-04-17T02:50:59.694338639Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.1\"" Apr 17 02:51:00.321959 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount529661441.mount: Deactivated successfully. Apr 17 02:51:01.441778 containerd[1591]: time="2026-04-17T02:51:01.441581968Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.12.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 17 02:51:01.442959 containerd[1591]: time="2026-04-17T02:51:01.442885954Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.12.1: active requests=0, bytes read=22387483" Apr 17 02:51:01.445578 containerd[1591]: time="2026-04-17T02:51:01.445502595Z" level=info msg="ImageCreate event name:\"sha256:52546a367cc9e0d924aa3b190596a9167fa6e53245023b5b5baf0f07e5443969\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 17 02:51:01.449304 containerd[1591]: time="2026-04-17T02:51:01.449243125Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:e8c262566636e6bc340ece6473b0eed193cad045384401529721ddbe6463d31c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 17 02:51:01.452087 containerd[1591]: time="2026-04-17T02:51:01.452015407Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.12.1\" with image id \"sha256:52546a367cc9e0d924aa3b190596a9167fa6e53245023b5b5baf0f07e5443969\", repo tag \"registry.k8s.io/coredns/coredns:v1.12.1\", repo digest 
\"registry.k8s.io/coredns/coredns@sha256:e8c262566636e6bc340ece6473b0eed193cad045384401529721ddbe6463d31c\", size \"22384805\" in 1.757642735s" Apr 17 02:51:01.452203 containerd[1591]: time="2026-04-17T02:51:01.452094263Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.1\" returns image reference \"sha256:52546a367cc9e0d924aa3b190596a9167fa6e53245023b5b5baf0f07e5443969\"" Apr 17 02:51:01.453236 containerd[1591]: time="2026-04-17T02:51:01.453084620Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10.1\"" Apr 17 02:51:02.006104 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount768741871.mount: Deactivated successfully. Apr 17 02:51:02.016269 containerd[1591]: time="2026-04-17T02:51:02.016169652Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 17 02:51:02.017547 containerd[1591]: time="2026-04-17T02:51:02.017457677Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10.1: active requests=0, bytes read=321150" Apr 17 02:51:02.018753 containerd[1591]: time="2026-04-17T02:51:02.018668170Z" level=info msg="ImageCreate event name:\"sha256:cd073f4c5f6a8e9dc6f3125ba00cf60819cae95c1ec84a1f146ee4a9cf9e803f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 17 02:51:02.021437 containerd[1591]: time="2026-04-17T02:51:02.021338263Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 17 02:51:02.022215 containerd[1591]: time="2026-04-17T02:51:02.022188493Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10.1\" with image id \"sha256:cd073f4c5f6a8e9dc6f3125ba00cf60819cae95c1ec84a1f146ee4a9cf9e803f\", repo tag \"registry.k8s.io/pause:3.10.1\", repo digest \"registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c\", size \"320448\" in 
568.881728ms" Apr 17 02:51:02.022299 containerd[1591]: time="2026-04-17T02:51:02.022216896Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10.1\" returns image reference \"sha256:cd073f4c5f6a8e9dc6f3125ba00cf60819cae95c1ec84a1f146ee4a9cf9e803f\"" Apr 17 02:51:02.023032 containerd[1591]: time="2026-04-17T02:51:02.022885966Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.6.5-0\"" Apr 17 02:51:02.644124 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2201782606.mount: Deactivated successfully. Apr 17 02:51:05.528955 kernel: clocksource: Long readout interval, skipping watchdog check: cs_nsec: 2657974767 wd_nsec: 2657973811 Apr 17 02:51:07.382510 containerd[1591]: time="2026-04-17T02:51:07.382379970Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.6.5-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 17 02:51:07.384156 containerd[1591]: time="2026-04-17T02:51:07.384057776Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.6.5-0: active requests=0, bytes read=22874255" Apr 17 02:51:07.385577 containerd[1591]: time="2026-04-17T02:51:07.385501502Z" level=info msg="ImageCreate event name:\"sha256:a3e246e9556e93d71e2850085ba581b376c76a9187b4b8a01c120f86579ef2b1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 17 02:51:07.394076 containerd[1591]: time="2026-04-17T02:51:07.393750139Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:042ef9c02799eb9303abf1aa99b09f09d94b8ee3ba0c2dd3f42dc4e1d3dce534\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 17 02:51:07.395471 containerd[1591]: time="2026-04-17T02:51:07.395385284Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.6.5-0\" with image id \"sha256:a3e246e9556e93d71e2850085ba581b376c76a9187b4b8a01c120f86579ef2b1\", repo tag \"registry.k8s.io/etcd:3.6.5-0\", repo digest \"registry.k8s.io/etcd@sha256:042ef9c02799eb9303abf1aa99b09f09d94b8ee3ba0c2dd3f42dc4e1d3dce534\", size \"22871747\" in 5.372236782s" Apr 
17 02:51:07.395471 containerd[1591]: time="2026-04-17T02:51:07.395465578Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.6.5-0\" returns image reference \"sha256:a3e246e9556e93d71e2850085ba581b376c76a9187b4b8a01c120f86579ef2b1\"" Apr 17 02:51:07.706358 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. Apr 17 02:51:07.708486 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Apr 17 02:51:07.978099 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Apr 17 02:51:08.074968 (kubelet)[2317]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Apr 17 02:51:08.133318 kubelet[2317]: E0417 02:51:08.133209 2317 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Apr 17 02:51:08.137054 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Apr 17 02:51:08.137268 systemd[1]: kubelet.service: Failed with result 'exit-code'. Apr 17 02:51:08.137638 systemd[1]: kubelet.service: Consumed 283ms CPU time, 110.6M memory peak. Apr 17 02:51:12.187223 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Apr 17 02:51:12.187480 systemd[1]: kubelet.service: Consumed 283ms CPU time, 110.6M memory peak. Apr 17 02:51:12.191220 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Apr 17 02:51:12.238790 systemd[1]: Reload requested from client PID 2332 ('systemctl') (unit session-9.scope)... Apr 17 02:51:12.238819 systemd[1]: Reloading... Apr 17 02:51:12.387788 zram_generator::config[2372]: No configuration found. Apr 17 02:51:12.760395 systemd[1]: Reloading finished in 521 ms. 
Apr 17 02:51:12.853275 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Apr 17 02:51:12.857545 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Apr 17 02:51:12.859248 systemd[1]: kubelet.service: Deactivated successfully. Apr 17 02:51:12.860065 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Apr 17 02:51:12.860133 systemd[1]: kubelet.service: Consumed 149ms CPU time, 98.4M memory peak. Apr 17 02:51:12.863259 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Apr 17 02:51:13.127088 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Apr 17 02:51:13.138204 (kubelet)[2425]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Apr 17 02:51:13.202791 kubelet[2425]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Apr 17 02:51:13.202791 kubelet[2425]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Apr 17 02:51:13.203333 kubelet[2425]: I0417 02:51:13.202974 2425 server.go:213] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Apr 17 02:51:14.465020 kubelet[2425]: I0417 02:51:14.464905 2425 server.go:529] "Kubelet version" kubeletVersion="v1.34.4" Apr 17 02:51:14.465020 kubelet[2425]: I0417 02:51:14.464988 2425 server.go:531] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 17 02:51:14.465020 kubelet[2425]: I0417 02:51:14.465040 2425 watchdog_linux.go:95] "Systemd watchdog is not enabled" Apr 17 02:51:14.466321 kubelet[2425]: I0417 02:51:14.465046 2425 watchdog_linux.go:137] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 17 02:51:14.466321 kubelet[2425]: I0417 02:51:14.465398 2425 server.go:956] "Client rotation is on, will bootstrap in background" Apr 17 02:51:14.536114 kubelet[2425]: E0417 02:51:14.536007 2425 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://10.0.0.21:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.0.0.21:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError" Apr 17 02:51:14.536335 kubelet[2425]: I0417 02:51:14.536306 2425 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Apr 17 02:51:14.545269 kubelet[2425]: I0417 02:51:14.545109 2425 server.go:1423] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 17 02:51:14.553705 kubelet[2425]: I0417 02:51:14.553657 2425 server.go:781] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
Defaulting to /" Apr 17 02:51:14.556284 kubelet[2425]: I0417 02:51:14.555292 2425 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 17 02:51:14.556284 kubelet[2425]: I0417 02:51:14.555385 2425 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 17 02:51:14.556284 kubelet[2425]: I0417 02:51:14.556187 2425 topology_manager.go:138] "Creating topology manager with none policy" Apr 17 02:51:14.556284 
kubelet[2425]: I0417 02:51:14.556200 2425 container_manager_linux.go:306] "Creating device plugin manager" Apr 17 02:51:14.556690 kubelet[2425]: I0417 02:51:14.556377 2425 container_manager_linux.go:315] "Creating Dynamic Resource Allocation (DRA) manager" Apr 17 02:51:14.560191 kubelet[2425]: I0417 02:51:14.560119 2425 state_mem.go:36] "Initialized new in-memory state store" Apr 17 02:51:14.560771 kubelet[2425]: I0417 02:51:14.560680 2425 kubelet.go:475] "Attempting to sync node with API server" Apr 17 02:51:14.560859 kubelet[2425]: I0417 02:51:14.560789 2425 kubelet.go:376] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 17 02:51:14.560859 kubelet[2425]: I0417 02:51:14.560819 2425 kubelet.go:387] "Adding apiserver pod source" Apr 17 02:51:14.560859 kubelet[2425]: I0417 02:51:14.560840 2425 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 17 02:51:14.562013 kubelet[2425]: E0417 02:51:14.561945 2425 reflector.go:205] "Failed to watch" err="failed to list *v1.Service: Get \"https://10.0.0.21:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.0.21:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 17 02:51:14.562155 kubelet[2425]: E0417 02:51:14.562064 2425 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.0.0.21:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 10.0.0.21:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 17 02:51:14.563903 kubelet[2425]: I0417 02:51:14.563844 2425 kuberuntime_manager.go:291] "Container runtime initialized" containerRuntime="containerd" version="v2.0.7" apiVersion="v1" Apr 17 02:51:14.564899 kubelet[2425]: I0417 02:51:14.564607 2425 kubelet.go:940] "Not starting ClusterTrustBundle informer because 
we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 17 02:51:14.565070 kubelet[2425]: I0417 02:51:14.565032 2425 kubelet.go:964] "Not starting PodCertificateRequest manager because we are in static kubelet mode or the PodCertificateProjection feature gate is disabled" Apr 17 02:51:14.565202 kubelet[2425]: W0417 02:51:14.565161 2425 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Apr 17 02:51:14.570307 kubelet[2425]: I0417 02:51:14.570262 2425 server.go:1262] "Started kubelet" Apr 17 02:51:14.571199 kubelet[2425]: I0417 02:51:14.571022 2425 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 17 02:51:14.571491 kubelet[2425]: I0417 02:51:14.571372 2425 ratelimit.go:56] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 17 02:51:14.571622 kubelet[2425]: I0417 02:51:14.571560 2425 server_v1.go:49] "podresources" method="list" useActivePods=true Apr 17 02:51:14.572681 kubelet[2425]: I0417 02:51:14.572656 2425 server.go:310] "Adding debug handlers to kubelet server" Apr 17 02:51:14.573487 kubelet[2425]: I0417 02:51:14.572781 2425 server.go:249] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 17 02:51:14.573487 kubelet[2425]: I0417 02:51:14.572906 2425 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 17 02:51:14.575337 kubelet[2425]: E0417 02:51:14.574362 2425 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.0.0.21:6443/api/v1/namespaces/default/events\": dial tcp 10.0.0.21:6443: connect: connection refused" event="&Event{ObjectMeta:{localhost.18a7052e54cb80ff default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:localhost,UID:localhost,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting 
kubelet.,Source:EventSource{Component:kubelet,Host:localhost,},FirstTimestamp:2026-04-17 02:51:14.570207487 +0000 UTC m=+1.427371315,LastTimestamp:2026-04-17 02:51:14.570207487 +0000 UTC m=+1.427371315,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:localhost,}" Apr 17 02:51:14.575561 kubelet[2425]: I0417 02:51:14.575525 2425 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Apr 17 02:51:14.576702 kubelet[2425]: E0417 02:51:14.576677 2425 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"localhost\" not found" Apr 17 02:51:14.576871 kubelet[2425]: I0417 02:51:14.576853 2425 volume_manager.go:313] "Starting Kubelet Volume Manager" Apr 17 02:51:14.578488 kubelet[2425]: I0417 02:51:14.578438 2425 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Apr 17 02:51:14.578865 kubelet[2425]: I0417 02:51:14.578665 2425 reconciler.go:29] "Reconciler: start to sync state" Apr 17 02:51:14.578955 kubelet[2425]: I0417 02:51:14.578911 2425 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Apr 17 02:51:14.580476 kubelet[2425]: E0417 02:51:14.580373 2425 kubelet.go:1615] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Apr 17 02:51:14.581457 kubelet[2425]: E0417 02:51:14.581393 2425 reflector.go:205] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://10.0.0.21:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.0.21:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 17 02:51:14.582415 kubelet[2425]: E0417 02:51:14.581933 2425 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.21:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.21:6443: connect: connection refused" interval="200ms" Apr 17 02:51:14.582415 kubelet[2425]: I0417 02:51:14.582081 2425 factory.go:223] Registration of the containerd container factory successfully Apr 17 02:51:14.582415 kubelet[2425]: I0417 02:51:14.582090 2425 factory.go:223] Registration of the systemd container factory successfully Apr 17 02:51:14.603169 kubelet[2425]: I0417 02:51:14.603050 2425 cpu_manager.go:221] "Starting CPU manager" policy="none" Apr 17 02:51:14.603348 kubelet[2425]: I0417 02:51:14.603247 2425 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Apr 17 02:51:14.603348 kubelet[2425]: I0417 02:51:14.603311 2425 state_mem.go:36] "Initialized new in-memory state store" Apr 17 02:51:14.606682 kubelet[2425]: I0417 02:51:14.606473 2425 policy_none.go:49] "None policy: Start" Apr 17 02:51:14.606844 kubelet[2425]: I0417 02:51:14.606734 2425 memory_manager.go:187] "Starting memorymanager" policy="None" Apr 17 02:51:14.606844 kubelet[2425]: I0417 02:51:14.606760 2425 state_mem.go:36] "Initializing new in-memory state store" logger="Memory Manager state checkpoint" Apr 17 02:51:14.608689 kubelet[2425]: I0417 02:51:14.608670 2425 policy_none.go:47] "Start" Apr 17 02:51:14.609925 kubelet[2425]: I0417 02:51:14.609874 2425 
kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv4" Apr 17 02:51:14.612140 kubelet[2425]: I0417 02:51:14.611922 2425 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv6" Apr 17 02:51:14.612624 kubelet[2425]: I0417 02:51:14.612586 2425 status_manager.go:244] "Starting to sync pod status with apiserver" Apr 17 02:51:14.612647 kubelet[2425]: I0417 02:51:14.612635 2425 kubelet.go:2428] "Starting kubelet main sync loop" Apr 17 02:51:14.612854 kubelet[2425]: E0417 02:51:14.612676 2425 kubelet.go:2452] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Apr 17 02:51:14.615009 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Apr 17 02:51:14.616626 kubelet[2425]: E0417 02:51:14.615064 2425 reflector.go:205] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://10.0.0.21:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.0.21:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Apr 17 02:51:14.636086 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Apr 17 02:51:14.640096 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. 
Apr 17 02:51:14.654775 kubelet[2425]: E0417 02:51:14.654675 2425 manager.go:513] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 17 02:51:14.654997 kubelet[2425]: I0417 02:51:14.654931 2425 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 17 02:51:14.654997 kubelet[2425]: I0417 02:51:14.654942 2425 container_log_manager.go:146] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 17 02:51:14.655573 kubelet[2425]: I0417 02:51:14.655491 2425 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 17 02:51:14.657995 kubelet[2425]: E0417 02:51:14.657927 2425 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Apr 17 02:51:14.657995 kubelet[2425]: E0417 02:51:14.657977 2425 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"localhost\" not found" Apr 17 02:51:14.759902 kubelet[2425]: I0417 02:51:14.759645 2425 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Apr 17 02:51:14.760469 kubelet[2425]: E0417 02:51:14.760431 2425 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.0.21:6443/api/v1/nodes\": dial tcp 10.0.0.21:6443: connect: connection refused" node="localhost" Apr 17 02:51:14.770618 systemd[1]: Created slice kubepods-burstable-pod824fd89300514e351ed3b68d82c665c6.slice - libcontainer container kubepods-burstable-pod824fd89300514e351ed3b68d82c665c6.slice. 
Apr 17 02:51:14.782407 kubelet[2425]: E0417 02:51:14.782361 2425 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Apr 17 02:51:14.783275 kubelet[2425]: E0417 02:51:14.783135 2425 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.21:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.21:6443: connect: connection refused" interval="400ms" Apr 17 02:51:14.789551 systemd[1]: Created slice kubepods-burstable-podfb1f1c4156fbda8c6ec689191b535672.slice - libcontainer container kubepods-burstable-podfb1f1c4156fbda8c6ec689191b535672.slice. Apr 17 02:51:14.792817 kubelet[2425]: E0417 02:51:14.792791 2425 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Apr 17 02:51:14.795603 systemd[1]: Created slice kubepods-burstable-podc6bb8708a026256e82ca4c5631a78b5a.slice - libcontainer container kubepods-burstable-podc6bb8708a026256e82ca4c5631a78b5a.slice. 
Apr 17 02:51:14.798165 kubelet[2425]: E0417 02:51:14.797892 2425 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Apr 17 02:51:14.881144 kubelet[2425]: I0417 02:51:14.880628 2425 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/c6bb8708a026256e82ca4c5631a78b5a-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"c6bb8708a026256e82ca4c5631a78b5a\") " pod="kube-system/kube-controller-manager-localhost" Apr 17 02:51:14.881386 kubelet[2425]: I0417 02:51:14.881122 2425 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/c6bb8708a026256e82ca4c5631a78b5a-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"c6bb8708a026256e82ca4c5631a78b5a\") " pod="kube-system/kube-controller-manager-localhost" Apr 17 02:51:14.881386 kubelet[2425]: I0417 02:51:14.881243 2425 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/c6bb8708a026256e82ca4c5631a78b5a-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"c6bb8708a026256e82ca4c5631a78b5a\") " pod="kube-system/kube-controller-manager-localhost" Apr 17 02:51:14.881386 kubelet[2425]: I0417 02:51:14.881274 2425 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/824fd89300514e351ed3b68d82c665c6-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"824fd89300514e351ed3b68d82c665c6\") " pod="kube-system/kube-scheduler-localhost" Apr 17 02:51:14.881386 kubelet[2425]: I0417 02:51:14.881293 2425 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" 
(UniqueName: \"kubernetes.io/host-path/fb1f1c4156fbda8c6ec689191b535672-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"fb1f1c4156fbda8c6ec689191b535672\") " pod="kube-system/kube-apiserver-localhost" Apr 17 02:51:14.881386 kubelet[2425]: I0417 02:51:14.881310 2425 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/fb1f1c4156fbda8c6ec689191b535672-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"fb1f1c4156fbda8c6ec689191b535672\") " pod="kube-system/kube-apiserver-localhost" Apr 17 02:51:14.882030 kubelet[2425]: I0417 02:51:14.881328 2425 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/fb1f1c4156fbda8c6ec689191b535672-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"fb1f1c4156fbda8c6ec689191b535672\") " pod="kube-system/kube-apiserver-localhost" Apr 17 02:51:14.882030 kubelet[2425]: I0417 02:51:14.881353 2425 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/c6bb8708a026256e82ca4c5631a78b5a-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"c6bb8708a026256e82ca4c5631a78b5a\") " pod="kube-system/kube-controller-manager-localhost" Apr 17 02:51:14.882030 kubelet[2425]: I0417 02:51:14.881371 2425 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/c6bb8708a026256e82ca4c5631a78b5a-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"c6bb8708a026256e82ca4c5631a78b5a\") " pod="kube-system/kube-controller-manager-localhost" Apr 17 02:51:14.964308 kubelet[2425]: I0417 02:51:14.964273 2425 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Apr 17 02:51:14.964731 kubelet[2425]: E0417 
02:51:14.964649 2425 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.0.21:6443/api/v1/nodes\": dial tcp 10.0.0.21:6443: connect: connection refused" node="localhost" Apr 17 02:51:15.088313 kubelet[2425]: E0417 02:51:15.087784 2425 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Apr 17 02:51:15.091601 containerd[1591]: time="2026-04-17T02:51:15.091273619Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:824fd89300514e351ed3b68d82c665c6,Namespace:kube-system,Attempt:0,}" Apr 17 02:51:15.097986 kubelet[2425]: E0417 02:51:15.097897 2425 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Apr 17 02:51:15.099520 containerd[1591]: time="2026-04-17T02:51:15.099461361Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:fb1f1c4156fbda8c6ec689191b535672,Namespace:kube-system,Attempt:0,}" Apr 17 02:51:15.102986 kubelet[2425]: E0417 02:51:15.102224 2425 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Apr 17 02:51:15.104413 containerd[1591]: time="2026-04-17T02:51:15.104174531Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:c6bb8708a026256e82ca4c5631a78b5a,Namespace:kube-system,Attempt:0,}" Apr 17 02:51:15.185566 kubelet[2425]: E0417 02:51:15.185417 2425 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.21:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.21:6443: connect: connection refused" interval="800ms" Apr 17 02:51:15.368258 kubelet[2425]: 
I0417 02:51:15.367878 2425 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Apr 17 02:51:15.369390 kubelet[2425]: E0417 02:51:15.369229 2425 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.0.21:6443/api/v1/nodes\": dial tcp 10.0.0.21:6443: connect: connection refused" node="localhost" Apr 17 02:51:15.607923 kubelet[2425]: E0417 02:51:15.607505 2425 reflector.go:205] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://10.0.0.21:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.0.21:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Apr 17 02:51:15.640274 kubelet[2425]: E0417 02:51:15.639983 2425 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.0.0.21:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 10.0.0.21:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 17 02:51:15.659636 kubelet[2425]: E0417 02:51:15.659341 2425 reflector.go:205] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://10.0.0.21:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.0.21:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 17 02:51:15.807536 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2670343586.mount: Deactivated successfully. 
Apr 17 02:51:15.825054 containerd[1591]: time="2026-04-17T02:51:15.824978143Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Apr 17 02:51:15.830146 containerd[1591]: time="2026-04-17T02:51:15.829829834Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321070" Apr 17 02:51:15.831848 containerd[1591]: time="2026-04-17T02:51:15.831781238Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Apr 17 02:51:15.834300 containerd[1591]: time="2026-04-17T02:51:15.834233103Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Apr 17 02:51:15.835487 containerd[1591]: time="2026-04-17T02:51:15.835352291Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0" Apr 17 02:51:15.839544 containerd[1591]: time="2026-04-17T02:51:15.839489820Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Apr 17 02:51:15.840494 containerd[1591]: time="2026-04-17T02:51:15.840427098Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 746.214738ms" Apr 17 02:51:15.841433 containerd[1591]: 
time="2026-04-17T02:51:15.841368427Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Apr 17 02:51:15.841885 containerd[1591]: time="2026-04-17T02:51:15.841828979Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0" Apr 17 02:51:15.846900 containerd[1591]: time="2026-04-17T02:51:15.846850840Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 744.420952ms" Apr 17 02:51:15.848769 containerd[1591]: time="2026-04-17T02:51:15.848029148Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 739.946419ms" Apr 17 02:51:15.872164 containerd[1591]: time="2026-04-17T02:51:15.872082288Z" level=info msg="connecting to shim 34db2668ce07c2adde60359433cc9d1636cd629e27a322f34a52e98210ef9454" address="unix:///run/containerd/s/4a4fd9f537c4caa376092485e279766ee63f21825d4cad29fa2f8f0657ccbe02" namespace=k8s.io protocol=ttrpc version=3 Apr 17 02:51:15.974676 containerd[1591]: time="2026-04-17T02:51:15.974498763Z" level=info msg="connecting to shim e13303455056b094dffcae933e74ebc857d5fddda90b6c6410d8594467b93bf2" address="unix:///run/containerd/s/3c1792a23710233f8c74417fcc514dbb08d5ffbceed5be5d823f43e7e632ede9" namespace=k8s.io protocol=ttrpc version=3 Apr 17 02:51:16.827046 containerd[1591]: time="2026-04-17T02:51:16.825162471Z" level=info msg="connecting to shim 
0dda3cd6f119ebba1ff1d2be0acac79d200a87607faabc2f2409103ba508cb72" address="unix:///run/containerd/s/af5e522cc5b66b34a1aa46b705fe0d8b688618370583f2e6621fd953f02b1046" namespace=k8s.io protocol=ttrpc version=3 Apr 17 02:51:16.829330 kubelet[2425]: E0417 02:51:16.827968 2425 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://10.0.0.21:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.0.0.21:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError" Apr 17 02:51:16.829330 kubelet[2425]: E0417 02:51:16.828050 2425 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.21:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.21:6443: connect: connection refused" interval="1.6s" Apr 17 02:51:16.829330 kubelet[2425]: E0417 02:51:16.829043 2425 reflector.go:205] "Failed to watch" err="failed to list *v1.Service: Get \"https://10.0.0.21:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.0.21:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 17 02:51:16.829330 kubelet[2425]: E0417 02:51:16.829235 2425 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.0.0.21:6443/api/v1/namespaces/default/events\": dial tcp 10.0.0.21:6443: connect: connection refused" event="&Event{ObjectMeta:{localhost.18a7052e54cb80ff default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:localhost,UID:localhost,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:localhost,},FirstTimestamp:2026-04-17 02:51:14.570207487 +0000 UTC 
m=+1.427371315,LastTimestamp:2026-04-17 02:51:14.570207487 +0000 UTC m=+1.427371315,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:localhost,}" Apr 17 02:51:16.831618 kubelet[2425]: I0417 02:51:16.831572 2425 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Apr 17 02:51:16.832003 kubelet[2425]: E0417 02:51:16.831952 2425 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.0.21:6443/api/v1/nodes\": dial tcp 10.0.0.21:6443: connect: connection refused" node="localhost" Apr 17 02:51:16.881756 systemd[1]: Started cri-containerd-0dda3cd6f119ebba1ff1d2be0acac79d200a87607faabc2f2409103ba508cb72.scope - libcontainer container 0dda3cd6f119ebba1ff1d2be0acac79d200a87607faabc2f2409103ba508cb72. Apr 17 02:51:16.903934 systemd[1]: Started cri-containerd-e13303455056b094dffcae933e74ebc857d5fddda90b6c6410d8594467b93bf2.scope - libcontainer container e13303455056b094dffcae933e74ebc857d5fddda90b6c6410d8594467b93bf2. Apr 17 02:51:16.909478 systemd[1]: Started cri-containerd-34db2668ce07c2adde60359433cc9d1636cd629e27a322f34a52e98210ef9454.scope - libcontainer container 34db2668ce07c2adde60359433cc9d1636cd629e27a322f34a52e98210ef9454. 
Apr 17 02:51:17.046466 containerd[1591]: time="2026-04-17T02:51:17.044053066Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:fb1f1c4156fbda8c6ec689191b535672,Namespace:kube-system,Attempt:0,} returns sandbox id \"e13303455056b094dffcae933e74ebc857d5fddda90b6c6410d8594467b93bf2\"" Apr 17 02:51:17.047100 kubelet[2425]: E0417 02:51:17.046926 2425 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Apr 17 02:51:17.053651 containerd[1591]: time="2026-04-17T02:51:17.053606084Z" level=info msg="CreateContainer within sandbox \"e13303455056b094dffcae933e74ebc857d5fddda90b6c6410d8594467b93bf2\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Apr 17 02:51:17.054777 containerd[1591]: time="2026-04-17T02:51:17.054599127Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:824fd89300514e351ed3b68d82c665c6,Namespace:kube-system,Attempt:0,} returns sandbox id \"34db2668ce07c2adde60359433cc9d1636cd629e27a322f34a52e98210ef9454\"" Apr 17 02:51:17.055933 kubelet[2425]: E0417 02:51:17.055908 2425 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Apr 17 02:51:17.056178 containerd[1591]: time="2026-04-17T02:51:17.056145329Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:c6bb8708a026256e82ca4c5631a78b5a,Namespace:kube-system,Attempt:0,} returns sandbox id \"0dda3cd6f119ebba1ff1d2be0acac79d200a87607faabc2f2409103ba508cb72\"" Apr 17 02:51:17.057415 kubelet[2425]: E0417 02:51:17.057373 2425 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Apr 17 02:51:17.060950 containerd[1591]: 
time="2026-04-17T02:51:17.060847050Z" level=info msg="CreateContainer within sandbox \"34db2668ce07c2adde60359433cc9d1636cd629e27a322f34a52e98210ef9454\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Apr 17 02:51:17.062509 containerd[1591]: time="2026-04-17T02:51:17.062463938Z" level=info msg="CreateContainer within sandbox \"0dda3cd6f119ebba1ff1d2be0acac79d200a87607faabc2f2409103ba508cb72\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Apr 17 02:51:17.068664 containerd[1591]: time="2026-04-17T02:51:17.068626365Z" level=info msg="Container f5db9cb312778702aea06d6cb98afb8b5ca3480886b541ef5bfbd31af1cc01e4: CDI devices from CRI Config.CDIDevices: []" Apr 17 02:51:17.077302 containerd[1591]: time="2026-04-17T02:51:17.077192096Z" level=info msg="Container 0078b4ad2ea05b6c8341c641459daae476d9c724f52e5f45af1d51b6c1dbc049: CDI devices from CRI Config.CDIDevices: []" Apr 17 02:51:17.084894 containerd[1591]: time="2026-04-17T02:51:17.084689530Z" level=info msg="Container 224b84b170a43575538ea399b6c024e7f6e7a7911065418d1f5e8a0a3bcc9ae7: CDI devices from CRI Config.CDIDevices: []" Apr 17 02:51:17.086560 containerd[1591]: time="2026-04-17T02:51:17.086511201Z" level=info msg="CreateContainer within sandbox \"e13303455056b094dffcae933e74ebc857d5fddda90b6c6410d8594467b93bf2\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"f5db9cb312778702aea06d6cb98afb8b5ca3480886b541ef5bfbd31af1cc01e4\"" Apr 17 02:51:17.087469 containerd[1591]: time="2026-04-17T02:51:17.087410106Z" level=info msg="StartContainer for \"f5db9cb312778702aea06d6cb98afb8b5ca3480886b541ef5bfbd31af1cc01e4\"" Apr 17 02:51:17.089944 containerd[1591]: time="2026-04-17T02:51:17.089845430Z" level=info msg="connecting to shim f5db9cb312778702aea06d6cb98afb8b5ca3480886b541ef5bfbd31af1cc01e4" address="unix:///run/containerd/s/3c1792a23710233f8c74417fcc514dbb08d5ffbceed5be5d823f43e7e632ede9" protocol=ttrpc version=3 Apr 17 02:51:17.093410 
containerd[1591]: time="2026-04-17T02:51:17.092849315Z" level=info msg="CreateContainer within sandbox \"34db2668ce07c2adde60359433cc9d1636cd629e27a322f34a52e98210ef9454\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"0078b4ad2ea05b6c8341c641459daae476d9c724f52e5f45af1d51b6c1dbc049\"" Apr 17 02:51:17.098326 containerd[1591]: time="2026-04-17T02:51:17.098261114Z" level=info msg="StartContainer for \"0078b4ad2ea05b6c8341c641459daae476d9c724f52e5f45af1d51b6c1dbc049\"" Apr 17 02:51:17.107627 containerd[1591]: time="2026-04-17T02:51:17.107060601Z" level=info msg="connecting to shim 0078b4ad2ea05b6c8341c641459daae476d9c724f52e5f45af1d51b6c1dbc049" address="unix:///run/containerd/s/4a4fd9f537c4caa376092485e279766ee63f21825d4cad29fa2f8f0657ccbe02" protocol=ttrpc version=3 Apr 17 02:51:17.112061 containerd[1591]: time="2026-04-17T02:51:17.111896405Z" level=info msg="CreateContainer within sandbox \"0dda3cd6f119ebba1ff1d2be0acac79d200a87607faabc2f2409103ba508cb72\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"224b84b170a43575538ea399b6c024e7f6e7a7911065418d1f5e8a0a3bcc9ae7\"" Apr 17 02:51:17.113875 containerd[1591]: time="2026-04-17T02:51:17.113695223Z" level=info msg="StartContainer for \"224b84b170a43575538ea399b6c024e7f6e7a7911065418d1f5e8a0a3bcc9ae7\"" Apr 17 02:51:17.115544 containerd[1591]: time="2026-04-17T02:51:17.115478192Z" level=info msg="connecting to shim 224b84b170a43575538ea399b6c024e7f6e7a7911065418d1f5e8a0a3bcc9ae7" address="unix:///run/containerd/s/af5e522cc5b66b34a1aa46b705fe0d8b688618370583f2e6621fd953f02b1046" protocol=ttrpc version=3 Apr 17 02:51:17.124926 systemd[1]: Started cri-containerd-f5db9cb312778702aea06d6cb98afb8b5ca3480886b541ef5bfbd31af1cc01e4.scope - libcontainer container f5db9cb312778702aea06d6cb98afb8b5ca3480886b541ef5bfbd31af1cc01e4. 
Apr 17 02:51:17.131505 systemd[1]: Started cri-containerd-0078b4ad2ea05b6c8341c641459daae476d9c724f52e5f45af1d51b6c1dbc049.scope - libcontainer container 0078b4ad2ea05b6c8341c641459daae476d9c724f52e5f45af1d51b6c1dbc049. Apr 17 02:51:17.178474 systemd[1]: Started cri-containerd-224b84b170a43575538ea399b6c024e7f6e7a7911065418d1f5e8a0a3bcc9ae7.scope - libcontainer container 224b84b170a43575538ea399b6c024e7f6e7a7911065418d1f5e8a0a3bcc9ae7. Apr 17 02:51:17.259328 update_engine[1580]: I20260417 02:51:17.258601 1580 update_attempter.cc:509] Updating boot flags... Apr 17 02:51:17.361798 containerd[1591]: time="2026-04-17T02:51:17.359938600Z" level=info msg="StartContainer for \"f5db9cb312778702aea06d6cb98afb8b5ca3480886b541ef5bfbd31af1cc01e4\" returns successfully" Apr 17 02:51:17.432757 containerd[1591]: time="2026-04-17T02:51:17.432609210Z" level=info msg="StartContainer for \"0078b4ad2ea05b6c8341c641459daae476d9c724f52e5f45af1d51b6c1dbc049\" returns successfully" Apr 17 02:51:17.548761 containerd[1591]: time="2026-04-17T02:51:17.542989966Z" level=info msg="StartContainer for \"224b84b170a43575538ea399b6c024e7f6e7a7911065418d1f5e8a0a3bcc9ae7\" returns successfully" Apr 17 02:51:17.851752 kubelet[2425]: E0417 02:51:17.851007 2425 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Apr 17 02:51:17.851752 kubelet[2425]: E0417 02:51:17.851487 2425 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Apr 17 02:51:17.861369 kubelet[2425]: E0417 02:51:17.861289 2425 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Apr 17 02:51:17.863835 kubelet[2425]: E0417 02:51:17.863764 2425 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some 
nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Apr 17 02:51:17.871962 kubelet[2425]: E0417 02:51:17.871911 2425 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Apr 17 02:51:17.872128 kubelet[2425]: E0417 02:51:17.872055 2425 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Apr 17 02:51:18.441304 kubelet[2425]: I0417 02:51:18.441198 2425 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Apr 17 02:51:18.874976 kubelet[2425]: E0417 02:51:18.874947 2425 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Apr 17 02:51:18.875706 kubelet[2425]: E0417 02:51:18.875035 2425 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Apr 17 02:51:18.875706 kubelet[2425]: E0417 02:51:18.875615 2425 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Apr 17 02:51:18.875706 kubelet[2425]: E0417 02:51:18.875673 2425 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Apr 17 02:51:19.886189 kubelet[2425]: E0417 02:51:19.886101 2425 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Apr 17 02:51:19.887958 kubelet[2425]: E0417 02:51:19.886702 2425 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied 
nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Apr 17 02:51:22.251124 kubelet[2425]: E0417 02:51:22.250974 2425 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"localhost\" not found" node="localhost" Apr 17 02:51:22.347634 kubelet[2425]: I0417 02:51:22.347569 2425 kubelet_node_status.go:78] "Successfully registered node" node="localhost" Apr 17 02:51:22.347634 kubelet[2425]: E0417 02:51:22.347622 2425 kubelet_node_status.go:486] "Error updating node status, will retry" err="error getting node \"localhost\": node \"localhost\" not found" Apr 17 02:51:22.382440 kubelet[2425]: I0417 02:51:22.381933 2425 kubelet.go:3220] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost" Apr 17 02:51:22.463362 kubelet[2425]: E0417 02:51:22.463288 2425 kubelet.go:3222] "Failed creating a mirror pod" err="pods \"kube-scheduler-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-localhost" Apr 17 02:51:22.463362 kubelet[2425]: I0417 02:51:22.463351 2425 kubelet.go:3220] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost" Apr 17 02:51:22.474637 kubelet[2425]: E0417 02:51:22.474561 2425 kubelet.go:3222] "Failed creating a mirror pod" err="pods \"kube-apiserver-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-localhost" Apr 17 02:51:22.477599 kubelet[2425]: I0417 02:51:22.477534 2425 kubelet.go:3220] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-localhost" Apr 17 02:51:22.484424 kubelet[2425]: E0417 02:51:22.484129 2425 kubelet.go:3222] "Failed creating a mirror pod" err="pods \"kube-controller-manager-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-localhost" Apr 17 02:51:22.856762 kubelet[2425]: I0417 02:51:22.856122 2425 apiserver.go:52] 
"Watching apiserver" Apr 17 02:51:22.880453 kubelet[2425]: I0417 02:51:22.880142 2425 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Apr 17 02:51:23.166496 kubelet[2425]: I0417 02:51:23.166338 2425 kubelet.go:3220] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost" Apr 17 02:51:23.191362 kubelet[2425]: E0417 02:51:23.191058 2425 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Apr 17 02:51:24.029424 kubelet[2425]: E0417 02:51:24.029382 2425 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Apr 17 02:51:27.675863 kubelet[2425]: I0417 02:51:27.668139 2425 kubelet.go:3220] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-localhost" Apr 17 02:51:27.703628 kubelet[2425]: I0417 02:51:27.703555 2425 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-localhost" podStartSLOduration=4.703529572 podStartE2EDuration="4.703529572s" podCreationTimestamp="2026-04-17 02:51:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 02:51:24.837227483 +0000 UTC m=+11.694391321" watchObservedRunningTime="2026-04-17 02:51:27.703529572 +0000 UTC m=+14.560693410" Apr 17 02:51:27.704687 kubelet[2425]: E0417 02:51:27.704175 2425 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Apr 17 02:51:28.070676 kubelet[2425]: I0417 02:51:28.070162 2425 kubelet.go:3220] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost" Apr 17 02:51:28.070676 kubelet[2425]: E0417 
02:51:28.070262 2425 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Apr 17 02:51:28.091789 kubelet[2425]: E0417 02:51:28.091681 2425 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Apr 17 02:51:28.139526 kubelet[2425]: I0417 02:51:28.139418 2425 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-localhost" podStartSLOduration=1.139383064 podStartE2EDuration="1.139383064s" podCreationTimestamp="2026-04-17 02:51:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 02:51:28.113758516 +0000 UTC m=+14.970922353" watchObservedRunningTime="2026-04-17 02:51:28.139383064 +0000 UTC m=+14.996546912" Apr 17 02:51:28.743179 systemd[1]: Reload requested from client PID 2727 ('systemctl') (unit session-9.scope)... Apr 17 02:51:28.743221 systemd[1]: Reloading... Apr 17 02:51:28.939790 zram_generator::config[2770]: No configuration found. Apr 17 02:51:29.078182 kubelet[2425]: E0417 02:51:29.077698 2425 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Apr 17 02:51:29.397929 systemd[1]: Reloading finished in 654 ms. Apr 17 02:51:29.467661 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Apr 17 02:51:29.489994 systemd[1]: kubelet.service: Deactivated successfully. Apr 17 02:51:29.490275 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Apr 17 02:51:29.490346 systemd[1]: kubelet.service: Consumed 3.650s CPU time, 127.4M memory peak. 
Apr 17 02:51:29.492094 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Apr 17 02:51:29.795121 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Apr 17 02:51:29.838662 (kubelet)[2815]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Apr 17 02:51:29.945768 kubelet[2815]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Apr 17 02:51:29.945768 kubelet[2815]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Apr 17 02:51:29.946138 kubelet[2815]: I0417 02:51:29.945759 2815 server.go:213] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Apr 17 02:51:29.969770 kubelet[2815]: I0417 02:51:29.969457 2815 server.go:529] "Kubelet version" kubeletVersion="v1.34.4" Apr 17 02:51:29.969770 kubelet[2815]: I0417 02:51:29.969552 2815 server.go:531] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 17 02:51:29.969770 kubelet[2815]: I0417 02:51:29.969606 2815 watchdog_linux.go:95] "Systemd watchdog is not enabled" Apr 17 02:51:29.969770 kubelet[2815]: I0417 02:51:29.969616 2815 watchdog_linux.go:137] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 17 02:51:29.970152 kubelet[2815]: I0417 02:51:29.970016 2815 server.go:956] "Client rotation is on, will bootstrap in background" Apr 17 02:51:29.972799 kubelet[2815]: I0417 02:51:29.972334 2815 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-client-current.pem" Apr 17 02:51:29.995330 kubelet[2815]: I0417 02:51:29.994979 2815 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Apr 17 02:51:30.038962 kubelet[2815]: I0417 02:51:30.038907 2815 server.go:1423] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 17 02:51:30.048804 kubelet[2815]: I0417 02:51:30.048619 2815 server.go:781] "--cgroups-per-qos enabled, but --cgroup-root was not specified. Defaulting to /" Apr 17 02:51:30.048941 kubelet[2815]: I0417 02:51:30.048880 2815 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 17 02:51:30.049208 kubelet[2815]: I0417 02:51:30.048909 2815 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 17 02:51:30.049208 kubelet[2815]: I0417 02:51:30.049057 2815 topology_manager.go:138] "Creating topology manager with none policy" Apr 17 02:51:30.049208 kubelet[2815]: I0417 02:51:30.049065 2815 container_manager_linux.go:306] "Creating device plugin manager" Apr 17 02:51:30.049208 kubelet[2815]: I0417 02:51:30.049087 2815 container_manager_linux.go:315] "Creating Dynamic Resource Allocation (DRA) manager" Apr 17 02:51:30.049447 kubelet[2815]: I0417 02:51:30.049303 2815 state_mem.go:36] 
"Initialized new in-memory state store" Apr 17 02:51:30.049600 kubelet[2815]: I0417 02:51:30.049589 2815 kubelet.go:475] "Attempting to sync node with API server" Apr 17 02:51:30.049620 kubelet[2815]: I0417 02:51:30.049603 2815 kubelet.go:376] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 17 02:51:30.049769 kubelet[2815]: I0417 02:51:30.049669 2815 kubelet.go:387] "Adding apiserver pod source" Apr 17 02:51:30.049769 kubelet[2815]: I0417 02:51:30.049682 2815 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 17 02:51:30.052787 kubelet[2815]: I0417 02:51:30.051667 2815 kuberuntime_manager.go:291] "Container runtime initialized" containerRuntime="containerd" version="v2.0.7" apiVersion="v1" Apr 17 02:51:30.052787 kubelet[2815]: I0417 02:51:30.052274 2815 kubelet.go:940] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 17 02:51:30.052787 kubelet[2815]: I0417 02:51:30.052295 2815 kubelet.go:964] "Not starting PodCertificateRequest manager because we are in static kubelet mode or the PodCertificateProjection feature gate is disabled" Apr 17 02:51:30.058572 kubelet[2815]: I0417 02:51:30.057852 2815 server.go:1262] "Started kubelet" Apr 17 02:51:30.059633 kubelet[2815]: I0417 02:51:30.059524 2815 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 17 02:51:30.064984 kubelet[2815]: I0417 02:51:30.064884 2815 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 17 02:51:30.067969 kubelet[2815]: I0417 02:51:30.066866 2815 server.go:310] "Adding debug handlers to kubelet server" Apr 17 02:51:30.073434 kubelet[2815]: I0417 02:51:30.073245 2815 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Apr 17 02:51:30.081325 kubelet[2815]: I0417 02:51:30.080388 2815 ratelimit.go:56] "Setting rate limiting for endpoint" 
service="podresources" qps=100 burstTokens=10 Apr 17 02:51:30.081325 kubelet[2815]: I0417 02:51:30.080619 2815 server_v1.go:49] "podresources" method="list" useActivePods=true Apr 17 02:51:30.081325 kubelet[2815]: I0417 02:51:30.081062 2815 volume_manager.go:313] "Starting Kubelet Volume Manager" Apr 17 02:51:30.081325 kubelet[2815]: E0417 02:51:30.081210 2815 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"localhost\" not found" Apr 17 02:51:30.082744 kubelet[2815]: I0417 02:51:30.081617 2815 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Apr 17 02:51:30.082744 kubelet[2815]: I0417 02:51:30.081704 2815 reconciler.go:29] "Reconciler: start to sync state" Apr 17 02:51:30.092372 kubelet[2815]: I0417 02:51:30.091703 2815 server.go:249] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 17 02:51:30.092889 kubelet[2815]: I0417 02:51:30.092625 2815 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Apr 17 02:51:30.122765 kubelet[2815]: E0417 02:51:30.120813 2815 kubelet.go:1615] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Apr 17 02:51:30.122765 kubelet[2815]: I0417 02:51:30.121332 2815 factory.go:223] Registration of the containerd container factory successfully Apr 17 02:51:30.122765 kubelet[2815]: I0417 02:51:30.121341 2815 factory.go:223] Registration of the systemd container factory successfully Apr 17 02:51:30.141461 kubelet[2815]: I0417 02:51:30.141326 2815 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv4" Apr 17 02:51:30.145770 kubelet[2815]: I0417 02:51:30.145614 2815 kubelet_network_linux.go:54] "Initialized iptables rules." 
protocol="IPv6" Apr 17 02:51:30.145770 kubelet[2815]: I0417 02:51:30.145650 2815 status_manager.go:244] "Starting to sync pod status with apiserver" Apr 17 02:51:30.145770 kubelet[2815]: I0417 02:51:30.145696 2815 kubelet.go:2428] "Starting kubelet main sync loop" Apr 17 02:51:30.146072 kubelet[2815]: E0417 02:51:30.146051 2815 kubelet.go:2452] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Apr 17 02:51:30.247313 kubelet[2815]: E0417 02:51:30.247002 2815 kubelet.go:2452] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Apr 17 02:51:30.467488 kubelet[2815]: E0417 02:51:30.465300 2815 kubelet.go:2452] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Apr 17 02:51:30.485763 kubelet[2815]: I0417 02:51:30.485619 2815 cpu_manager.go:221] "Starting CPU manager" policy="none" Apr 17 02:51:30.485763 kubelet[2815]: I0417 02:51:30.485690 2815 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Apr 17 02:51:30.486150 kubelet[2815]: I0417 02:51:30.485804 2815 state_mem.go:36] "Initialized new in-memory state store" Apr 17 02:51:30.486621 kubelet[2815]: I0417 02:51:30.486453 2815 state_mem.go:88] "Updated default CPUSet" cpuSet="" Apr 17 02:51:30.486621 kubelet[2815]: I0417 02:51:30.486489 2815 state_mem.go:96] "Updated CPUSet assignments" assignments={} Apr 17 02:51:30.486621 kubelet[2815]: I0417 02:51:30.486551 2815 policy_none.go:49] "None policy: Start" Apr 17 02:51:30.486621 kubelet[2815]: I0417 02:51:30.486569 2815 memory_manager.go:187] "Starting memorymanager" policy="None" Apr 17 02:51:30.486621 kubelet[2815]: I0417 02:51:30.486579 2815 state_mem.go:36] "Initializing new in-memory state store" logger="Memory Manager state checkpoint" Apr 17 02:51:30.486811 kubelet[2815]: I0417 02:51:30.486685 2815 state_mem.go:77] "Updated machine memory state" logger="Memory Manager state 
checkpoint" Apr 17 02:51:30.486811 kubelet[2815]: I0417 02:51:30.486693 2815 policy_none.go:47] "Start" Apr 17 02:51:30.531083 kubelet[2815]: E0417 02:51:30.530987 2815 manager.go:513] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 17 02:51:30.533330 kubelet[2815]: I0417 02:51:30.533279 2815 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 17 02:51:30.535229 kubelet[2815]: I0417 02:51:30.533343 2815 container_log_manager.go:146] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 17 02:51:30.536160 kubelet[2815]: I0417 02:51:30.535821 2815 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 17 02:51:30.548756 kubelet[2815]: E0417 02:51:30.547588 2815 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Apr 17 02:51:30.667499 kubelet[2815]: I0417 02:51:30.667471 2815 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Apr 17 02:51:30.716053 kubelet[2815]: I0417 02:51:30.716003 2815 kubelet_node_status.go:124] "Node was previously registered" node="localhost" Apr 17 02:51:30.716487 kubelet[2815]: I0417 02:51:30.716476 2815 kubelet_node_status.go:78] "Successfully registered node" node="localhost" Apr 17 02:51:30.881755 kubelet[2815]: I0417 02:51:30.881642 2815 kubelet.go:3220] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost" Apr 17 02:51:30.881974 kubelet[2815]: I0417 02:51:30.881876 2815 kubelet.go:3220] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost" Apr 17 02:51:30.882171 kubelet[2815]: I0417 02:51:30.882160 2815 kubelet.go:3220] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-localhost" Apr 17 02:51:30.900511 kubelet[2815]: I0417 02:51:30.900392 2815 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/c6bb8708a026256e82ca4c5631a78b5a-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"c6bb8708a026256e82ca4c5631a78b5a\") " pod="kube-system/kube-controller-manager-localhost" Apr 17 02:51:30.900511 kubelet[2815]: I0417 02:51:30.900501 2815 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/c6bb8708a026256e82ca4c5631a78b5a-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"c6bb8708a026256e82ca4c5631a78b5a\") " pod="kube-system/kube-controller-manager-localhost" Apr 17 02:51:30.900511 kubelet[2815]: I0417 02:51:30.900536 2815 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/824fd89300514e351ed3b68d82c665c6-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"824fd89300514e351ed3b68d82c665c6\") " pod="kube-system/kube-scheduler-localhost" Apr 17 02:51:30.900872 kubelet[2815]: I0417 02:51:30.900554 2815 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/fb1f1c4156fbda8c6ec689191b535672-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"fb1f1c4156fbda8c6ec689191b535672\") " pod="kube-system/kube-apiserver-localhost" Apr 17 02:51:30.900872 kubelet[2815]: I0417 02:51:30.900574 2815 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/fb1f1c4156fbda8c6ec689191b535672-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"fb1f1c4156fbda8c6ec689191b535672\") " pod="kube-system/kube-apiserver-localhost" Apr 17 02:51:30.900872 kubelet[2815]: I0417 02:51:30.900605 2815 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/fb1f1c4156fbda8c6ec689191b535672-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"fb1f1c4156fbda8c6ec689191b535672\") " pod="kube-system/kube-apiserver-localhost" Apr 17 02:51:30.900872 kubelet[2815]: I0417 02:51:30.900658 2815 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/c6bb8708a026256e82ca4c5631a78b5a-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"c6bb8708a026256e82ca4c5631a78b5a\") " pod="kube-system/kube-controller-manager-localhost" Apr 17 02:51:30.900872 kubelet[2815]: I0417 02:51:30.900677 2815 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/c6bb8708a026256e82ca4c5631a78b5a-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"c6bb8708a026256e82ca4c5631a78b5a\") " pod="kube-system/kube-controller-manager-localhost" Apr 17 02:51:30.901000 kubelet[2815]: I0417 02:51:30.900696 2815 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/c6bb8708a026256e82ca4c5631a78b5a-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"c6bb8708a026256e82ca4c5631a78b5a\") " pod="kube-system/kube-controller-manager-localhost" Apr 17 02:51:30.921274 kubelet[2815]: E0417 02:51:30.921227 2815 kubelet.go:3222] "Failed creating a mirror pod" err="pods \"kube-scheduler-localhost\" already exists" pod="kube-system/kube-scheduler-localhost" Apr 17 02:51:30.940809 kubelet[2815]: E0417 02:51:30.939979 2815 kubelet.go:3222] "Failed creating a mirror pod" err="pods \"kube-controller-manager-localhost\" already exists" pod="kube-system/kube-controller-manager-localhost" Apr 17 02:51:30.942816 kubelet[2815]: E0417 
02:51:30.942288 2815 kubelet.go:3222] "Failed creating a mirror pod" err="pods \"kube-apiserver-localhost\" already exists" pod="kube-system/kube-apiserver-localhost" Apr 17 02:51:31.068772 kubelet[2815]: I0417 02:51:31.068655 2815 apiserver.go:52] "Watching apiserver" Apr 17 02:51:31.082796 kubelet[2815]: I0417 02:51:31.082661 2815 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Apr 17 02:51:31.268914 kubelet[2815]: E0417 02:51:31.268152 2815 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Apr 17 02:51:31.273117 kubelet[2815]: E0417 02:51:31.273002 2815 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Apr 17 02:51:31.274021 kubelet[2815]: E0417 02:51:31.273957 2815 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Apr 17 02:51:31.327318 kubelet[2815]: E0417 02:51:31.327285 2815 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Apr 17 02:51:31.329212 kubelet[2815]: E0417 02:51:31.329011 2815 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Apr 17 02:51:31.334269 kubelet[2815]: E0417 02:51:31.332578 2815 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Apr 17 02:51:32.338784 kubelet[2815]: E0417 02:51:32.337310 2815 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some 
nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Apr 17 02:51:32.338784 kubelet[2815]: E0417 02:51:32.337357 2815 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Apr 17 02:51:33.350808 kubelet[2815]: E0417 02:51:33.350633 2815 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Apr 17 02:51:33.355245 kubelet[2815]: E0417 02:51:33.355126 2815 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Apr 17 02:51:34.175127 kubelet[2815]: I0417 02:51:34.175098 2815 kuberuntime_manager.go:1828] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Apr 17 02:51:34.178619 containerd[1591]: time="2026-04-17T02:51:34.178500122Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Apr 17 02:51:34.181666 kubelet[2815]: I0417 02:51:34.181640 2815 kubelet_network.go:47] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Apr 17 02:51:34.369139 kubelet[2815]: E0417 02:51:34.363781 2815 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Apr 17 02:51:34.466819 systemd[1]: Created slice kubepods-besteffort-pod69a524db_b538_4c22_964f_7f978931fbb8.slice - libcontainer container kubepods-besteffort-pod69a524db_b538_4c22_964f_7f978931fbb8.slice. 
Apr 17 02:51:34.499055 kubelet[2815]: I0417 02:51:34.499003 2815 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rz9bl\" (UniqueName: \"kubernetes.io/projected/69a524db-b538-4c22-964f-7f978931fbb8-kube-api-access-rz9bl\") pod \"kube-proxy-qm55r\" (UID: \"69a524db-b538-4c22-964f-7f978931fbb8\") " pod="kube-system/kube-proxy-qm55r" Apr 17 02:51:34.502775 kubelet[2815]: I0417 02:51:34.501640 2815 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/69a524db-b538-4c22-964f-7f978931fbb8-kube-proxy\") pod \"kube-proxy-qm55r\" (UID: \"69a524db-b538-4c22-964f-7f978931fbb8\") " pod="kube-system/kube-proxy-qm55r" Apr 17 02:51:34.502775 kubelet[2815]: I0417 02:51:34.502130 2815 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/69a524db-b538-4c22-964f-7f978931fbb8-xtables-lock\") pod \"kube-proxy-qm55r\" (UID: \"69a524db-b538-4c22-964f-7f978931fbb8\") " pod="kube-system/kube-proxy-qm55r" Apr 17 02:51:34.502775 kubelet[2815]: I0417 02:51:34.502149 2815 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/69a524db-b538-4c22-964f-7f978931fbb8-lib-modules\") pod \"kube-proxy-qm55r\" (UID: \"69a524db-b538-4c22-964f-7f978931fbb8\") " pod="kube-system/kube-proxy-qm55r" Apr 17 02:51:34.771019 kubelet[2815]: E0417 02:51:34.768429 2815 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Apr 17 02:51:35.097099 kubelet[2815]: E0417 02:51:35.095579 2815 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 
8.8.8.8" Apr 17 02:51:35.099002 containerd[1591]: time="2026-04-17T02:51:35.098973614Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-qm55r,Uid:69a524db-b538-4c22-964f-7f978931fbb8,Namespace:kube-system,Attempt:0,}" Apr 17 02:51:35.295763 containerd[1591]: time="2026-04-17T02:51:35.295618751Z" level=info msg="connecting to shim aeea2a0df949f00ae4bade436ddf0addb4cbbcdd0ba49e8ca74926306125cdd7" address="unix:///run/containerd/s/09f9a30f60d5ce91acd5c0da7407c674c1c03a4ea913d20eea834543beeea5e4" namespace=k8s.io protocol=ttrpc version=3 Apr 17 02:51:35.372031 systemd[1]: Created slice kubepods-besteffort-pod9cc323f4_390a_445b_8689_5d583ecb3f58.slice - libcontainer container kubepods-besteffort-pod9cc323f4_390a_445b_8689_5d583ecb3f58.slice. Apr 17 02:51:35.382961 kubelet[2815]: I0417 02:51:35.382010 2815 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/9cc323f4-390a-445b-8689-5d583ecb3f58-var-lib-calico\") pod \"tigera-operator-5588576f44-mzlpz\" (UID: \"9cc323f4-390a-445b-8689-5d583ecb3f58\") " pod="tigera-operator/tigera-operator-5588576f44-mzlpz" Apr 17 02:51:35.382961 kubelet[2815]: I0417 02:51:35.382040 2815 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fh5zt\" (UniqueName: \"kubernetes.io/projected/9cc323f4-390a-445b-8689-5d583ecb3f58-kube-api-access-fh5zt\") pod \"tigera-operator-5588576f44-mzlpz\" (UID: \"9cc323f4-390a-445b-8689-5d583ecb3f58\") " pod="tigera-operator/tigera-operator-5588576f44-mzlpz" Apr 17 02:51:35.393576 kubelet[2815]: E0417 02:51:35.392594 2815 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Apr 17 02:51:35.482286 systemd[1]: Started cri-containerd-aeea2a0df949f00ae4bade436ddf0addb4cbbcdd0ba49e8ca74926306125cdd7.scope - 
libcontainer container aeea2a0df949f00ae4bade436ddf0addb4cbbcdd0ba49e8ca74926306125cdd7. Apr 17 02:51:35.689701 containerd[1591]: time="2026-04-17T02:51:35.689557789Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-qm55r,Uid:69a524db-b538-4c22-964f-7f978931fbb8,Namespace:kube-system,Attempt:0,} returns sandbox id \"aeea2a0df949f00ae4bade436ddf0addb4cbbcdd0ba49e8ca74926306125cdd7\"" Apr 17 02:51:35.693432 kubelet[2815]: E0417 02:51:35.693373 2815 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Apr 17 02:51:35.701445 containerd[1591]: time="2026-04-17T02:51:35.701388410Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-5588576f44-mzlpz,Uid:9cc323f4-390a-445b-8689-5d583ecb3f58,Namespace:tigera-operator,Attempt:0,}" Apr 17 02:51:35.709784 containerd[1591]: time="2026-04-17T02:51:35.709657081Z" level=info msg="CreateContainer within sandbox \"aeea2a0df949f00ae4bade436ddf0addb4cbbcdd0ba49e8ca74926306125cdd7\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Apr 17 02:51:35.762394 containerd[1591]: time="2026-04-17T02:51:35.762294550Z" level=info msg="Container 14e373a6470e868db7c912aea094d0912af5c2d59009a662c1f4632012bef17c: CDI devices from CRI Config.CDIDevices: []" Apr 17 02:51:35.778565 containerd[1591]: time="2026-04-17T02:51:35.777686107Z" level=info msg="connecting to shim d26af431b516b6454db73520e2f5db7b95217135bbc5f81553184ea0392e7514" address="unix:///run/containerd/s/81e8043efa8889160cb56e8e56437fb88b677cc1b2b1be1b19899848bcc9cc5c" namespace=k8s.io protocol=ttrpc version=3 Apr 17 02:51:35.845818 containerd[1591]: time="2026-04-17T02:51:35.845748307Z" level=info msg="CreateContainer within sandbox \"aeea2a0df949f00ae4bade436ddf0addb4cbbcdd0ba49e8ca74926306125cdd7\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id 
\"14e373a6470e868db7c912aea094d0912af5c2d59009a662c1f4632012bef17c\"" Apr 17 02:51:35.853828 containerd[1591]: time="2026-04-17T02:51:35.851326284Z" level=info msg="StartContainer for \"14e373a6470e868db7c912aea094d0912af5c2d59009a662c1f4632012bef17c\"" Apr 17 02:51:35.857882 containerd[1591]: time="2026-04-17T02:51:35.857796151Z" level=info msg="connecting to shim 14e373a6470e868db7c912aea094d0912af5c2d59009a662c1f4632012bef17c" address="unix:///run/containerd/s/09f9a30f60d5ce91acd5c0da7407c674c1c03a4ea913d20eea834543beeea5e4" protocol=ttrpc version=3 Apr 17 02:51:35.916400 systemd[1]: Started cri-containerd-d26af431b516b6454db73520e2f5db7b95217135bbc5f81553184ea0392e7514.scope - libcontainer container d26af431b516b6454db73520e2f5db7b95217135bbc5f81553184ea0392e7514. Apr 17 02:51:35.929299 systemd[1]: Started cri-containerd-14e373a6470e868db7c912aea094d0912af5c2d59009a662c1f4632012bef17c.scope - libcontainer container 14e373a6470e868db7c912aea094d0912af5c2d59009a662c1f4632012bef17c. Apr 17 02:51:36.258340 containerd[1591]: time="2026-04-17T02:51:36.258269928Z" level=info msg="StartContainer for \"14e373a6470e868db7c912aea094d0912af5c2d59009a662c1f4632012bef17c\" returns successfully" Apr 17 02:51:36.290840 containerd[1591]: time="2026-04-17T02:51:36.289138820Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-5588576f44-mzlpz,Uid:9cc323f4-390a-445b-8689-5d583ecb3f58,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"d26af431b516b6454db73520e2f5db7b95217135bbc5f81553184ea0392e7514\"" Apr 17 02:51:36.317849 containerd[1591]: time="2026-04-17T02:51:36.317763235Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.40.7\"" Apr 17 02:51:36.440740 kubelet[2815]: E0417 02:51:36.439683 2815 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Apr 17 02:51:38.391254 systemd[1]: 
var-lib-containerd-tmpmounts-containerd\x2dmount131967484.mount: Deactivated successfully. Apr 17 02:51:40.283840 kubelet[2815]: I0417 02:51:40.283442 2815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-qm55r" podStartSLOduration=6.283409951 podStartE2EDuration="6.283409951s" podCreationTimestamp="2026-04-17 02:51:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 02:51:36.517875785 +0000 UTC m=+6.667822899" watchObservedRunningTime="2026-04-17 02:51:40.283409951 +0000 UTC m=+10.433357066" Apr 17 02:51:40.494496 kubelet[2815]: E0417 02:51:40.494328 2815 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Apr 17 02:51:40.664962 kubelet[2815]: E0417 02:51:40.664277 2815 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Apr 17 02:51:41.642858 containerd[1591]: time="2026-04-17T02:51:41.642771877Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.40.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 17 02:51:41.643857 containerd[1591]: time="2026-04-17T02:51:41.643825136Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.40.7: active requests=0, bytes read=40846156" Apr 17 02:51:41.648649 containerd[1591]: time="2026-04-17T02:51:41.648542548Z" level=info msg="ImageCreate event name:\"sha256:de04da31b5feb10fd313c39b7ac72d47ce9b5b8eb06161142e2e2283059a52c2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 17 02:51:41.653763 containerd[1591]: time="2026-04-17T02:51:41.653648202Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:53260704fc6e638633b243729411222e01e1898647352a6e1a09cc046887973a\" 
labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 17 02:51:41.654923 containerd[1591]: time="2026-04-17T02:51:41.654850133Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.40.7\" with image id \"sha256:de04da31b5feb10fd313c39b7ac72d47ce9b5b8eb06161142e2e2283059a52c2\", repo tag \"quay.io/tigera/operator:v1.40.7\", repo digest \"quay.io/tigera/operator@sha256:53260704fc6e638633b243729411222e01e1898647352a6e1a09cc046887973a\", size \"40842151\" in 5.337033356s" Apr 17 02:51:41.654923 containerd[1591]: time="2026-04-17T02:51:41.654905119Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.40.7\" returns image reference \"sha256:de04da31b5feb10fd313c39b7ac72d47ce9b5b8eb06161142e2e2283059a52c2\"" Apr 17 02:51:41.668154 containerd[1591]: time="2026-04-17T02:51:41.668094106Z" level=info msg="CreateContainer within sandbox \"d26af431b516b6454db73520e2f5db7b95217135bbc5f81553184ea0392e7514\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Apr 17 02:51:41.698792 containerd[1591]: time="2026-04-17T02:51:41.696887126Z" level=info msg="Container f8696890ae0bab14f2ee30a53c344df75b5bbb8795fc62b28f67780f3655bc52: CDI devices from CRI Config.CDIDevices: []" Apr 17 02:51:41.742236 containerd[1591]: time="2026-04-17T02:51:41.742151455Z" level=info msg="CreateContainer within sandbox \"d26af431b516b6454db73520e2f5db7b95217135bbc5f81553184ea0392e7514\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"f8696890ae0bab14f2ee30a53c344df75b5bbb8795fc62b28f67780f3655bc52\"" Apr 17 02:51:41.743642 containerd[1591]: time="2026-04-17T02:51:41.743587272Z" level=info msg="StartContainer for \"f8696890ae0bab14f2ee30a53c344df75b5bbb8795fc62b28f67780f3655bc52\"" Apr 17 02:51:41.744900 containerd[1591]: time="2026-04-17T02:51:41.744851726Z" level=info msg="connecting to shim f8696890ae0bab14f2ee30a53c344df75b5bbb8795fc62b28f67780f3655bc52" 
address="unix:///run/containerd/s/81e8043efa8889160cb56e8e56437fb88b677cc1b2b1be1b19899848bcc9cc5c" protocol=ttrpc version=3 Apr 17 02:51:41.787042 systemd[1]: Started cri-containerd-f8696890ae0bab14f2ee30a53c344df75b5bbb8795fc62b28f67780f3655bc52.scope - libcontainer container f8696890ae0bab14f2ee30a53c344df75b5bbb8795fc62b28f67780f3655bc52. Apr 17 02:51:41.939215 containerd[1591]: time="2026-04-17T02:51:41.938683198Z" level=info msg="StartContainer for \"f8696890ae0bab14f2ee30a53c344df75b5bbb8795fc62b28f67780f3655bc52\" returns successfully" Apr 17 02:51:42.767583 kubelet[2815]: I0417 02:51:42.767102 2815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-5588576f44-mzlpz" podStartSLOduration=2.411912963 podStartE2EDuration="7.767083942s" podCreationTimestamp="2026-04-17 02:51:35 +0000 UTC" firstStartedPulling="2026-04-17 02:51:36.303902602 +0000 UTC m=+6.453849708" lastFinishedPulling="2026-04-17 02:51:41.65907358 +0000 UTC m=+11.809020687" observedRunningTime="2026-04-17 02:51:42.76688673 +0000 UTC m=+12.916833836" watchObservedRunningTime="2026-04-17 02:51:42.767083942 +0000 UTC m=+12.917031056" Apr 17 02:51:55.942939 sudo[1820]: pam_unix(sudo:session): session closed for user root Apr 17 02:51:55.946116 sshd[1819]: Connection closed by 10.0.0.1 port 36394 Apr 17 02:51:55.949589 sshd-session[1814]: pam_unix(sshd:session): session closed for user core Apr 17 02:51:55.963463 systemd[1]: sshd@8-10.0.0.21:22-10.0.0.1:36394.service: Deactivated successfully. Apr 17 02:51:55.971632 systemd[1]: session-9.scope: Deactivated successfully. Apr 17 02:51:55.972244 systemd[1]: session-9.scope: Consumed 11.445s CPU time, 231.8M memory peak. Apr 17 02:51:55.974952 systemd-logind[1579]: Session 9 logged out. Waiting for processes to exit. Apr 17 02:51:55.982482 systemd-logind[1579]: Removed session 9. 
Apr 17 02:52:17.775237 systemd[1]: Created slice kubepods-besteffort-pod88f2371b_fdaa_484b_8502_82121dacd3e2.slice - libcontainer container kubepods-besteffort-pod88f2371b_fdaa_484b_8502_82121dacd3e2.slice. Apr 17 02:52:17.806807 kubelet[2815]: I0417 02:52:17.806232 2815 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/88f2371b-fdaa-484b-8502-82121dacd3e2-typha-certs\") pod \"calico-typha-5d6b4dc567-4fkk9\" (UID: \"88f2371b-fdaa-484b-8502-82121dacd3e2\") " pod="calico-system/calico-typha-5d6b4dc567-4fkk9" Apr 17 02:52:17.814803 kubelet[2815]: I0417 02:52:17.814215 2815 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpnv4\" (UniqueName: \"kubernetes.io/projected/88f2371b-fdaa-484b-8502-82121dacd3e2-kube-api-access-wpnv4\") pod \"calico-typha-5d6b4dc567-4fkk9\" (UID: \"88f2371b-fdaa-484b-8502-82121dacd3e2\") " pod="calico-system/calico-typha-5d6b4dc567-4fkk9" Apr 17 02:52:17.823903 kubelet[2815]: I0417 02:52:17.819602 2815 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/88f2371b-fdaa-484b-8502-82121dacd3e2-tigera-ca-bundle\") pod \"calico-typha-5d6b4dc567-4fkk9\" (UID: \"88f2371b-fdaa-484b-8502-82121dacd3e2\") " pod="calico-system/calico-typha-5d6b4dc567-4fkk9" Apr 17 02:52:18.434101 kubelet[2815]: E0417 02:52:18.433186 2815 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Apr 17 02:52:18.449672 containerd[1591]: time="2026-04-17T02:52:18.449561090Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-5d6b4dc567-4fkk9,Uid:88f2371b-fdaa-484b-8502-82121dacd3e2,Namespace:calico-system,Attempt:0,}" Apr 17 02:52:18.669616 containerd[1591]: 
time="2026-04-17T02:52:18.669491102Z" level=info msg="connecting to shim 97c4d1dc104e340a01504a628a7ab8b669f00fcc78e14419699a72e4bc8c3128" address="unix:///run/containerd/s/66bcf2930c62a1ed6b8ced46c7b2de04083098784ad04aefbead538725656c57" namespace=k8s.io protocol=ttrpc version=3 Apr 17 02:52:18.970440 systemd[1]: Started cri-containerd-97c4d1dc104e340a01504a628a7ab8b669f00fcc78e14419699a72e4bc8c3128.scope - libcontainer container 97c4d1dc104e340a01504a628a7ab8b669f00fcc78e14419699a72e4bc8c3128. Apr 17 02:52:19.255795 containerd[1591]: time="2026-04-17T02:52:19.255577600Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-5d6b4dc567-4fkk9,Uid:88f2371b-fdaa-484b-8502-82121dacd3e2,Namespace:calico-system,Attempt:0,} returns sandbox id \"97c4d1dc104e340a01504a628a7ab8b669f00fcc78e14419699a72e4bc8c3128\"" Apr 17 02:52:19.264672 kubelet[2815]: E0417 02:52:19.264189 2815 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Apr 17 02:52:19.285786 containerd[1591]: time="2026-04-17T02:52:19.285654921Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.31.4\"" Apr 17 02:52:19.679618 systemd[1]: Created slice kubepods-besteffort-pod3cd1c395_e2c1_447e_a1fd_070045e276b2.slice - libcontainer container kubepods-besteffort-pod3cd1c395_e2c1_447e_a1fd_070045e276b2.slice. 
Apr 17 02:52:19.787240 kubelet[2815]: I0417 02:52:19.786792 2815 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/3cd1c395-e2c1-447e-a1fd-070045e276b2-node-certs\") pod \"calico-node-57skb\" (UID: \"3cd1c395-e2c1-447e-a1fd-070045e276b2\") " pod="calico-system/calico-node-57skb" Apr 17 02:52:19.787240 kubelet[2815]: I0417 02:52:19.787080 2815 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bpffs\" (UniqueName: \"kubernetes.io/host-path/3cd1c395-e2c1-447e-a1fd-070045e276b2-bpffs\") pod \"calico-node-57skb\" (UID: \"3cd1c395-e2c1-447e-a1fd-070045e276b2\") " pod="calico-system/calico-node-57skb" Apr 17 02:52:19.787240 kubelet[2815]: I0417 02:52:19.787105 2815 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/3cd1c395-e2c1-447e-a1fd-070045e276b2-cni-bin-dir\") pod \"calico-node-57skb\" (UID: \"3cd1c395-e2c1-447e-a1fd-070045e276b2\") " pod="calico-system/calico-node-57skb" Apr 17 02:52:19.787240 kubelet[2815]: I0417 02:52:19.787124 2815 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3cd1c395-e2c1-447e-a1fd-070045e276b2-lib-modules\") pod \"calico-node-57skb\" (UID: \"3cd1c395-e2c1-447e-a1fd-070045e276b2\") " pod="calico-system/calico-node-57skb" Apr 17 02:52:19.787240 kubelet[2815]: I0417 02:52:19.787147 2815 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/3cd1c395-e2c1-447e-a1fd-070045e276b2-xtables-lock\") pod \"calico-node-57skb\" (UID: \"3cd1c395-e2c1-447e-a1fd-070045e276b2\") " pod="calico-system/calico-node-57skb" Apr 17 02:52:19.788174 kubelet[2815]: I0417 02:52:19.787168 2815 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/3cd1c395-e2c1-447e-a1fd-070045e276b2-cni-log-dir\") pod \"calico-node-57skb\" (UID: \"3cd1c395-e2c1-447e-a1fd-070045e276b2\") " pod="calico-system/calico-node-57skb" Apr 17 02:52:19.788174 kubelet[2815]: I0417 02:52:19.787188 2815 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/3cd1c395-e2c1-447e-a1fd-070045e276b2-flexvol-driver-host\") pod \"calico-node-57skb\" (UID: \"3cd1c395-e2c1-447e-a1fd-070045e276b2\") " pod="calico-system/calico-node-57skb" Apr 17 02:52:19.788174 kubelet[2815]: I0417 02:52:19.787255 2815 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3cd1c395-e2c1-447e-a1fd-070045e276b2-tigera-ca-bundle\") pod \"calico-node-57skb\" (UID: \"3cd1c395-e2c1-447e-a1fd-070045e276b2\") " pod="calico-system/calico-node-57skb" Apr 17 02:52:19.788174 kubelet[2815]: I0417 02:52:19.787334 2815 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/3cd1c395-e2c1-447e-a1fd-070045e276b2-var-run-calico\") pod \"calico-node-57skb\" (UID: \"3cd1c395-e2c1-447e-a1fd-070045e276b2\") " pod="calico-system/calico-node-57skb" Apr 17 02:52:19.788174 kubelet[2815]: I0417 02:52:19.787395 2815 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/3cd1c395-e2c1-447e-a1fd-070045e276b2-cni-net-dir\") pod \"calico-node-57skb\" (UID: \"3cd1c395-e2c1-447e-a1fd-070045e276b2\") " pod="calico-system/calico-node-57skb" Apr 17 02:52:19.788395 kubelet[2815]: I0417 02:52:19.787463 2815 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"nodeproc\" (UniqueName: \"kubernetes.io/host-path/3cd1c395-e2c1-447e-a1fd-070045e276b2-nodeproc\") pod \"calico-node-57skb\" (UID: \"3cd1c395-e2c1-447e-a1fd-070045e276b2\") " pod="calico-system/calico-node-57skb" Apr 17 02:52:19.788395 kubelet[2815]: I0417 02:52:19.787485 2815 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r46vd\" (UniqueName: \"kubernetes.io/projected/3cd1c395-e2c1-447e-a1fd-070045e276b2-kube-api-access-r46vd\") pod \"calico-node-57skb\" (UID: \"3cd1c395-e2c1-447e-a1fd-070045e276b2\") " pod="calico-system/calico-node-57skb" Apr 17 02:52:19.788395 kubelet[2815]: I0417 02:52:19.787512 2815 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/3cd1c395-e2c1-447e-a1fd-070045e276b2-policysync\") pod \"calico-node-57skb\" (UID: \"3cd1c395-e2c1-447e-a1fd-070045e276b2\") " pod="calico-system/calico-node-57skb" Apr 17 02:52:19.788395 kubelet[2815]: I0417 02:52:19.787538 2815 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/3cd1c395-e2c1-447e-a1fd-070045e276b2-sys-fs\") pod \"calico-node-57skb\" (UID: \"3cd1c395-e2c1-447e-a1fd-070045e276b2\") " pod="calico-system/calico-node-57skb" Apr 17 02:52:19.788395 kubelet[2815]: I0417 02:52:19.787556 2815 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/3cd1c395-e2c1-447e-a1fd-070045e276b2-var-lib-calico\") pod \"calico-node-57skb\" (UID: \"3cd1c395-e2c1-447e-a1fd-070045e276b2\") " pod="calico-system/calico-node-57skb" Apr 17 02:52:19.927226 kubelet[2815]: E0417 02:52:19.926704 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:19.927226 kubelet[2815]: W0417 02:52:19.926792 
2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:19.927226 kubelet[2815]: E0417 02:52:19.926858 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 02:52:20.101934 kubelet[2815]: E0417 02:52:20.100490 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 02:52:20.101934 kubelet[2815]: E0417 02:52:20.101486 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:20.101934 kubelet[2815]: W0417 02:52:20.101495 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:20.101934 kubelet[2815]: E0417 02:52:20.101506 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 02:52:20.105186 kubelet[2815]: E0417 02:52:20.105057 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:20.105186 kubelet[2815]: W0417 02:52:20.105150 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:20.105606 kubelet[2815]: E0417 02:52:20.105226 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 02:52:20.108392 kubelet[2815]: E0417 02:52:20.108297 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:20.108392 kubelet[2815]: W0417 02:52:20.108374 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:20.109233 kubelet[2815]: E0417 02:52:20.108483 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 02:52:20.109943 kubelet[2815]: E0417 02:52:20.109827 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:20.109943 kubelet[2815]: W0417 02:52:20.109892 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:20.110034 kubelet[2815]: E0417 02:52:20.109954 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 02:52:20.110301 kubelet[2815]: E0417 02:52:20.110206 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:20.110301 kubelet[2815]: W0417 02:52:20.110231 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:20.110301 kubelet[2815]: E0417 02:52:20.110242 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 02:52:20.113583 kubelet[2815]: E0417 02:52:20.113284 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:20.115851 kubelet[2815]: W0417 02:52:20.115782 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:20.116429 kubelet[2815]: E0417 02:52:20.116210 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 02:52:20.120318 kubelet[2815]: E0417 02:52:20.120181 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:20.120318 kubelet[2815]: W0417 02:52:20.120276 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:20.120644 kubelet[2815]: E0417 02:52:20.120399 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 02:52:20.122520 kubelet[2815]: E0417 02:52:20.122373 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:20.122520 kubelet[2815]: W0417 02:52:20.122494 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:20.122844 kubelet[2815]: E0417 02:52:20.122629 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 02:52:20.123037 kubelet[2815]: E0417 02:52:20.122972 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:20.123037 kubelet[2815]: W0417 02:52:20.123005 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:20.123037 kubelet[2815]: E0417 02:52:20.123020 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 02:52:20.123238 kubelet[2815]: E0417 02:52:20.123141 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:20.123238 kubelet[2815]: W0417 02:52:20.123163 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:20.123238 kubelet[2815]: E0417 02:52:20.123171 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 02:52:20.123314 kubelet[2815]: E0417 02:52:20.123265 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:20.123314 kubelet[2815]: W0417 02:52:20.123271 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:20.123314 kubelet[2815]: E0417 02:52:20.123282 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 02:52:20.125243 kubelet[2815]: E0417 02:52:20.123423 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:20.125243 kubelet[2815]: W0417 02:52:20.123429 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:20.125243 kubelet[2815]: E0417 02:52:20.123438 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 02:52:20.125243 kubelet[2815]: E0417 02:52:20.125031 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:20.125243 kubelet[2815]: W0417 02:52:20.125097 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:20.125243 kubelet[2815]: E0417 02:52:20.125172 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 02:52:20.178134 kubelet[2815]: E0417 02:52:20.177900 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:20.178134 kubelet[2815]: W0417 02:52:20.178110 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:20.178465 kubelet[2815]: E0417 02:52:20.178224 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 02:52:20.185113 kubelet[2815]: E0417 02:52:20.182677 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:20.185740 kubelet[2815]: W0417 02:52:20.185184 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:20.190933 kubelet[2815]: E0417 02:52:20.185881 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 02:52:20.196142 kubelet[2815]: E0417 02:52:20.192233 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:20.196142 kubelet[2815]: W0417 02:52:20.194596 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:20.198186 kubelet[2815]: E0417 02:52:20.198105 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 02:52:20.234971 kubelet[2815]: E0417 02:52:20.199322 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:20.234971 kubelet[2815]: W0417 02:52:20.199341 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:20.234971 kubelet[2815]: E0417 02:52:20.199440 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 02:52:20.252144 kubelet[2815]: E0417 02:52:20.251970 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:20.252452 kubelet[2815]: W0417 02:52:20.252126 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:20.252452 kubelet[2815]: E0417 02:52:20.252226 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 02:52:20.252596 kubelet[2815]: E0417 02:52:20.252548 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:20.252596 kubelet[2815]: W0417 02:52:20.252576 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:20.252596 kubelet[2815]: E0417 02:52:20.252589 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 02:52:20.252747 kubelet[2815]: E0417 02:52:20.252687 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:20.252747 kubelet[2815]: W0417 02:52:20.252693 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:20.252747 kubelet[2815]: E0417 02:52:20.252704 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 02:52:20.253313 kubelet[2815]: E0417 02:52:20.252829 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:20.253313 kubelet[2815]: W0417 02:52:20.252838 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:20.253313 kubelet[2815]: E0417 02:52:20.252845 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 02:52:20.253313 kubelet[2815]: E0417 02:52:20.252938 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:20.253313 kubelet[2815]: W0417 02:52:20.252944 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:20.253313 kubelet[2815]: E0417 02:52:20.252950 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 02:52:20.253313 kubelet[2815]: E0417 02:52:20.253037 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:20.253313 kubelet[2815]: W0417 02:52:20.253043 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:20.253313 kubelet[2815]: E0417 02:52:20.253049 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 02:52:20.253313 kubelet[2815]: E0417 02:52:20.253122 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:20.253645 kubelet[2815]: W0417 02:52:20.253128 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:20.253645 kubelet[2815]: E0417 02:52:20.253133 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 02:52:20.253645 kubelet[2815]: E0417 02:52:20.253332 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:20.253645 kubelet[2815]: W0417 02:52:20.253338 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:20.253645 kubelet[2815]: E0417 02:52:20.253370 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 02:52:20.253645 kubelet[2815]: E0417 02:52:20.253493 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:20.253645 kubelet[2815]: W0417 02:52:20.253498 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:20.253645 kubelet[2815]: E0417 02:52:20.253506 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 02:52:20.253645 kubelet[2815]: E0417 02:52:20.253591 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:20.253645 kubelet[2815]: W0417 02:52:20.253597 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:20.256223 kubelet[2815]: E0417 02:52:20.253603 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 02:52:20.256223 kubelet[2815]: E0417 02:52:20.254184 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:20.256223 kubelet[2815]: W0417 02:52:20.254237 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:20.256223 kubelet[2815]: E0417 02:52:20.254374 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 02:52:20.256223 kubelet[2815]: E0417 02:52:20.256024 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:20.256223 kubelet[2815]: W0417 02:52:20.256040 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:20.256223 kubelet[2815]: E0417 02:52:20.256155 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 02:52:20.262947 kubelet[2815]: E0417 02:52:20.262877 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:20.262947 kubelet[2815]: W0417 02:52:20.262913 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:20.263991 kubelet[2815]: E0417 02:52:20.262972 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 02:52:20.266793 kubelet[2815]: E0417 02:52:20.263762 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:20.267245 kubelet[2815]: W0417 02:52:20.266975 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:20.267245 kubelet[2815]: E0417 02:52:20.267130 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 02:52:20.270024 kubelet[2815]: E0417 02:52:20.269954 2815 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-vnr2q" podUID="f2c15a0f-c771-44b2-be28-01bbe4cb1c8b" Apr 17 02:52:20.270861 kubelet[2815]: E0417 02:52:20.270839 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:20.270861 kubelet[2815]: W0417 02:52:20.270853 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:20.270980 kubelet[2815]: E0417 02:52:20.270869 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 02:52:20.279255 kubelet[2815]: E0417 02:52:20.276994 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:20.279255 kubelet[2815]: W0417 02:52:20.277017 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:20.279255 kubelet[2815]: E0417 02:52:20.277121 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 02:52:20.352433 kubelet[2815]: E0417 02:52:20.349273 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:20.352433 kubelet[2815]: W0417 02:52:20.349332 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:20.352433 kubelet[2815]: E0417 02:52:20.351399 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 02:52:20.355966 kubelet[2815]: E0417 02:52:20.355805 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:20.356478 kubelet[2815]: W0417 02:52:20.356161 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:20.356852 kubelet[2815]: E0417 02:52:20.356834 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 02:52:20.357276 kubelet[2815]: E0417 02:52:20.357265 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:20.357401 kubelet[2815]: W0417 02:52:20.357390 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:20.357581 kubelet[2815]: E0417 02:52:20.357488 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 02:52:20.357984 kubelet[2815]: E0417 02:52:20.357974 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:20.358082 kubelet[2815]: W0417 02:52:20.358073 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:20.358188 kubelet[2815]: E0417 02:52:20.358178 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 02:52:20.359962 kubelet[2815]: E0417 02:52:20.359918 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:20.364099 kubelet[2815]: W0417 02:52:20.360073 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:20.367479 kubelet[2815]: E0417 02:52:20.366095 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 02:52:20.368414 kubelet[2815]: E0417 02:52:20.368395 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:20.369798 kubelet[2815]: W0417 02:52:20.368511 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:20.373138 kubelet[2815]: E0417 02:52:20.372943 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 02:52:20.375068 kubelet[2815]: E0417 02:52:20.374919 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:20.375438 kubelet[2815]: W0417 02:52:20.375393 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:20.375619 kubelet[2815]: E0417 02:52:20.375607 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 02:52:20.380452 kubelet[2815]: E0417 02:52:20.380081 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:20.381644 kubelet[2815]: W0417 02:52:20.381267 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:20.386763 kubelet[2815]: E0417 02:52:20.381494 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 02:52:20.428779 kubelet[2815]: E0417 02:52:20.428504 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:20.429509 kubelet[2815]: W0417 02:52:20.429377 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:20.429875 kubelet[2815]: E0417 02:52:20.429685 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 02:52:20.443017 kubelet[2815]: E0417 02:52:20.442086 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:20.443017 kubelet[2815]: W0417 02:52:20.442124 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:20.443017 kubelet[2815]: E0417 02:52:20.442224 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 02:52:20.451274 kubelet[2815]: E0417 02:52:20.451109 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:20.451274 kubelet[2815]: W0417 02:52:20.451236 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:20.451627 kubelet[2815]: E0417 02:52:20.451427 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 02:52:20.454228 kubelet[2815]: E0417 02:52:20.453661 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:20.454950 kubelet[2815]: W0417 02:52:20.454501 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:20.454950 kubelet[2815]: E0417 02:52:20.454680 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 02:52:20.465566 containerd[1591]: time="2026-04-17T02:52:20.465391027Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-57skb,Uid:3cd1c395-e2c1-447e-a1fd-070045e276b2,Namespace:calico-system,Attempt:0,}" Apr 17 02:52:20.467751 kubelet[2815]: E0417 02:52:20.466300 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:20.467751 kubelet[2815]: W0417 02:52:20.466322 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:20.467751 kubelet[2815]: E0417 02:52:20.466466 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 02:52:20.468154 kubelet[2815]: E0417 02:52:20.468070 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:20.468154 kubelet[2815]: W0417 02:52:20.468083 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:20.468625 kubelet[2815]: E0417 02:52:20.468190 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 02:52:20.468996 kubelet[2815]: E0417 02:52:20.468962 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:20.468996 kubelet[2815]: W0417 02:52:20.468986 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:20.469059 kubelet[2815]: E0417 02:52:20.468999 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 02:52:20.470795 kubelet[2815]: E0417 02:52:20.470591 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:20.471800 kubelet[2815]: W0417 02:52:20.471188 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:20.472024 kubelet[2815]: E0417 02:52:20.471629 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 02:52:20.478830 kubelet[2815]: E0417 02:52:20.477989 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:20.478830 kubelet[2815]: W0417 02:52:20.478055 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:20.481776 kubelet[2815]: E0417 02:52:20.479748 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 02:52:20.481776 kubelet[2815]: I0417 02:52:20.479937 2815 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f2c15a0f-c771-44b2-be28-01bbe4cb1c8b-kubelet-dir\") pod \"csi-node-driver-vnr2q\" (UID: \"f2c15a0f-c771-44b2-be28-01bbe4cb1c8b\") " pod="calico-system/csi-node-driver-vnr2q" Apr 17 02:52:20.495988 kubelet[2815]: E0417 02:52:20.491704 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:20.495988 kubelet[2815]: W0417 02:52:20.491797 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:20.495988 kubelet[2815]: E0417 02:52:20.492011 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 02:52:20.495988 kubelet[2815]: I0417 02:52:20.494099 2815 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f2c15a0f-c771-44b2-be28-01bbe4cb1c8b-registration-dir\") pod \"csi-node-driver-vnr2q\" (UID: \"f2c15a0f-c771-44b2-be28-01bbe4cb1c8b\") " pod="calico-system/csi-node-driver-vnr2q" Apr 17 02:52:20.566766 kubelet[2815]: E0417 02:52:20.559008 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:20.566766 kubelet[2815]: W0417 02:52:20.559485 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:20.566766 kubelet[2815]: E0417 02:52:20.560704 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 02:52:20.566766 kubelet[2815]: E0417 02:52:20.563044 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:20.566766 kubelet[2815]: W0417 02:52:20.563082 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:20.566766 kubelet[2815]: E0417 02:52:20.563169 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 02:52:20.568795 kubelet[2815]: E0417 02:52:20.568664 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:20.568795 kubelet[2815]: W0417 02:52:20.568740 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:20.569052 kubelet[2815]: E0417 02:52:20.568829 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 02:52:20.571616 kubelet[2815]: E0417 02:52:20.570862 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:20.571616 kubelet[2815]: W0417 02:52:20.570917 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:20.571616 kubelet[2815]: E0417 02:52:20.570974 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 02:52:20.576642 kubelet[2815]: I0417 02:52:20.576064 2815 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f2c15a0f-c771-44b2-be28-01bbe4cb1c8b-socket-dir\") pod \"csi-node-driver-vnr2q\" (UID: \"f2c15a0f-c771-44b2-be28-01bbe4cb1c8b\") " pod="calico-system/csi-node-driver-vnr2q" Apr 17 02:52:20.578768 kubelet[2815]: E0417 02:52:20.578465 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:20.578987 kubelet[2815]: W0417 02:52:20.578752 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:20.578987 kubelet[2815]: E0417 02:52:20.578825 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 02:52:20.590085 kubelet[2815]: E0417 02:52:20.587317 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:20.590085 kubelet[2815]: W0417 02:52:20.587894 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:20.590085 kubelet[2815]: E0417 02:52:20.588121 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 02:52:20.600215 kubelet[2815]: E0417 02:52:20.598373 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:20.600215 kubelet[2815]: W0417 02:52:20.598416 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:20.600215 kubelet[2815]: E0417 02:52:20.598506 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 02:52:20.605989 kubelet[2815]: E0417 02:52:20.605910 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:20.606470 kubelet[2815]: W0417 02:52:20.605971 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:20.606470 kubelet[2815]: E0417 02:52:20.606039 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 02:52:20.609612 kubelet[2815]: E0417 02:52:20.609531 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:20.609612 kubelet[2815]: W0417 02:52:20.609573 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:20.610222 kubelet[2815]: E0417 02:52:20.609606 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 02:52:20.610695 kubelet[2815]: E0417 02:52:20.610486 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:20.610695 kubelet[2815]: W0417 02:52:20.610580 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:20.610695 kubelet[2815]: E0417 02:52:20.610596 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 02:52:20.612113 kubelet[2815]: E0417 02:52:20.612053 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:20.612113 kubelet[2815]: W0417 02:52:20.612105 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:20.612283 kubelet[2815]: E0417 02:52:20.612194 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 02:52:20.615809 kubelet[2815]: E0417 02:52:20.614596 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:20.615809 kubelet[2815]: W0417 02:52:20.614616 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:20.615809 kubelet[2815]: E0417 02:52:20.614633 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 02:52:20.615809 kubelet[2815]: I0417 02:52:20.614658 2815 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/f2c15a0f-c771-44b2-be28-01bbe4cb1c8b-varrun\") pod \"csi-node-driver-vnr2q\" (UID: \"f2c15a0f-c771-44b2-be28-01bbe4cb1c8b\") " pod="calico-system/csi-node-driver-vnr2q" Apr 17 02:52:20.618314 kubelet[2815]: E0417 02:52:20.618258 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:20.618620 kubelet[2815]: W0417 02:52:20.618292 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:20.618620 kubelet[2815]: E0417 02:52:20.618391 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 02:52:20.618620 kubelet[2815]: E0417 02:52:20.618608 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:20.618620 kubelet[2815]: W0417 02:52:20.618618 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:20.618773 kubelet[2815]: E0417 02:52:20.618630 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 02:52:20.622771 containerd[1591]: time="2026-04-17T02:52:20.620445969Z" level=info msg="connecting to shim 596c48fb2b72db55d746dd322e977311aaeba1a2a22c59d5edec09ee8e4f0a58" address="unix:///run/containerd/s/8503a913e4bd2dd78686e347d2cc5aab1d98c77fb72551513649a5a73f2273b8" namespace=k8s.io protocol=ttrpc version=3 Apr 17 02:52:20.771394 kubelet[2815]: E0417 02:52:20.762660 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:20.771394 kubelet[2815]: W0417 02:52:20.762678 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:20.771394 kubelet[2815]: E0417 02:52:20.762806 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 02:52:20.773448 kubelet[2815]: E0417 02:52:20.772605 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:20.773448 kubelet[2815]: W0417 02:52:20.772627 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:20.775865 kubelet[2815]: E0417 02:52:20.774274 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 02:52:20.775865 kubelet[2815]: I0417 02:52:20.775108 2815 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xrhd\" (UniqueName: \"kubernetes.io/projected/f2c15a0f-c771-44b2-be28-01bbe4cb1c8b-kube-api-access-7xrhd\") pod \"csi-node-driver-vnr2q\" (UID: \"f2c15a0f-c771-44b2-be28-01bbe4cb1c8b\") " pod="calico-system/csi-node-driver-vnr2q" Apr 17 02:52:20.783531 kubelet[2815]: E0417 02:52:20.782826 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:20.786706 kubelet[2815]: W0417 02:52:20.783793 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:20.790255 kubelet[2815]: E0417 02:52:20.790223 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 02:52:20.793264 kubelet[2815]: E0417 02:52:20.793175 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:20.793897 kubelet[2815]: W0417 02:52:20.793194 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:20.795081 kubelet[2815]: E0417 02:52:20.793575 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 02:52:20.801750 kubelet[2815]: E0417 02:52:20.801484 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:20.801750 kubelet[2815]: W0417 02:52:20.801561 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:20.802335 kubelet[2815]: E0417 02:52:20.801651 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 02:52:20.805923 kubelet[2815]: E0417 02:52:20.805774 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:20.832827 kubelet[2815]: W0417 02:52:20.824703 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:20.834764 kubelet[2815]: E0417 02:52:20.834440 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 02:52:20.840611 systemd[1]: Started cri-containerd-596c48fb2b72db55d746dd322e977311aaeba1a2a22c59d5edec09ee8e4f0a58.scope - libcontainer container 596c48fb2b72db55d746dd322e977311aaeba1a2a22c59d5edec09ee8e4f0a58. 
Apr 17 02:52:20.842488 kubelet[2815]: E0417 02:52:20.841936 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:20.842488 kubelet[2815]: W0417 02:52:20.841954 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:20.842488 kubelet[2815]: E0417 02:52:20.842019 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 02:52:20.842488 kubelet[2815]: E0417 02:52:20.842297 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:20.842488 kubelet[2815]: W0417 02:52:20.842308 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:20.842488 kubelet[2815]: E0417 02:52:20.842398 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 02:52:20.842757 kubelet[2815]: E0417 02:52:20.842662 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:20.842757 kubelet[2815]: W0417 02:52:20.842671 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:20.842757 kubelet[2815]: E0417 02:52:20.842682 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 02:52:20.844242 kubelet[2815]: E0417 02:52:20.843330 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:20.844242 kubelet[2815]: W0417 02:52:20.843344 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:20.844543 kubelet[2815]: E0417 02:52:20.844388 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 02:52:20.850514 kubelet[2815]: E0417 02:52:20.850378 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:20.850809 kubelet[2815]: W0417 02:52:20.850502 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:20.850809 kubelet[2815]: E0417 02:52:20.850599 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 02:52:20.853227 kubelet[2815]: E0417 02:52:20.853029 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:20.854504 kubelet[2815]: W0417 02:52:20.853513 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:20.854636 kubelet[2815]: E0417 02:52:20.854446 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 02:52:20.858313 kubelet[2815]: E0417 02:52:20.858202 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:20.861145 kubelet[2815]: W0417 02:52:20.858284 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:20.861145 kubelet[2815]: E0417 02:52:20.858438 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 02:52:20.873670 kubelet[2815]: E0417 02:52:20.870546 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:20.873670 kubelet[2815]: W0417 02:52:20.870743 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:20.873670 kubelet[2815]: E0417 02:52:20.870895 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 02:52:20.882686 kubelet[2815]: E0417 02:52:20.881875 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:20.890120 kubelet[2815]: W0417 02:52:20.889694 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:20.893177 kubelet[2815]: E0417 02:52:20.890990 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 02:52:20.957260 kubelet[2815]: E0417 02:52:20.956879 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:20.957260 kubelet[2815]: W0417 02:52:20.957052 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:20.957260 kubelet[2815]: E0417 02:52:20.957194 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 02:52:20.963379 kubelet[2815]: E0417 02:52:20.962981 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:20.963379 kubelet[2815]: W0417 02:52:20.963120 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:20.965815 kubelet[2815]: E0417 02:52:20.965143 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 02:52:20.972224 kubelet[2815]: E0417 02:52:20.971142 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:20.973389 kubelet[2815]: W0417 02:52:20.972965 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:20.973389 kubelet[2815]: E0417 02:52:20.973170 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 02:52:20.977761 kubelet[2815]: E0417 02:52:20.976847 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:20.977761 kubelet[2815]: W0417 02:52:20.976904 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:20.977761 kubelet[2815]: E0417 02:52:20.976969 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 02:52:20.977761 kubelet[2815]: E0417 02:52:20.977862 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:20.977761 kubelet[2815]: W0417 02:52:20.977875 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:20.977761 kubelet[2815]: E0417 02:52:20.977891 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 02:52:20.987772 kubelet[2815]: E0417 02:52:20.987069 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:20.987772 kubelet[2815]: W0417 02:52:20.987254 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:20.987772 kubelet[2815]: E0417 02:52:20.987393 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 02:52:21.008983 kubelet[2815]: E0417 02:52:21.005605 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:21.011820 kubelet[2815]: W0417 02:52:21.010443 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:21.027749 kubelet[2815]: E0417 02:52:21.021664 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 02:52:21.040537 kubelet[2815]: E0417 02:52:21.039450 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:21.040537 kubelet[2815]: W0417 02:52:21.040207 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:21.040537 kubelet[2815]: E0417 02:52:21.040427 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 02:52:21.045191 kubelet[2815]: E0417 02:52:21.045019 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:21.045191 kubelet[2815]: W0417 02:52:21.045048 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:21.048862 kubelet[2815]: E0417 02:52:21.045985 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 02:52:21.048862 kubelet[2815]: E0417 02:52:21.048533 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:21.048862 kubelet[2815]: W0417 02:52:21.048547 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:21.048862 kubelet[2815]: E0417 02:52:21.048606 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 02:52:21.049833 kubelet[2815]: E0417 02:52:21.049076 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:21.049833 kubelet[2815]: W0417 02:52:21.049093 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:21.049833 kubelet[2815]: E0417 02:52:21.049158 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 02:52:21.050808 kubelet[2815]: E0417 02:52:21.050700 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:21.051431 kubelet[2815]: W0417 02:52:21.051283 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:21.051678 kubelet[2815]: E0417 02:52:21.051577 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 02:52:21.052326 kubelet[2815]: E0417 02:52:21.052316 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:21.052490 kubelet[2815]: W0417 02:52:21.052398 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:21.052490 kubelet[2815]: E0417 02:52:21.052410 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 02:52:21.144519 kubelet[2815]: E0417 02:52:21.144265 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:21.144519 kubelet[2815]: W0417 02:52:21.144340 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:21.144519 kubelet[2815]: E0417 02:52:21.144422 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 02:52:21.150888 containerd[1591]: time="2026-04-17T02:52:21.150835459Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-57skb,Uid:3cd1c395-e2c1-447e-a1fd-070045e276b2,Namespace:calico-system,Attempt:0,} returns sandbox id \"596c48fb2b72db55d746dd322e977311aaeba1a2a22c59d5edec09ee8e4f0a58\"" Apr 17 02:52:22.038762 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2465059910.mount: Deactivated successfully. 
Apr 17 02:52:22.172429 kubelet[2815]: E0417 02:52:22.172113 2815 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-vnr2q" podUID="f2c15a0f-c771-44b2-be28-01bbe4cb1c8b" Apr 17 02:52:24.169346 kubelet[2815]: E0417 02:52:24.169065 2815 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-vnr2q" podUID="f2c15a0f-c771-44b2-be28-01bbe4cb1c8b" Apr 17 02:52:24.412878 containerd[1591]: time="2026-04-17T02:52:24.412087073Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 17 02:52:24.432684 containerd[1591]: time="2026-04-17T02:52:24.420322995Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.31.4: active requests=0, bytes read=36107596" Apr 17 02:52:24.450550 containerd[1591]: time="2026-04-17T02:52:24.449974956Z" level=info msg="ImageCreate event name:\"sha256:46766605472b59b9c16342b2cc74da11f598baa9ba6d1e8b07b3f8ab4f29c55b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 17 02:52:24.458971 containerd[1591]: time="2026-04-17T02:52:24.458877498Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:d9396cfcd63dfcf72a65903042e473bb0bafc0cceb56bd71cd84078498a87130\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 17 02:52:24.462372 containerd[1591]: time="2026-04-17T02:52:24.462289688Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.31.4\" with image id \"sha256:46766605472b59b9c16342b2cc74da11f598baa9ba6d1e8b07b3f8ab4f29c55b\", repo tag 
\"ghcr.io/flatcar/calico/typha:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:d9396cfcd63dfcf72a65903042e473bb0bafc0cceb56bd71cd84078498a87130\", size \"36107450\" in 5.17648734s" Apr 17 02:52:24.462372 containerd[1591]: time="2026-04-17T02:52:24.462351954Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.31.4\" returns image reference \"sha256:46766605472b59b9c16342b2cc74da11f598baa9ba6d1e8b07b3f8ab4f29c55b\"" Apr 17 02:52:24.474387 containerd[1591]: time="2026-04-17T02:52:24.469070295Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\"" Apr 17 02:52:24.580115 containerd[1591]: time="2026-04-17T02:52:24.577867479Z" level=info msg="CreateContainer within sandbox \"97c4d1dc104e340a01504a628a7ab8b669f00fcc78e14419699a72e4bc8c3128\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Apr 17 02:52:24.632836 containerd[1591]: time="2026-04-17T02:52:24.631874072Z" level=info msg="Container 51388955b0b26766fb6f2f0d4320423050b5c5a6d2fcfac9c4e50ebdf3f65345: CDI devices from CRI Config.CDIDevices: []" Apr 17 02:52:24.670564 containerd[1591]: time="2026-04-17T02:52:24.670504867Z" level=info msg="CreateContainer within sandbox \"97c4d1dc104e340a01504a628a7ab8b669f00fcc78e14419699a72e4bc8c3128\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"51388955b0b26766fb6f2f0d4320423050b5c5a6d2fcfac9c4e50ebdf3f65345\"" Apr 17 02:52:24.672959 containerd[1591]: time="2026-04-17T02:52:24.672890722Z" level=info msg="StartContainer for \"51388955b0b26766fb6f2f0d4320423050b5c5a6d2fcfac9c4e50ebdf3f65345\"" Apr 17 02:52:24.674604 containerd[1591]: time="2026-04-17T02:52:24.674563379Z" level=info msg="connecting to shim 51388955b0b26766fb6f2f0d4320423050b5c5a6d2fcfac9c4e50ebdf3f65345" address="unix:///run/containerd/s/66bcf2930c62a1ed6b8ced46c7b2de04083098784ad04aefbead538725656c57" protocol=ttrpc version=3 Apr 17 02:52:24.745218 systemd[1]: Started 
cri-containerd-51388955b0b26766fb6f2f0d4320423050b5c5a6d2fcfac9c4e50ebdf3f65345.scope - libcontainer container 51388955b0b26766fb6f2f0d4320423050b5c5a6d2fcfac9c4e50ebdf3f65345. Apr 17 02:52:24.966575 containerd[1591]: time="2026-04-17T02:52:24.966424308Z" level=info msg="StartContainer for \"51388955b0b26766fb6f2f0d4320423050b5c5a6d2fcfac9c4e50ebdf3f65345\" returns successfully" Apr 17 02:52:25.851782 kubelet[2815]: E0417 02:52:25.851564 2815 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Apr 17 02:52:25.960771 kubelet[2815]: E0417 02:52:25.959357 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:25.960771 kubelet[2815]: W0417 02:52:25.960168 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:25.960771 kubelet[2815]: E0417 02:52:25.960672 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 02:52:25.965038 kubelet[2815]: E0417 02:52:25.964657 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:25.965757 kubelet[2815]: W0417 02:52:25.965373 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:25.965757 kubelet[2815]: E0417 02:52:25.965597 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 02:52:25.971357 kubelet[2815]: E0417 02:52:25.970968 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:25.971964 kubelet[2815]: W0417 02:52:25.971508 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:25.974647 kubelet[2815]: E0417 02:52:25.971669 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 02:52:25.977133 kubelet[2815]: E0417 02:52:25.975164 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:25.977931 kubelet[2815]: W0417 02:52:25.977539 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:26.038841 kubelet[2815]: E0417 02:52:25.983130 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 02:52:26.039559 kubelet[2815]: E0417 02:52:26.039448 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:26.039642 kubelet[2815]: W0417 02:52:26.039555 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:26.043131 kubelet[2815]: E0417 02:52:26.042887 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 02:52:26.044885 kubelet[2815]: E0417 02:52:26.044197 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:26.045196 kubelet[2815]: W0417 02:52:26.044922 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:26.047513 kubelet[2815]: E0417 02:52:26.045103 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 02:52:26.047513 kubelet[2815]: E0417 02:52:26.046982 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:26.047513 kubelet[2815]: W0417 02:52:26.047063 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:26.047513 kubelet[2815]: E0417 02:52:26.047175 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 02:52:26.050845 kubelet[2815]: E0417 02:52:26.050751 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:26.051041 kubelet[2815]: W0417 02:52:26.050821 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:26.051171 kubelet[2815]: E0417 02:52:26.050928 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 02:52:26.056223 kubelet[2815]: E0417 02:52:26.056008 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:26.056223 kubelet[2815]: W0417 02:52:26.056190 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:26.057335 kubelet[2815]: E0417 02:52:26.056402 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 02:52:26.077434 kubelet[2815]: E0417 02:52:26.065403 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:26.081181 kubelet[2815]: W0417 02:52:26.078133 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:26.081682 kubelet[2815]: E0417 02:52:26.080985 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 02:52:26.088891 kubelet[2815]: I0417 02:52:26.088021 2815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-5d6b4dc567-4fkk9" podStartSLOduration=3.896481874 podStartE2EDuration="9.087996996s" podCreationTimestamp="2026-04-17 02:52:17 +0000 UTC" firstStartedPulling="2026-04-17 02:52:19.274234808 +0000 UTC m=+49.424181912" lastFinishedPulling="2026-04-17 02:52:24.465749921 +0000 UTC m=+54.615697034" observedRunningTime="2026-04-17 02:52:26.076331165 +0000 UTC m=+56.226278280" watchObservedRunningTime="2026-04-17 02:52:26.087996996 +0000 UTC m=+56.237944117" Apr 17 02:52:26.102811 kubelet[2815]: E0417 02:52:26.099436 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:26.105842 kubelet[2815]: W0417 02:52:26.103559 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:26.105842 kubelet[2815]: E0417 02:52:26.105369 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 02:52:26.114349 kubelet[2815]: E0417 02:52:26.114234 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:26.116758 kubelet[2815]: W0417 02:52:26.114325 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:26.118792 kubelet[2815]: E0417 02:52:26.117516 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 02:52:26.125562 kubelet[2815]: E0417 02:52:26.125034 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:26.134496 kubelet[2815]: W0417 02:52:26.128690 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:26.136980 kubelet[2815]: E0417 02:52:26.135588 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 02:52:26.139860 kubelet[2815]: E0417 02:52:26.139269 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:26.140619 kubelet[2815]: W0417 02:52:26.139932 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:26.140619 kubelet[2815]: E0417 02:52:26.140036 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 02:52:26.143269 kubelet[2815]: E0417 02:52:26.143004 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:26.145830 kubelet[2815]: W0417 02:52:26.144680 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:26.146070 kubelet[2815]: E0417 02:52:26.145953 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 02:52:26.153149 kubelet[2815]: E0417 02:52:26.153038 2815 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-vnr2q" podUID="f2c15a0f-c771-44b2-be28-01bbe4cb1c8b" Apr 17 02:52:26.155474 kubelet[2815]: E0417 02:52:26.155073 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:26.155757 kubelet[2815]: W0417 02:52:26.155492 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:26.155757 kubelet[2815]: E0417 02:52:26.155560 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 02:52:26.161262 kubelet[2815]: E0417 02:52:26.161166 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:26.161262 kubelet[2815]: W0417 02:52:26.161230 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:26.161925 kubelet[2815]: E0417 02:52:26.161336 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 02:52:26.163058 kubelet[2815]: E0417 02:52:26.162362 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:26.163232 kubelet[2815]: W0417 02:52:26.163091 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:26.163328 kubelet[2815]: E0417 02:52:26.163264 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 02:52:26.167263 kubelet[2815]: E0417 02:52:26.167041 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:26.169107 kubelet[2815]: W0417 02:52:26.168013 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:26.169107 kubelet[2815]: E0417 02:52:26.168457 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 02:52:26.171199 kubelet[2815]: E0417 02:52:26.171020 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:26.172222 kubelet[2815]: W0417 02:52:26.171582 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:26.181846 kubelet[2815]: E0417 02:52:26.181222 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 02:52:26.186798 kubelet[2815]: E0417 02:52:26.185055 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:26.186798 kubelet[2815]: W0417 02:52:26.185114 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:26.186798 kubelet[2815]: E0417 02:52:26.185187 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 02:52:26.191018 kubelet[2815]: E0417 02:52:26.190256 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:26.247654 kubelet[2815]: W0417 02:52:26.247119 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:26.248067 kubelet[2815]: E0417 02:52:26.247383 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 02:52:26.252798 kubelet[2815]: E0417 02:52:26.251926 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:26.254251 kubelet[2815]: W0417 02:52:26.252696 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:26.254506 kubelet[2815]: E0417 02:52:26.254252 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 02:52:26.255205 kubelet[2815]: E0417 02:52:26.255163 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:26.255205 kubelet[2815]: W0417 02:52:26.255194 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:26.255303 kubelet[2815]: E0417 02:52:26.255212 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 02:52:26.255686 kubelet[2815]: E0417 02:52:26.255524 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:26.257922 kubelet[2815]: W0417 02:52:26.256214 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:26.257922 kubelet[2815]: E0417 02:52:26.256674 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 02:52:26.258890 kubelet[2815]: E0417 02:52:26.258827 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:26.264853 kubelet[2815]: W0417 02:52:26.262437 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:26.264853 kubelet[2815]: E0417 02:52:26.263572 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 02:52:26.273495 kubelet[2815]: E0417 02:52:26.273086 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:26.275801 kubelet[2815]: W0417 02:52:26.275508 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:26.276851 kubelet[2815]: E0417 02:52:26.275835 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 02:52:26.288435 kubelet[2815]: E0417 02:52:26.278613 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:26.288435 kubelet[2815]: W0417 02:52:26.279488 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:26.296817 kubelet[2815]: E0417 02:52:26.296219 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 02:52:26.297503 kubelet[2815]: E0417 02:52:26.297345 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:26.297503 kubelet[2815]: W0417 02:52:26.297394 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:26.297599 kubelet[2815]: E0417 02:52:26.297567 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 02:52:26.298162 kubelet[2815]: E0417 02:52:26.297911 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:26.298162 kubelet[2815]: W0417 02:52:26.297922 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:26.298162 kubelet[2815]: E0417 02:52:26.297935 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 02:52:26.298162 kubelet[2815]: E0417 02:52:26.298094 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:26.298162 kubelet[2815]: W0417 02:52:26.298101 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:26.298162 kubelet[2815]: E0417 02:52:26.298110 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 02:52:26.305048 kubelet[2815]: E0417 02:52:26.303636 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:26.305048 kubelet[2815]: W0417 02:52:26.303877 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:26.305048 kubelet[2815]: E0417 02:52:26.304016 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 02:52:26.320485 kubelet[2815]: E0417 02:52:26.319634 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:26.320485 kubelet[2815]: W0417 02:52:26.320403 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:26.320986 kubelet[2815]: E0417 02:52:26.320563 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 02:52:26.788106 containerd[1591]: time="2026-04-17T02:52:26.787952477Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 17 02:52:26.789759 containerd[1591]: time="2026-04-17T02:52:26.789683601Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4: active requests=0, bytes read=4630250" Apr 17 02:52:26.864539 containerd[1591]: time="2026-04-17T02:52:26.864470723Z" level=info msg="ImageCreate event name:\"sha256:a6ea0cf732d820506ae9f1d7e7433a14009026b894fbbb8f346b9a5f5335c47e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 17 02:52:26.874614 kubelet[2815]: E0417 02:52:26.873882 2815 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Apr 17 02:52:26.879768 containerd[1591]: time="2026-04-17T02:52:26.878855490Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:5fa3492ac4dfef9cc34fe70a51289118e1f715a89133ea730eef81ad789dadbc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 17 02:52:26.890237 containerd[1591]: 
time="2026-04-17T02:52:26.890034615Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\" with image id \"sha256:a6ea0cf732d820506ae9f1d7e7433a14009026b894fbbb8f346b9a5f5335c47e\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:5fa3492ac4dfef9cc34fe70a51289118e1f715a89133ea730eef81ad789dadbc\", size \"6186255\" in 2.420892831s" Apr 17 02:52:26.890851 containerd[1591]: time="2026-04-17T02:52:26.890348626Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\" returns image reference \"sha256:a6ea0cf732d820506ae9f1d7e7433a14009026b894fbbb8f346b9a5f5335c47e\"" Apr 17 02:52:26.899950 kubelet[2815]: E0417 02:52:26.898419 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:26.901310 kubelet[2815]: W0417 02:52:26.900844 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:26.901310 kubelet[2815]: E0417 02:52:26.901024 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 02:52:26.906968 kubelet[2815]: E0417 02:52:26.904161 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:26.906968 kubelet[2815]: W0417 02:52:26.906812 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:26.906968 kubelet[2815]: E0417 02:52:26.906892 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 02:52:26.913593 kubelet[2815]: E0417 02:52:26.913124 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:26.917656 kubelet[2815]: W0417 02:52:26.915115 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:26.921882 kubelet[2815]: E0417 02:52:26.919465 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 02:52:26.925921 kubelet[2815]: E0417 02:52:26.924459 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:26.927353 kubelet[2815]: W0417 02:52:26.925696 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:26.927606 kubelet[2815]: E0417 02:52:26.927576 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 02:52:26.930384 kubelet[2815]: E0417 02:52:26.930347 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:26.930612 kubelet[2815]: W0417 02:52:26.930410 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:26.930612 kubelet[2815]: E0417 02:52:26.930547 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 02:52:26.930692 containerd[1591]: time="2026-04-17T02:52:26.930595416Z" level=info msg="CreateContainer within sandbox \"596c48fb2b72db55d746dd322e977311aaeba1a2a22c59d5edec09ee8e4f0a58\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Apr 17 02:52:26.939773 kubelet[2815]: E0417 02:52:26.938915 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:26.942070 kubelet[2815]: W0417 02:52:26.941672 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:26.942339 kubelet[2815]: E0417 02:52:26.942110 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 02:52:26.942818 kubelet[2815]: E0417 02:52:26.942794 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:26.942818 kubelet[2815]: W0417 02:52:26.942816 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:26.942909 kubelet[2815]: E0417 02:52:26.942832 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 02:52:26.943032 kubelet[2815]: E0417 02:52:26.943008 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:26.943067 kubelet[2815]: W0417 02:52:26.943033 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:26.943067 kubelet[2815]: E0417 02:52:26.943043 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 02:52:26.944770 kubelet[2815]: E0417 02:52:26.944336 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:26.944770 kubelet[2815]: W0417 02:52:26.944662 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:26.945634 kubelet[2815]: E0417 02:52:26.944975 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 02:52:26.948962 kubelet[2815]: E0417 02:52:26.946610 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:26.958100 kubelet[2815]: W0417 02:52:26.948348 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:26.970444 kubelet[2815]: E0417 02:52:26.967396 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 02:52:26.984150 kubelet[2815]: E0417 02:52:26.980345 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:26.984150 kubelet[2815]: W0417 02:52:26.982294 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:26.984150 kubelet[2815]: E0417 02:52:26.982419 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 02:52:26.986986 kubelet[2815]: E0417 02:52:26.986831 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:26.986986 kubelet[2815]: W0417 02:52:26.986852 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:26.986986 kubelet[2815]: E0417 02:52:26.986944 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 02:52:26.988856 kubelet[2815]: E0417 02:52:26.987866 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:26.988856 kubelet[2815]: W0417 02:52:26.988109 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:26.988856 kubelet[2815]: E0417 02:52:26.988235 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 02:52:27.057306 kubelet[2815]: E0417 02:52:27.055323 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:27.057306 kubelet[2815]: W0417 02:52:27.055467 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:27.057306 kubelet[2815]: E0417 02:52:27.055576 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 02:52:27.058248 kubelet[2815]: E0417 02:52:27.057299 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:27.060010 kubelet[2815]: W0417 02:52:27.058311 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:27.060010 kubelet[2815]: E0417 02:52:27.058458 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 02:52:27.060489 containerd[1591]: time="2026-04-17T02:52:27.058422965Z" level=info msg="Container be3e3b4ec5f9974c1fb865ab923aac83d2558d139da0065c781849fb1f6c340e: CDI devices from CRI Config.CDIDevices: []" Apr 17 02:52:27.063519 kubelet[2815]: E0417 02:52:27.061382 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:27.063519 kubelet[2815]: W0417 02:52:27.061468 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:27.063519 kubelet[2815]: E0417 02:52:27.061590 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 02:52:27.069839 kubelet[2815]: E0417 02:52:27.069754 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:27.073797 kubelet[2815]: W0417 02:52:27.069829 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:27.073797 kubelet[2815]: E0417 02:52:27.069923 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 02:52:27.078527 kubelet[2815]: E0417 02:52:27.077356 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:27.089849 kubelet[2815]: W0417 02:52:27.078516 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:27.090675 kubelet[2815]: E0417 02:52:27.090457 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 02:52:27.099632 containerd[1591]: time="2026-04-17T02:52:27.099515801Z" level=info msg="CreateContainer within sandbox \"596c48fb2b72db55d746dd322e977311aaeba1a2a22c59d5edec09ee8e4f0a58\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"be3e3b4ec5f9974c1fb865ab923aac83d2558d139da0065c781849fb1f6c340e\"" Apr 17 02:52:27.109206 kubelet[2815]: E0417 02:52:27.108867 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:27.109206 kubelet[2815]: W0417 02:52:27.108929 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:27.109206 kubelet[2815]: E0417 02:52:27.109015 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 02:52:27.119842 containerd[1591]: time="2026-04-17T02:52:27.118810082Z" level=info msg="StartContainer for \"be3e3b4ec5f9974c1fb865ab923aac83d2558d139da0065c781849fb1f6c340e\"" Apr 17 02:52:27.124481 kubelet[2815]: E0417 02:52:27.123357 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:27.126081 kubelet[2815]: W0417 02:52:27.124683 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:27.126081 kubelet[2815]: E0417 02:52:27.126076 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 02:52:27.142674 kubelet[2815]: E0417 02:52:27.141763 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:27.143302 containerd[1591]: time="2026-04-17T02:52:27.143011977Z" level=info msg="connecting to shim be3e3b4ec5f9974c1fb865ab923aac83d2558d139da0065c781849fb1f6c340e" address="unix:///run/containerd/s/8503a913e4bd2dd78686e347d2cc5aab1d98c77fb72551513649a5a73f2273b8" protocol=ttrpc version=3 Apr 17 02:52:27.146321 kubelet[2815]: W0417 02:52:27.146021 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:27.157361 kubelet[2815]: E0417 02:52:27.154548 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 02:52:27.165019 kubelet[2815]: E0417 02:52:27.163875 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:27.168475 kubelet[2815]: W0417 02:52:27.167084 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:27.168903 kubelet[2815]: E0417 02:52:27.168589 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 02:52:27.175385 kubelet[2815]: E0417 02:52:27.174698 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:27.175770 kubelet[2815]: W0417 02:52:27.175564 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:27.176968 kubelet[2815]: E0417 02:52:27.175700 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 02:52:27.182670 kubelet[2815]: E0417 02:52:27.182502 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:27.183027 kubelet[2815]: W0417 02:52:27.182664 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:27.183027 kubelet[2815]: E0417 02:52:27.182836 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 02:52:27.184907 kubelet[2815]: E0417 02:52:27.183216 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:27.184907 kubelet[2815]: W0417 02:52:27.183226 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:27.184907 kubelet[2815]: E0417 02:52:27.183243 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 02:52:27.184907 kubelet[2815]: E0417 02:52:27.183846 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:27.184907 kubelet[2815]: W0417 02:52:27.183854 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:27.184907 kubelet[2815]: E0417 02:52:27.183864 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 02:52:27.241561 kubelet[2815]: E0417 02:52:27.241410 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:27.241561 kubelet[2815]: W0417 02:52:27.241529 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:27.241561 kubelet[2815]: E0417 02:52:27.241626 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 02:52:27.253364 kubelet[2815]: E0417 02:52:27.251809 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:27.253562 kubelet[2815]: W0417 02:52:27.253379 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:27.253562 kubelet[2815]: E0417 02:52:27.253463 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 02:52:27.260599 kubelet[2815]: E0417 02:52:27.260566 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:27.260599 kubelet[2815]: W0417 02:52:27.260593 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:27.260886 kubelet[2815]: E0417 02:52:27.260640 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 02:52:27.367088 kubelet[2815]: E0417 02:52:27.356995 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:27.367088 kubelet[2815]: W0417 02:52:27.357266 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:27.367088 kubelet[2815]: E0417 02:52:27.357362 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 02:52:27.367466 kubelet[2815]: E0417 02:52:27.365039 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:27.367466 kubelet[2815]: W0417 02:52:27.367338 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:27.367518 kubelet[2815]: E0417 02:52:27.367459 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 02:52:27.373998 systemd[1]: Started cri-containerd-be3e3b4ec5f9974c1fb865ab923aac83d2558d139da0065c781849fb1f6c340e.scope - libcontainer container be3e3b4ec5f9974c1fb865ab923aac83d2558d139da0065c781849fb1f6c340e. 
Apr 17 02:52:27.398809 kubelet[2815]: E0417 02:52:27.396991 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:27.398809 kubelet[2815]: W0417 02:52:27.397123 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:27.398809 kubelet[2815]: E0417 02:52:27.397197 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 02:52:27.406119 kubelet[2815]: E0417 02:52:27.404017 2815 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 02:52:27.408832 kubelet[2815]: W0417 02:52:27.408331 2815 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 02:52:27.411213 kubelet[2815]: E0417 02:52:27.410309 2815 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 02:52:27.907641 systemd[1]: cri-containerd-be3e3b4ec5f9974c1fb865ab923aac83d2558d139da0065c781849fb1f6c340e.scope: Deactivated successfully. 
Apr 17 02:52:27.910300 containerd[1591]: time="2026-04-17T02:52:27.910125833Z" level=info msg="StartContainer for \"be3e3b4ec5f9974c1fb865ab923aac83d2558d139da0065c781849fb1f6c340e\" returns successfully" Apr 17 02:52:27.927902 containerd[1591]: time="2026-04-17T02:52:27.927367672Z" level=info msg="received container exit event container_id:\"be3e3b4ec5f9974c1fb865ab923aac83d2558d139da0065c781849fb1f6c340e\" id:\"be3e3b4ec5f9974c1fb865ab923aac83d2558d139da0065c781849fb1f6c340e\" pid:3634 exited_at:{seconds:1776394347 nanos:920641389}" Apr 17 02:52:28.045902 kubelet[2815]: E0417 02:52:28.043497 2815 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Apr 17 02:52:28.129521 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-be3e3b4ec5f9974c1fb865ab923aac83d2558d139da0065c781849fb1f6c340e-rootfs.mount: Deactivated successfully. Apr 17 02:52:28.159097 kubelet[2815]: E0417 02:52:28.159022 2815 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-vnr2q" podUID="f2c15a0f-c771-44b2-be28-01bbe4cb1c8b" Apr 17 02:52:29.085393 kubelet[2815]: E0417 02:52:29.084867 2815 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Apr 17 02:52:29.135291 containerd[1591]: time="2026-04-17T02:52:29.135047175Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.31.4\"" Apr 17 02:52:30.151672 kubelet[2815]: E0417 02:52:30.151578 2815 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: cni plugin not initialized" pod="calico-system/csi-node-driver-vnr2q" podUID="f2c15a0f-c771-44b2-be28-01bbe4cb1c8b" Apr 17 02:52:32.149143 kubelet[2815]: E0417 02:52:32.148977 2815 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-vnr2q" podUID="f2c15a0f-c771-44b2-be28-01bbe4cb1c8b" Apr 17 02:52:34.158134 kubelet[2815]: E0417 02:52:34.157437 2815 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-vnr2q" podUID="f2c15a0f-c771-44b2-be28-01bbe4cb1c8b" Apr 17 02:52:36.159567 kubelet[2815]: E0417 02:52:36.156855 2815 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-vnr2q" podUID="f2c15a0f-c771-44b2-be28-01bbe4cb1c8b" Apr 17 02:52:38.179796 kubelet[2815]: E0417 02:52:38.178310 2815 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-vnr2q" podUID="f2c15a0f-c771-44b2-be28-01bbe4cb1c8b" Apr 17 02:52:40.155671 kubelet[2815]: E0417 02:52:40.154514 2815 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-vnr2q" 
podUID="f2c15a0f-c771-44b2-be28-01bbe4cb1c8b"
Apr 17 02:52:41.161439 kubelet[2815]: E0417 02:52:41.161317 2815 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Apr 17 02:52:42.166514 kubelet[2815]: E0417 02:52:42.163414 2815 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-vnr2q" podUID="f2c15a0f-c771-44b2-be28-01bbe4cb1c8b"
Apr 17 02:52:44.149480 kubelet[2815]: E0417 02:52:44.149240 2815 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-vnr2q" podUID="f2c15a0f-c771-44b2-be28-01bbe4cb1c8b"
Apr 17 02:52:46.148966 kubelet[2815]: E0417 02:52:46.147043 2815 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-vnr2q" podUID="f2c15a0f-c771-44b2-be28-01bbe4cb1c8b"
Apr 17 02:52:48.153311 kubelet[2815]: E0417 02:52:48.153239 2815 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-vnr2q" podUID="f2c15a0f-c771-44b2-be28-01bbe4cb1c8b"
Apr 17 02:52:48.157015 kubelet[2815]: E0417 02:52:48.156880 2815 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Apr 17 02:52:49.152619 kubelet[2815]: E0417 02:52:49.152539 2815 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Apr 17 02:52:50.198319 kubelet[2815]: E0417 02:52:50.198272 2815 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-vnr2q" podUID="f2c15a0f-c771-44b2-be28-01bbe4cb1c8b"
Apr 17 02:52:51.936864 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount704650553.mount: Deactivated successfully.
Apr 17 02:52:52.082397 containerd[1591]: time="2026-04-17T02:52:52.082249472Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 17 02:52:52.190253 containerd[1591]: time="2026-04-17T02:52:52.084957914Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.31.4: active requests=0, bytes read=159838564"
Apr 17 02:52:52.190253 containerd[1591]: time="2026-04-17T02:52:52.185255144Z" level=info msg="ImageCreate event name:\"sha256:e6536b93706eda782f82ebadcac3559cb61801d09f982cc0533a134e6a8e1acf\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 17 02:52:52.212126 kubelet[2815]: E0417 02:52:52.211761 2815 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-vnr2q" podUID="f2c15a0f-c771-44b2-be28-01bbe4cb1c8b"
Apr 17 02:52:52.223593 containerd[1591]: time="2026-04-17T02:52:52.223082559Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:22b9d32dc7480c96272121d5682d53424c6e58653c60fa869b61a1758a11d77f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 17 02:52:52.228751 kubelet[2815]: E0417 02:52:52.228062 2815 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Apr 17 02:52:52.231786 containerd[1591]: time="2026-04-17T02:52:52.231516381Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.31.4\" with image id \"sha256:e6536b93706eda782f82ebadcac3559cb61801d09f982cc0533a134e6a8e1acf\", repo tag \"ghcr.io/flatcar/calico/node:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/node@sha256:22b9d32dc7480c96272121d5682d53424c6e58653c60fa869b61a1758a11d77f\", size \"159838426\" in 23.096138423s"
Apr 17 02:52:52.232076 containerd[1591]: time="2026-04-17T02:52:52.231834592Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.31.4\" returns image reference \"sha256:e6536b93706eda782f82ebadcac3559cb61801d09f982cc0533a134e6a8e1acf\""
Apr 17 02:52:52.262409 containerd[1591]: time="2026-04-17T02:52:52.262069992Z" level=info msg="CreateContainer within sandbox \"596c48fb2b72db55d746dd322e977311aaeba1a2a22c59d5edec09ee8e4f0a58\" for container &ContainerMetadata{Name:ebpf-bootstrap,Attempt:0,}"
Apr 17 02:52:52.522256 containerd[1591]: time="2026-04-17T02:52:52.522139859Z" level=info msg="Container 5a89b90594254cab3b424d4f30c24552a1235c6b760b138b3e5e5e6738150316: CDI devices from CRI Config.CDIDevices: []"
Apr 17 02:52:52.546384 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1675676742.mount: Deactivated successfully.
Apr 17 02:52:52.689224 containerd[1591]: time="2026-04-17T02:52:52.688632732Z" level=info msg="CreateContainer within sandbox \"596c48fb2b72db55d746dd322e977311aaeba1a2a22c59d5edec09ee8e4f0a58\" for &ContainerMetadata{Name:ebpf-bootstrap,Attempt:0,} returns container id \"5a89b90594254cab3b424d4f30c24552a1235c6b760b138b3e5e5e6738150316\""
Apr 17 02:52:52.702314 containerd[1591]: time="2026-04-17T02:52:52.699579995Z" level=info msg="StartContainer for \"5a89b90594254cab3b424d4f30c24552a1235c6b760b138b3e5e5e6738150316\""
Apr 17 02:52:52.763502 containerd[1591]: time="2026-04-17T02:52:52.763445884Z" level=info msg="connecting to shim 5a89b90594254cab3b424d4f30c24552a1235c6b760b138b3e5e5e6738150316" address="unix:///run/containerd/s/8503a913e4bd2dd78686e347d2cc5aab1d98c77fb72551513649a5a73f2273b8" protocol=ttrpc version=3
Apr 17 02:52:52.850953 systemd[1]: Started cri-containerd-5a89b90594254cab3b424d4f30c24552a1235c6b760b138b3e5e5e6738150316.scope - libcontainer container 5a89b90594254cab3b424d4f30c24552a1235c6b760b138b3e5e5e6738150316.
Apr 17 02:52:53.088327 containerd[1591]: time="2026-04-17T02:52:53.088243996Z" level=info msg="StartContainer for \"5a89b90594254cab3b424d4f30c24552a1235c6b760b138b3e5e5e6738150316\" returns successfully"
Apr 17 02:52:53.518492 systemd[1]: cri-containerd-5a89b90594254cab3b424d4f30c24552a1235c6b760b138b3e5e5e6738150316.scope: Deactivated successfully.
Apr 17 02:52:53.529106 containerd[1591]: time="2026-04-17T02:52:53.528957705Z" level=info msg="received container exit event container_id:\"5a89b90594254cab3b424d4f30c24552a1235c6b760b138b3e5e5e6738150316\" id:\"5a89b90594254cab3b424d4f30c24552a1235c6b760b138b3e5e5e6738150316\" pid:3699 exited_at:{seconds:1776394373 nanos:519468607}"
Apr 17 02:52:53.655629 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-5a89b90594254cab3b424d4f30c24552a1235c6b760b138b3e5e5e6738150316-rootfs.mount: Deactivated successfully.
Apr 17 02:52:53.876212 containerd[1591]: time="2026-04-17T02:52:53.873406117Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.31.4\""
Apr 17 02:52:54.227858 kubelet[2815]: E0417 02:52:54.226536 2815 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-vnr2q" podUID="f2c15a0f-c771-44b2-be28-01bbe4cb1c8b"
Apr 17 02:52:56.155662 kubelet[2815]: E0417 02:52:56.155464 2815 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-vnr2q" podUID="f2c15a0f-c771-44b2-be28-01bbe4cb1c8b"
Apr 17 02:52:58.170794 kubelet[2815]: E0417 02:52:58.169225 2815 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-vnr2q" podUID="f2c15a0f-c771-44b2-be28-01bbe4cb1c8b"
Apr 17 02:52:59.873469 containerd[1591]: time="2026-04-17T02:52:59.873396844Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 17 02:52:59.876891 containerd[1591]: time="2026-04-17T02:52:59.874633392Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.31.4: active requests=0, bytes read=70611671"
Apr 17 02:52:59.883046 containerd[1591]: time="2026-04-17T02:52:59.882790308Z" level=info msg="ImageCreate event name:\"sha256:c433a27dd94ce9242338eece49f11629412dd42552fed314746fcf16ea958b2b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 17 02:52:59.892811 containerd[1591]: time="2026-04-17T02:52:59.892527773Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:f1c5d9a6df01061c5faec4c4b59fb9ba69f8f5164b51e01ea8daa8e373111a04\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 17 02:52:59.900029 containerd[1591]: time="2026-04-17T02:52:59.899825611Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.31.4\" with image id \"sha256:c433a27dd94ce9242338eece49f11629412dd42552fed314746fcf16ea958b2b\", repo tag \"ghcr.io/flatcar/calico/cni:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:f1c5d9a6df01061c5faec4c4b59fb9ba69f8f5164b51e01ea8daa8e373111a04\", size \"72167716\" in 6.026357874s"
Apr 17 02:52:59.901705 containerd[1591]: time="2026-04-17T02:52:59.900086282Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.31.4\" returns image reference \"sha256:c433a27dd94ce9242338eece49f11629412dd42552fed314746fcf16ea958b2b\""
Apr 17 02:52:59.996655 containerd[1591]: time="2026-04-17T02:52:59.996457462Z" level=info msg="CreateContainer within sandbox \"596c48fb2b72db55d746dd322e977311aaeba1a2a22c59d5edec09ee8e4f0a58\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}"
Apr 17 02:53:00.022791 containerd[1591]: time="2026-04-17T02:53:00.021084841Z" level=info msg="Container 77eb3f9503c53852756ce5e4559df523aac6b7260481d02bee3f3ffb9590e4bd: CDI devices from CRI Config.CDIDevices: []"
Apr 17 02:53:00.061479 containerd[1591]: time="2026-04-17T02:53:00.061375397Z" level=info msg="CreateContainer within sandbox \"596c48fb2b72db55d746dd322e977311aaeba1a2a22c59d5edec09ee8e4f0a58\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"77eb3f9503c53852756ce5e4559df523aac6b7260481d02bee3f3ffb9590e4bd\""
Apr 17 02:53:00.067271 containerd[1591]: time="2026-04-17T02:53:00.066955909Z" level=info msg="StartContainer for \"77eb3f9503c53852756ce5e4559df523aac6b7260481d02bee3f3ffb9590e4bd\""
Apr 17 02:53:00.077250 containerd[1591]: time="2026-04-17T02:53:00.076960768Z" level=info msg="connecting to shim 77eb3f9503c53852756ce5e4559df523aac6b7260481d02bee3f3ffb9590e4bd" address="unix:///run/containerd/s/8503a913e4bd2dd78686e347d2cc5aab1d98c77fb72551513649a5a73f2273b8" protocol=ttrpc version=3
Apr 17 02:53:00.168153 kubelet[2815]: E0417 02:53:00.167012 2815 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-vnr2q" podUID="f2c15a0f-c771-44b2-be28-01bbe4cb1c8b"
Apr 17 02:53:00.192109 systemd[1]: Started cri-containerd-77eb3f9503c53852756ce5e4559df523aac6b7260481d02bee3f3ffb9590e4bd.scope - libcontainer container 77eb3f9503c53852756ce5e4559df523aac6b7260481d02bee3f3ffb9590e4bd.
Apr 17 02:53:00.420310 containerd[1591]: time="2026-04-17T02:53:00.419876358Z" level=info msg="StartContainer for \"77eb3f9503c53852756ce5e4559df523aac6b7260481d02bee3f3ffb9590e4bd\" returns successfully"
Apr 17 02:53:02.174854 kubelet[2815]: E0417 02:53:02.174408 2815 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-vnr2q" podUID="f2c15a0f-c771-44b2-be28-01bbe4cb1c8b"
Apr 17 02:53:03.950759 systemd[1]: cri-containerd-77eb3f9503c53852756ce5e4559df523aac6b7260481d02bee3f3ffb9590e4bd.scope: Deactivated successfully.
Apr 17 02:53:03.955807 systemd[1]: cri-containerd-77eb3f9503c53852756ce5e4559df523aac6b7260481d02bee3f3ffb9590e4bd.scope: Consumed 2.526s CPU time, 180.3M memory peak, 4.8M read from disk, 177M written to disk.
Apr 17 02:53:04.021172 containerd[1591]: time="2026-04-17T02:53:04.020693858Z" level=info msg="received container exit event container_id:\"77eb3f9503c53852756ce5e4559df523aac6b7260481d02bee3f3ffb9590e4bd\" id:\"77eb3f9503c53852756ce5e4559df523aac6b7260481d02bee3f3ffb9590e4bd\" pid:3758 exited_at:{seconds:1776394384 nanos:18581998}"
Apr 17 02:53:04.049996 kubelet[2815]: I0417 02:53:04.047691 2815 kubelet_node_status.go:439] "Fast updating node status as it just became ready"
Apr 17 02:53:04.192992 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-77eb3f9503c53852756ce5e4559df523aac6b7260481d02bee3f3ffb9590e4bd-rootfs.mount: Deactivated successfully.
Apr 17 02:53:04.280058 systemd[1]: Created slice kubepods-besteffort-podf2c15a0f_c771_44b2_be28_01bbe4cb1c8b.slice - libcontainer container kubepods-besteffort-podf2c15a0f_c771_44b2_be28_01bbe4cb1c8b.slice.
Apr 17 02:53:04.348445 containerd[1591]: time="2026-04-17T02:53:04.348365265Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-vnr2q,Uid:f2c15a0f-c771-44b2-be28-01bbe4cb1c8b,Namespace:calico-system,Attempt:0,}"
Apr 17 02:53:04.710396 systemd[1]: Created slice kubepods-besteffort-pod359f9669_8513_45dd_ad4a_a69e69851e6f.slice - libcontainer container kubepods-besteffort-pod359f9669_8513_45dd_ad4a_a69e69851e6f.slice.
Apr 17 02:53:04.778036 systemd[1]: Created slice kubepods-besteffort-pode186d35c_95f3_490a_bcc6_ab6d243181fe.slice - libcontainer container kubepods-besteffort-pode186d35c_95f3_490a_bcc6_ab6d243181fe.slice.
Apr 17 02:53:04.822946 kubelet[2815]: I0417 02:53:04.822561 2815 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e186d35c-95f3-490a-bcc6-ab6d243181fe-tigera-ca-bundle\") pod \"calico-kube-controllers-6fbccc5db4-vvpvq\" (UID: \"e186d35c-95f3-490a-bcc6-ab6d243181fe\") " pod="calico-system/calico-kube-controllers-6fbccc5db4-vvpvq"
Apr 17 02:53:04.841844 kubelet[2815]: I0417 02:53:04.823494 2815 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bgcf\" (UniqueName: \"kubernetes.io/projected/e186d35c-95f3-490a-bcc6-ab6d243181fe-kube-api-access-7bgcf\") pod \"calico-kube-controllers-6fbccc5db4-vvpvq\" (UID: \"e186d35c-95f3-490a-bcc6-ab6d243181fe\") " pod="calico-system/calico-kube-controllers-6fbccc5db4-vvpvq"
Apr 17 02:53:04.841844 kubelet[2815]: I0417 02:53:04.823556 2815 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/359f9669-8513-45dd-ad4a-a69e69851e6f-calico-apiserver-certs\") pod \"calico-apiserver-58fdb9b8fb-4dckf\" (UID: \"359f9669-8513-45dd-ad4a-a69e69851e6f\") " pod="calico-system/calico-apiserver-58fdb9b8fb-4dckf"
Apr 17 02:53:04.841844 kubelet[2815]: I0417 02:53:04.823578 2815 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6k5q\" (UniqueName: \"kubernetes.io/projected/359f9669-8513-45dd-ad4a-a69e69851e6f-kube-api-access-r6k5q\") pod \"calico-apiserver-58fdb9b8fb-4dckf\" (UID: \"359f9669-8513-45dd-ad4a-a69e69851e6f\") " pod="calico-system/calico-apiserver-58fdb9b8fb-4dckf"
Apr 17 02:53:04.889828 systemd[1]: Created slice kubepods-besteffort-pod0877677f_504a_4651_a066_b1c4f2fa0fee.slice - libcontainer container kubepods-besteffort-pod0877677f_504a_4651_a066_b1c4f2fa0fee.slice.
Apr 17 02:53:04.952069 kubelet[2815]: I0417 02:53:04.952002 2815 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8336d134-f31e-409c-b432-def12e532bd8-config-volume\") pod \"coredns-66bc5c9577-rjxxk\" (UID: \"8336d134-f31e-409c-b432-def12e532bd8\") " pod="kube-system/coredns-66bc5c9577-rjxxk"
Apr 17 02:53:04.952359 kubelet[2815]: I0417 02:53:04.952130 2815 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/503bddcb-5ae9-4596-93b1-28b74f7a0d4b-goldmane-key-pair\") pod \"goldmane-cccfbd5cf-7sj6j\" (UID: \"503bddcb-5ae9-4596-93b1-28b74f7a0d4b\") " pod="calico-system/goldmane-cccfbd5cf-7sj6j"
Apr 17 02:53:04.952359 kubelet[2815]: I0417 02:53:04.952155 2815 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhsx2\" (UniqueName: \"kubernetes.io/projected/503bddcb-5ae9-4596-93b1-28b74f7a0d4b-kube-api-access-lhsx2\") pod \"goldmane-cccfbd5cf-7sj6j\" (UID: \"503bddcb-5ae9-4596-93b1-28b74f7a0d4b\") " pod="calico-system/goldmane-cccfbd5cf-7sj6j"
Apr 17 02:53:04.952359 kubelet[2815]: I0417 02:53:04.952187 2815 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkz9k\" (UniqueName: \"kubernetes.io/projected/8336d134-f31e-409c-b432-def12e532bd8-kube-api-access-pkz9k\") pod \"coredns-66bc5c9577-rjxxk\" (UID: \"8336d134-f31e-409c-b432-def12e532bd8\") " pod="kube-system/coredns-66bc5c9577-rjxxk"
Apr 17 02:53:04.952359 kubelet[2815]: I0417 02:53:04.952212 2815 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/503bddcb-5ae9-4596-93b1-28b74f7a0d4b-config\") pod \"goldmane-cccfbd5cf-7sj6j\" (UID: \"503bddcb-5ae9-4596-93b1-28b74f7a0d4b\") " pod="calico-system/goldmane-cccfbd5cf-7sj6j"
Apr 17 02:53:04.952359 kubelet[2815]: I0417 02:53:04.952231 2815 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/0877677f-504a-4651-a066-b1c4f2fa0fee-calico-apiserver-certs\") pod \"calico-apiserver-58fdb9b8fb-rtk29\" (UID: \"0877677f-504a-4651-a066-b1c4f2fa0fee\") " pod="calico-system/calico-apiserver-58fdb9b8fb-rtk29"
Apr 17 02:53:04.952592 kubelet[2815]: I0417 02:53:04.952273 2815 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e6cda6d1-e063-4ab6-9303-448552be0606-whisker-ca-bundle\") pod \"whisker-758f6d4c78-q7bnk\" (UID: \"e6cda6d1-e063-4ab6-9303-448552be0606\") " pod="calico-system/whisker-758f6d4c78-q7bnk"
Apr 17 02:53:04.952592 kubelet[2815]: I0417 02:53:04.952294 2815 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fp5c\" (UniqueName: \"kubernetes.io/projected/e6cda6d1-e063-4ab6-9303-448552be0606-kube-api-access-2fp5c\") pod \"whisker-758f6d4c78-q7bnk\" (UID: \"e6cda6d1-e063-4ab6-9303-448552be0606\") " pod="calico-system/whisker-758f6d4c78-q7bnk"
Apr 17 02:53:04.952592 kubelet[2815]: I0417 02:53:04.952307 2815 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9c3706c6-5e02-4b80-b77e-666efd679ffb-config-volume\") pod \"coredns-66bc5c9577-r6dvt\" (UID: \"9c3706c6-5e02-4b80-b77e-666efd679ffb\") " pod="kube-system/coredns-66bc5c9577-r6dvt"
Apr 17 02:53:04.952592 kubelet[2815]: I0417 02:53:04.952324 2815 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/e6cda6d1-e063-4ab6-9303-448552be0606-nginx-config\") pod \"whisker-758f6d4c78-q7bnk\" (UID: \"e6cda6d1-e063-4ab6-9303-448552be0606\") " pod="calico-system/whisker-758f6d4c78-q7bnk"
Apr 17 02:53:04.952592 kubelet[2815]: I0417 02:53:04.952341 2815 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8m9b6\" (UniqueName: \"kubernetes.io/projected/0877677f-504a-4651-a066-b1c4f2fa0fee-kube-api-access-8m9b6\") pod \"calico-apiserver-58fdb9b8fb-rtk29\" (UID: \"0877677f-504a-4651-a066-b1c4f2fa0fee\") " pod="calico-system/calico-apiserver-58fdb9b8fb-rtk29"
Apr 17 02:53:04.953306 kubelet[2815]: I0417 02:53:04.952353 2815 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffrpx\" (UniqueName: \"kubernetes.io/projected/9c3706c6-5e02-4b80-b77e-666efd679ffb-kube-api-access-ffrpx\") pod \"coredns-66bc5c9577-r6dvt\" (UID: \"9c3706c6-5e02-4b80-b77e-666efd679ffb\") " pod="kube-system/coredns-66bc5c9577-r6dvt"
Apr 17 02:53:04.953306 kubelet[2815]: I0417 02:53:04.952370 2815 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/e6cda6d1-e063-4ab6-9303-448552be0606-whisker-backend-key-pair\") pod \"whisker-758f6d4c78-q7bnk\" (UID: \"e6cda6d1-e063-4ab6-9303-448552be0606\") " pod="calico-system/whisker-758f6d4c78-q7bnk"
Apr 17 02:53:04.953306 kubelet[2815]: I0417 02:53:04.952384 2815 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/503bddcb-5ae9-4596-93b1-28b74f7a0d4b-goldmane-ca-bundle\") pod \"goldmane-cccfbd5cf-7sj6j\" (UID: \"503bddcb-5ae9-4596-93b1-28b74f7a0d4b\") " pod="calico-system/goldmane-cccfbd5cf-7sj6j"
Apr 17 02:53:04.971144 systemd[1]: Created slice kubepods-besteffort-pode6cda6d1_e063_4ab6_9303_448552be0606.slice - libcontainer container kubepods-besteffort-pode6cda6d1_e063_4ab6_9303_448552be0606.slice.
Apr 17 02:53:05.063655 systemd[1]: Created slice kubepods-besteffort-pod503bddcb_5ae9_4596_93b1_28b74f7a0d4b.slice - libcontainer container kubepods-besteffort-pod503bddcb_5ae9_4596_93b1_28b74f7a0d4b.slice.
Apr 17 02:53:05.222354 systemd[1]: Created slice kubepods-burstable-pod8336d134_f31e_409c_b432_def12e532bd8.slice - libcontainer container kubepods-burstable-pod8336d134_f31e_409c_b432_def12e532bd8.slice.
Apr 17 02:53:05.272384 systemd[1]: Created slice kubepods-burstable-pod9c3706c6_5e02_4b80_b77e_666efd679ffb.slice - libcontainer container kubepods-burstable-pod9c3706c6_5e02_4b80_b77e_666efd679ffb.slice.
Apr 17 02:53:05.611471 containerd[1591]: time="2026-04-17T02:53:05.611333391Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6fbccc5db4-vvpvq,Uid:e186d35c-95f3-490a-bcc6-ab6d243181fe,Namespace:calico-system,Attempt:0,}"
Apr 17 02:53:05.633631 containerd[1591]: time="2026-04-17T02:53:05.633522722Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-58fdb9b8fb-rtk29,Uid:0877677f-504a-4651-a066-b1c4f2fa0fee,Namespace:calico-system,Attempt:0,}"
Apr 17 02:53:05.672957 containerd[1591]: time="2026-04-17T02:53:05.672004906Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-58fdb9b8fb-4dckf,Uid:359f9669-8513-45dd-ad4a-a69e69851e6f,Namespace:calico-system,Attempt:0,}"
Apr 17 02:53:05.766575 containerd[1591]: time="2026-04-17T02:53:05.766333940Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-cccfbd5cf-7sj6j,Uid:503bddcb-5ae9-4596-93b1-28b74f7a0d4b,Namespace:calico-system,Attempt:0,}"
Apr 17 02:53:05.775681 containerd[1591]: time="2026-04-17T02:53:05.775574644Z" level=error msg="Failed to destroy network for sandbox \"9efa112988806545ca0a0af400e4538d909d12bb3f4bb81d87714d044beaf88c\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Apr 17 02:53:05.780829 kubelet[2815]: E0417 02:53:05.779606 2815 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Apr 17 02:53:05.855503 containerd[1591]: time="2026-04-17T02:53:05.851115481Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-r6dvt,Uid:9c3706c6-5e02-4b80-b77e-666efd679ffb,Namespace:kube-system,Attempt:0,}"
Apr 17 02:53:05.875884 kubelet[2815]: E0417 02:53:05.870386 2815 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Apr 17 02:53:05.919444 containerd[1591]: time="2026-04-17T02:53:05.919165240Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-vnr2q,Uid:f2c15a0f-c771-44b2-be28-01bbe4cb1c8b,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"9efa112988806545ca0a0af400e4538d909d12bb3f4bb81d87714d044beaf88c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Apr 17 02:53:05.934024 containerd[1591]: time="2026-04-17T02:53:05.933930642Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-rjxxk,Uid:8336d134-f31e-409c-b432-def12e532bd8,Namespace:kube-system,Attempt:0,}"
Apr 17 02:53:05.938930 kubelet[2815]: E0417 02:53:05.938404 2815 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9efa112988806545ca0a0af400e4538d909d12bb3f4bb81d87714d044beaf88c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Apr 17 02:53:05.938930 kubelet[2815]: E0417 02:53:05.938563 2815 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9efa112988806545ca0a0af400e4538d909d12bb3f4bb81d87714d044beaf88c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-vnr2q"
Apr 17 02:53:05.938930 kubelet[2815]: E0417 02:53:05.938587 2815 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9efa112988806545ca0a0af400e4538d909d12bb3f4bb81d87714d044beaf88c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-vnr2q"
Apr 17 02:53:05.939332 kubelet[2815]: E0417 02:53:05.938648 2815 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-vnr2q_calico-system(f2c15a0f-c771-44b2-be28-01bbe4cb1c8b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-vnr2q_calico-system(f2c15a0f-c771-44b2-be28-01bbe4cb1c8b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"9efa112988806545ca0a0af400e4538d909d12bb3f4bb81d87714d044beaf88c\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-vnr2q" podUID="f2c15a0f-c771-44b2-be28-01bbe4cb1c8b"
Apr 17 02:53:05.944773 containerd[1591]: time="2026-04-17T02:53:05.943645378Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-758f6d4c78-q7bnk,Uid:e6cda6d1-e063-4ab6-9303-448552be0606,Namespace:calico-system,Attempt:0,}"
Apr 17 02:53:06.054831 containerd[1591]: time="2026-04-17T02:53:06.054506828Z" level=info msg="CreateContainer within sandbox \"596c48fb2b72db55d746dd322e977311aaeba1a2a22c59d5edec09ee8e4f0a58\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}"
Apr 17 02:53:06.229890 containerd[1591]: time="2026-04-17T02:53:06.226119893Z" level=info msg="Container 817d2d727bd5978bb30e080d429a071dedf866acf3e576b7803f527897950a8e: CDI devices from CRI Config.CDIDevices: []"
Apr 17 02:53:06.281687 systemd[1]: run-netns-cni\x2ddd72e129\x2d2c22\x2d5feb\x2d4a02\x2d85b7be542b33.mount: Deactivated successfully.
Apr 17 02:53:06.439779 containerd[1591]: time="2026-04-17T02:53:06.438635931Z" level=info msg="CreateContainer within sandbox \"596c48fb2b72db55d746dd322e977311aaeba1a2a22c59d5edec09ee8e4f0a58\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"817d2d727bd5978bb30e080d429a071dedf866acf3e576b7803f527897950a8e\""
Apr 17 02:53:06.452292 containerd[1591]: time="2026-04-17T02:53:06.452103232Z" level=info msg="StartContainer for \"817d2d727bd5978bb30e080d429a071dedf866acf3e576b7803f527897950a8e\""
Apr 17 02:53:06.578999 containerd[1591]: time="2026-04-17T02:53:06.576909528Z" level=info msg="connecting to shim 817d2d727bd5978bb30e080d429a071dedf866acf3e576b7803f527897950a8e" address="unix:///run/containerd/s/8503a913e4bd2dd78686e347d2cc5aab1d98c77fb72551513649a5a73f2273b8" protocol=ttrpc version=3
Apr 17 02:53:07.088258 systemd[1]: Started cri-containerd-817d2d727bd5978bb30e080d429a071dedf866acf3e576b7803f527897950a8e.scope - libcontainer container 817d2d727bd5978bb30e080d429a071dedf866acf3e576b7803f527897950a8e.
Apr 17 02:53:07.980765 containerd[1591]: time="2026-04-17T02:53:07.976505225Z" level=error msg="Failed to destroy network for sandbox \"1cd46ac4c7b8ef0881136a125c904e080b4d2f28880af9b06d9f78a82a5eceb4\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Apr 17 02:53:07.989036 systemd[1]: run-netns-cni\x2d827c661b\x2d9ff7\x2d6e96\x2d8af4\x2dedd5316836cc.mount: Deactivated successfully.
Apr 17 02:53:07.992487 containerd[1591]: time="2026-04-17T02:53:07.991035739Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-rjxxk,Uid:8336d134-f31e-409c-b432-def12e532bd8,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"1cd46ac4c7b8ef0881136a125c904e080b4d2f28880af9b06d9f78a82a5eceb4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Apr 17 02:53:08.010383 kubelet[2815]: E0417 02:53:08.009300 2815 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1cd46ac4c7b8ef0881136a125c904e080b4d2f28880af9b06d9f78a82a5eceb4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Apr 17 02:53:08.011214 kubelet[2815]: E0417 02:53:08.010762 2815 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1cd46ac4c7b8ef0881136a125c904e080b4d2f28880af9b06d9f78a82a5eceb4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-rjxxk"
Apr 17 02:53:08.014358 kubelet[2815]: E0417 02:53:08.011352 2815 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1cd46ac4c7b8ef0881136a125c904e080b4d2f28880af9b06d9f78a82a5eceb4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-rjxxk"
Apr 17 02:53:08.016380 kubelet[2815]: E0417 02:53:08.015428 2815 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-66bc5c9577-rjxxk_kube-system(8336d134-f31e-409c-b432-def12e532bd8)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-66bc5c9577-rjxxk_kube-system(8336d134-f31e-409c-b432-def12e532bd8)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"1cd46ac4c7b8ef0881136a125c904e080b4d2f28880af9b06d9f78a82a5eceb4\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-66bc5c9577-rjxxk" podUID="8336d134-f31e-409c-b432-def12e532bd8"
Apr 17 02:53:08.158804 containerd[1591]: time="2026-04-17T02:53:08.157323650Z" level=error msg="Failed to destroy network for sandbox \"b043394cf3bf43b0d1a656cc79d7117c272ef42124a8f44ee5cc1c553fb96a88\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Apr 17 02:53:08.165358 systemd[1]: run-netns-cni\x2d944332a8\x2dcebb\x2d570b\x2d9089\x2de9892cfe6ae4.mount: Deactivated successfully.
Apr 17 02:53:08.213537 containerd[1591]: time="2026-04-17T02:53:08.213366990Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-58fdb9b8fb-4dckf,Uid:359f9669-8513-45dd-ad4a-a69e69851e6f,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"b043394cf3bf43b0d1a656cc79d7117c272ef42124a8f44ee5cc1c553fb96a88\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Apr 17 02:53:08.214281 kubelet[2815]: E0417 02:53:08.214194 2815 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b043394cf3bf43b0d1a656cc79d7117c272ef42124a8f44ee5cc1c553fb96a88\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Apr 17 02:53:08.214406 kubelet[2815]: E0417 02:53:08.214311 2815 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b043394cf3bf43b0d1a656cc79d7117c272ef42124a8f44ee5cc1c553fb96a88\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-58fdb9b8fb-4dckf"
Apr 17 02:53:08.214406 kubelet[2815]: E0417 02:53:08.214341 2815 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b043394cf3bf43b0d1a656cc79d7117c272ef42124a8f44ee5cc1c553fb96a88\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-58fdb9b8fb-4dckf"
Apr 17 02:53:08.214530 kubelet[2815]: E0417 02:53:08.214444 2815 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-58fdb9b8fb-4dckf_calico-system(359f9669-8513-45dd-ad4a-a69e69851e6f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-58fdb9b8fb-4dckf_calico-system(359f9669-8513-45dd-ad4a-a69e69851e6f)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"b043394cf3bf43b0d1a656cc79d7117c272ef42124a8f44ee5cc1c553fb96a88\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-apiserver-58fdb9b8fb-4dckf" podUID="359f9669-8513-45dd-ad4a-a69e69851e6f"
Apr 17 02:53:08.531320 containerd[1591]: time="2026-04-17T02:53:08.530391613Z" level=error msg="Failed to destroy network for sandbox \"6f1252023d465e16f97d0fd241e1f8c337e83e646c3589e4eabef63302a14c8b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Apr 17 02:53:08.539951 systemd[1]: run-netns-cni\x2dab52eb3c\x2d9432\x2d7f23\x2dcbcb\x2dd151674eb28d.mount: Deactivated successfully.
Apr 17 02:53:08.589249 containerd[1591]: time="2026-04-17T02:53:08.589063362Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-r6dvt,Uid:9c3706c6-5e02-4b80-b77e-666efd679ffb,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"6f1252023d465e16f97d0fd241e1f8c337e83e646c3589e4eabef63302a14c8b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Apr 17 02:53:08.590308 kubelet[2815]: E0417 02:53:08.590169 2815 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6f1252023d465e16f97d0fd241e1f8c337e83e646c3589e4eabef63302a14c8b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Apr 17 02:53:08.590666 containerd[1591]: time="2026-04-17T02:53:08.590589837Z" level=error msg="Failed to destroy network for sandbox \"23ea60629d36bef69bff30b2389c3b020a62b8400f8908b0a396004b9d21eb04\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Apr 17 02:53:08.591551 kubelet[2815]: E0417 02:53:08.591284 2815 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6f1252023d465e16f97d0fd241e1f8c337e83e646c3589e4eabef63302a14c8b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-r6dvt"
Apr 17 02:53:08.599000 systemd[1]:
run-netns-cni\x2dbb046504\x2d96cf\x2dd5a6\x2d5893\x2d31564aad6aa7.mount: Deactivated successfully. Apr 17 02:53:08.600195 kubelet[2815]: E0417 02:53:08.595502 2815 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6f1252023d465e16f97d0fd241e1f8c337e83e646c3589e4eabef63302a14c8b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-r6dvt" Apr 17 02:53:08.634240 kubelet[2815]: E0417 02:53:08.634102 2815 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-66bc5c9577-r6dvt_kube-system(9c3706c6-5e02-4b80-b77e-666efd679ffb)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-66bc5c9577-r6dvt_kube-system(9c3706c6-5e02-4b80-b77e-666efd679ffb)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"6f1252023d465e16f97d0fd241e1f8c337e83e646c3589e4eabef63302a14c8b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-66bc5c9577-r6dvt" podUID="9c3706c6-5e02-4b80-b77e-666efd679ffb" Apr 17 02:53:08.650810 containerd[1591]: time="2026-04-17T02:53:08.650289846Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-58fdb9b8fb-rtk29,Uid:0877677f-504a-4651-a066-b1c4f2fa0fee,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"23ea60629d36bef69bff30b2389c3b020a62b8400f8908b0a396004b9d21eb04\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 17 02:53:08.654335 
kubelet[2815]: E0417 02:53:08.654105 2815 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"23ea60629d36bef69bff30b2389c3b020a62b8400f8908b0a396004b9d21eb04\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 17 02:53:08.660951 kubelet[2815]: E0417 02:53:08.660067 2815 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"23ea60629d36bef69bff30b2389c3b020a62b8400f8908b0a396004b9d21eb04\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-58fdb9b8fb-rtk29" Apr 17 02:53:08.660951 kubelet[2815]: E0417 02:53:08.660449 2815 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"23ea60629d36bef69bff30b2389c3b020a62b8400f8908b0a396004b9d21eb04\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-58fdb9b8fb-rtk29" Apr 17 02:53:08.664786 kubelet[2815]: E0417 02:53:08.663781 2815 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-58fdb9b8fb-rtk29_calico-system(0877677f-504a-4651-a066-b1c4f2fa0fee)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-58fdb9b8fb-rtk29_calico-system(0877677f-504a-4651-a066-b1c4f2fa0fee)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"23ea60629d36bef69bff30b2389c3b020a62b8400f8908b0a396004b9d21eb04\\\": plugin type=\\\"calico\\\" 
failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-apiserver-58fdb9b8fb-rtk29" podUID="0877677f-504a-4651-a066-b1c4f2fa0fee" Apr 17 02:53:08.840699 containerd[1591]: time="2026-04-17T02:53:08.834536913Z" level=error msg="Failed to destroy network for sandbox \"6998babb826a74bf758829b89891964036502ed2f1012817231b3338a1a32669\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 17 02:53:08.883665 systemd[1]: run-netns-cni\x2d5625c413\x2de9d3\x2dcf7d\x2dae45\x2dfaeba1145a70.mount: Deactivated successfully. Apr 17 02:53:08.890764 containerd[1591]: time="2026-04-17T02:53:08.887329932Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6fbccc5db4-vvpvq,Uid:e186d35c-95f3-490a-bcc6-ab6d243181fe,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"6998babb826a74bf758829b89891964036502ed2f1012817231b3338a1a32669\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 17 02:53:08.946908 kubelet[2815]: E0417 02:53:08.939174 2815 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6998babb826a74bf758829b89891964036502ed2f1012817231b3338a1a32669\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 17 02:53:08.946908 kubelet[2815]: E0417 02:53:08.940477 2815 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup 
network for sandbox \"6998babb826a74bf758829b89891964036502ed2f1012817231b3338a1a32669\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-6fbccc5db4-vvpvq" Apr 17 02:53:08.946908 kubelet[2815]: E0417 02:53:08.940646 2815 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6998babb826a74bf758829b89891964036502ed2f1012817231b3338a1a32669\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-6fbccc5db4-vvpvq" Apr 17 02:53:08.949197 kubelet[2815]: E0417 02:53:08.945476 2815 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-6fbccc5db4-vvpvq_calico-system(e186d35c-95f3-490a-bcc6-ab6d243181fe)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-6fbccc5db4-vvpvq_calico-system(e186d35c-95f3-490a-bcc6-ab6d243181fe)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"6998babb826a74bf758829b89891964036502ed2f1012817231b3338a1a32669\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-6fbccc5db4-vvpvq" podUID="e186d35c-95f3-490a-bcc6-ab6d243181fe" Apr 17 02:53:08.950134 containerd[1591]: time="2026-04-17T02:53:08.949956342Z" level=error msg="Failed to destroy network for sandbox \"3a468a678e2faa29b047fd71b43f1a9c869a642841f139f42b19a761752c54c4\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: 
check that the calico/node container is running and has mounted /var/lib/calico/" Apr 17 02:53:08.954585 containerd[1591]: time="2026-04-17T02:53:08.954387713Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-cccfbd5cf-7sj6j,Uid:503bddcb-5ae9-4596-93b1-28b74f7a0d4b,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"3a468a678e2faa29b047fd71b43f1a9c869a642841f139f42b19a761752c54c4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 17 02:53:08.956159 kubelet[2815]: E0417 02:53:08.955640 2815 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3a468a678e2faa29b047fd71b43f1a9c869a642841f139f42b19a761752c54c4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 17 02:53:08.958621 kubelet[2815]: E0417 02:53:08.958332 2815 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3a468a678e2faa29b047fd71b43f1a9c869a642841f139f42b19a761752c54c4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-cccfbd5cf-7sj6j" Apr 17 02:53:08.969380 kubelet[2815]: E0417 02:53:08.966665 2815 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3a468a678e2faa29b047fd71b43f1a9c869a642841f139f42b19a761752c54c4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container 
is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-cccfbd5cf-7sj6j" Apr 17 02:53:08.969685 containerd[1591]: time="2026-04-17T02:53:08.968902575Z" level=info msg="StartContainer for \"817d2d727bd5978bb30e080d429a071dedf866acf3e576b7803f527897950a8e\" returns successfully" Apr 17 02:53:08.979659 containerd[1591]: time="2026-04-17T02:53:08.974182415Z" level=error msg="Failed to destroy network for sandbox \"f0e1cb218255950c15326bb1ba71bb7d77a63cc706a1104273e824a7949d538f\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 17 02:53:08.980590 kubelet[2815]: E0417 02:53:08.976408 2815 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-cccfbd5cf-7sj6j_calico-system(503bddcb-5ae9-4596-93b1-28b74f7a0d4b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-cccfbd5cf-7sj6j_calico-system(503bddcb-5ae9-4596-93b1-28b74f7a0d4b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"3a468a678e2faa29b047fd71b43f1a9c869a642841f139f42b19a761752c54c4\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-cccfbd5cf-7sj6j" podUID="503bddcb-5ae9-4596-93b1-28b74f7a0d4b" Apr 17 02:53:08.985408 systemd[1]: run-netns-cni\x2d5bdb7ef1\x2d2016\x2d1b6c\x2d5178\x2d2357ba05f706.mount: Deactivated successfully. 
Apr 17 02:53:08.987206 containerd[1591]: time="2026-04-17T02:53:08.987086892Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-758f6d4c78-q7bnk,Uid:e6cda6d1-e063-4ab6-9303-448552be0606,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"f0e1cb218255950c15326bb1ba71bb7d77a63cc706a1104273e824a7949d538f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 17 02:53:08.987804 systemd[1]: run-netns-cni\x2d0e579ed6\x2de6f3\x2d3719\x2dae15\x2dd092cc642158.mount: Deactivated successfully. Apr 17 02:53:09.008166 kubelet[2815]: E0417 02:53:09.007923 2815 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f0e1cb218255950c15326bb1ba71bb7d77a63cc706a1104273e824a7949d538f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 17 02:53:09.010257 kubelet[2815]: E0417 02:53:09.009489 2815 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f0e1cb218255950c15326bb1ba71bb7d77a63cc706a1104273e824a7949d538f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-758f6d4c78-q7bnk" Apr 17 02:53:09.011065 kubelet[2815]: E0417 02:53:09.010411 2815 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f0e1cb218255950c15326bb1ba71bb7d77a63cc706a1104273e824a7949d538f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file 
or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-758f6d4c78-q7bnk" Apr 17 02:53:09.011964 kubelet[2815]: E0417 02:53:09.011107 2815 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-758f6d4c78-q7bnk_calico-system(e6cda6d1-e063-4ab6-9303-448552be0606)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-758f6d4c78-q7bnk_calico-system(e6cda6d1-e063-4ab6-9303-448552be0606)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"f0e1cb218255950c15326bb1ba71bb7d77a63cc706a1104273e824a7949d538f\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-758f6d4c78-q7bnk" podUID="e6cda6d1-e063-4ab6-9303-448552be0606" Apr 17 02:53:09.829734 kubelet[2815]: I0417 02:53:09.829418 2815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-57skb" podStartSLOduration=12.033131086 podStartE2EDuration="50.829358482s" podCreationTimestamp="2026-04-17 02:52:19 +0000 UTC" firstStartedPulling="2026-04-17 02:52:21.154210832 +0000 UTC m=+51.304157938" lastFinishedPulling="2026-04-17 02:52:59.950438232 +0000 UTC m=+90.100385334" observedRunningTime="2026-04-17 02:53:09.824610361 +0000 UTC m=+99.974557472" watchObservedRunningTime="2026-04-17 02:53:09.829358482 +0000 UTC m=+99.979305809" Apr 17 02:53:14.095902 kubelet[2815]: I0417 02:53:14.091616 2815 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e6cda6d1-e063-4ab6-9303-448552be0606-whisker-ca-bundle\") pod \"e6cda6d1-e063-4ab6-9303-448552be0606\" (UID: \"e6cda6d1-e063-4ab6-9303-448552be0606\") " Apr 17 02:53:14.099143 kubelet[2815]: I0417 02:53:14.098622 2815 
operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6cda6d1-e063-4ab6-9303-448552be0606-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "e6cda6d1-e063-4ab6-9303-448552be0606" (UID: "e6cda6d1-e063-4ab6-9303-448552be0606"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 02:53:14.099476 kubelet[2815]: I0417 02:53:14.097704 2815 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2fp5c\" (UniqueName: \"kubernetes.io/projected/e6cda6d1-e063-4ab6-9303-448552be0606-kube-api-access-2fp5c\") pod \"e6cda6d1-e063-4ab6-9303-448552be0606\" (UID: \"e6cda6d1-e063-4ab6-9303-448552be0606\") " Apr 17 02:53:14.099476 kubelet[2815]: I0417 02:53:14.099429 2815 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/e6cda6d1-e063-4ab6-9303-448552be0606-nginx-config\") pod \"e6cda6d1-e063-4ab6-9303-448552be0606\" (UID: \"e6cda6d1-e063-4ab6-9303-448552be0606\") " Apr 17 02:53:14.099542 kubelet[2815]: I0417 02:53:14.099491 2815 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/e6cda6d1-e063-4ab6-9303-448552be0606-whisker-backend-key-pair\") pod \"e6cda6d1-e063-4ab6-9303-448552be0606\" (UID: \"e6cda6d1-e063-4ab6-9303-448552be0606\") " Apr 17 02:53:14.106458 kubelet[2815]: I0417 02:53:14.099937 2815 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e6cda6d1-e063-4ab6-9303-448552be0606-whisker-ca-bundle\") on node \"localhost\" DevicePath \"\"" Apr 17 02:53:14.173823 kubelet[2815]: I0417 02:53:14.172608 2815 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6cda6d1-e063-4ab6-9303-448552be0606-nginx-config" (OuterVolumeSpecName: "nginx-config") pod 
"e6cda6d1-e063-4ab6-9303-448552be0606" (UID: "e6cda6d1-e063-4ab6-9303-448552be0606"). InnerVolumeSpecName "nginx-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 02:53:14.199673 systemd[1]: var-lib-kubelet-pods-e6cda6d1\x2de063\x2d4ab6\x2d9303\x2d448552be0606-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2d2fp5c.mount: Deactivated successfully. Apr 17 02:53:14.208982 kubelet[2815]: I0417 02:53:14.208109 2815 reconciler_common.go:299] "Volume detached for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/e6cda6d1-e063-4ab6-9303-448552be0606-nginx-config\") on node \"localhost\" DevicePath \"\"" Apr 17 02:53:14.239160 systemd[1]: var-lib-kubelet-pods-e6cda6d1\x2de063\x2d4ab6\x2d9303\x2d448552be0606-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Apr 17 02:53:14.256097 kubelet[2815]: I0417 02:53:14.255244 2815 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6cda6d1-e063-4ab6-9303-448552be0606-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "e6cda6d1-e063-4ab6-9303-448552be0606" (UID: "e6cda6d1-e063-4ab6-9303-448552be0606"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 02:53:14.300477 kubelet[2815]: I0417 02:53:14.264623 2815 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6cda6d1-e063-4ab6-9303-448552be0606-kube-api-access-2fp5c" (OuterVolumeSpecName: "kube-api-access-2fp5c") pod "e6cda6d1-e063-4ab6-9303-448552be0606" (UID: "e6cda6d1-e063-4ab6-9303-448552be0606"). InnerVolumeSpecName "kube-api-access-2fp5c". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 02:53:14.324924 kubelet[2815]: I0417 02:53:14.324815 2815 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/e6cda6d1-e063-4ab6-9303-448552be0606-whisker-backend-key-pair\") on node \"localhost\" DevicePath \"\"" Apr 17 02:53:14.347453 systemd[1]: Removed slice kubepods-besteffort-pode6cda6d1_e063_4ab6_9303_448552be0606.slice - libcontainer container kubepods-besteffort-pode6cda6d1_e063_4ab6_9303_448552be0606.slice. Apr 17 02:53:14.469759 kubelet[2815]: I0417 02:53:14.469591 2815 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2fp5c\" (UniqueName: \"kubernetes.io/projected/e6cda6d1-e063-4ab6-9303-448552be0606-kube-api-access-2fp5c\") on node \"localhost\" DevicePath \"\"" Apr 17 02:53:15.541956 systemd[1]: Created slice kubepods-besteffort-pod5465ee87_9c89_4df3_960a_e3af6690dc58.slice - libcontainer container kubepods-besteffort-pod5465ee87_9c89_4df3_960a_e3af6690dc58.slice. 
Apr 17 02:53:15.698597 kubelet[2815]: I0417 02:53:15.698190 2815 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfw2n\" (UniqueName: \"kubernetes.io/projected/5465ee87-9c89-4df3-960a-e3af6690dc58-kube-api-access-gfw2n\") pod \"whisker-74db899c95-7npzx\" (UID: \"5465ee87-9c89-4df3-960a-e3af6690dc58\") " pod="calico-system/whisker-74db899c95-7npzx" Apr 17 02:53:15.700080 kubelet[2815]: I0417 02:53:15.700014 2815 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/5465ee87-9c89-4df3-960a-e3af6690dc58-nginx-config\") pod \"whisker-74db899c95-7npzx\" (UID: \"5465ee87-9c89-4df3-960a-e3af6690dc58\") " pod="calico-system/whisker-74db899c95-7npzx" Apr 17 02:53:15.700149 kubelet[2815]: I0417 02:53:15.700091 2815 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5465ee87-9c89-4df3-960a-e3af6690dc58-whisker-ca-bundle\") pod \"whisker-74db899c95-7npzx\" (UID: \"5465ee87-9c89-4df3-960a-e3af6690dc58\") " pod="calico-system/whisker-74db899c95-7npzx" Apr 17 02:53:15.723979 kubelet[2815]: I0417 02:53:15.720515 2815 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/5465ee87-9c89-4df3-960a-e3af6690dc58-whisker-backend-key-pair\") pod \"whisker-74db899c95-7npzx\" (UID: \"5465ee87-9c89-4df3-960a-e3af6690dc58\") " pod="calico-system/whisker-74db899c95-7npzx" Apr 17 02:53:16.186015 containerd[1591]: time="2026-04-17T02:53:16.185957698Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-74db899c95-7npzx,Uid:5465ee87-9c89-4df3-960a-e3af6690dc58,Namespace:calico-system,Attempt:0,}" Apr 17 02:53:16.258799 kubelet[2815]: I0417 02:53:16.258203 2815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="e6cda6d1-e063-4ab6-9303-448552be0606" path="/var/lib/kubelet/pods/e6cda6d1-e063-4ab6-9303-448552be0606/volumes" Apr 17 02:53:17.166451 containerd[1591]: time="2026-04-17T02:53:17.165666746Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-vnr2q,Uid:f2c15a0f-c771-44b2-be28-01bbe4cb1c8b,Namespace:calico-system,Attempt:0,}" Apr 17 02:53:19.158620 containerd[1591]: time="2026-04-17T02:53:19.158558572Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-58fdb9b8fb-4dckf,Uid:359f9669-8513-45dd-ad4a-a69e69851e6f,Namespace:calico-system,Attempt:0,}" Apr 17 02:53:19.178007 kubelet[2815]: E0417 02:53:19.177556 2815 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Apr 17 02:53:19.274054 containerd[1591]: time="2026-04-17T02:53:19.270404352Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-rjxxk,Uid:8336d134-f31e-409c-b432-def12e532bd8,Namespace:kube-system,Attempt:0,}" Apr 17 02:53:19.484156 systemd-networkd[1476]: calif181cc1c8a9: Link UP Apr 17 02:53:19.493490 systemd-networkd[1476]: calif181cc1c8a9: Gained carrier Apr 17 02:53:19.945303 containerd[1591]: 2026-04-17 02:53:16.456 [ERROR][4171] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Apr 17 02:53:19.945303 containerd[1591]: 2026-04-17 02:53:16.792 [INFO][4171] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-whisker--74db899c95--7npzx-eth0 whisker-74db899c95- calico-system 5465ee87-9c89-4df3-960a-e3af6690dc58 1116 0 2026-04-17 02:53:15 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:74db899c95 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s 
projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s localhost whisker-74db899c95-7npzx eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] calif181cc1c8a9 [] [] }} ContainerID="37f7af7420523bec3640e79187ba7400d0a165ad92fecfaf635a87e2b15dcaa6" Namespace="calico-system" Pod="whisker-74db899c95-7npzx" WorkloadEndpoint="localhost-k8s-whisker--74db899c95--7npzx-" Apr 17 02:53:19.945303 containerd[1591]: 2026-04-17 02:53:16.792 [INFO][4171] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="37f7af7420523bec3640e79187ba7400d0a165ad92fecfaf635a87e2b15dcaa6" Namespace="calico-system" Pod="whisker-74db899c95-7npzx" WorkloadEndpoint="localhost-k8s-whisker--74db899c95--7npzx-eth0" Apr 17 02:53:19.945303 containerd[1591]: 2026-04-17 02:53:17.191 [INFO][4185] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="37f7af7420523bec3640e79187ba7400d0a165ad92fecfaf635a87e2b15dcaa6" HandleID="k8s-pod-network.37f7af7420523bec3640e79187ba7400d0a165ad92fecfaf635a87e2b15dcaa6" Workload="localhost-k8s-whisker--74db899c95--7npzx-eth0" Apr 17 02:53:19.945664 containerd[1591]: 2026-04-17 02:53:17.473 [INFO][4185] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="37f7af7420523bec3640e79187ba7400d0a165ad92fecfaf635a87e2b15dcaa6" HandleID="k8s-pod-network.37f7af7420523bec3640e79187ba7400d0a165ad92fecfaf635a87e2b15dcaa6" Workload="localhost-k8s-whisker--74db899c95--7npzx-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000502a00), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"whisker-74db899c95-7npzx", "timestamp":"2026-04-17 02:53:17.191981315 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc00024e160)} Apr 17 02:53:19.945664 containerd[1591]: 
2026-04-17 02:53:17.473 [INFO][4185] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 17 02:53:19.945664 containerd[1591]: 2026-04-17 02:53:17.473 [INFO][4185] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 17 02:53:19.945664 containerd[1591]: 2026-04-17 02:53:17.473 [INFO][4185] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Apr 17 02:53:19.945664 containerd[1591]: 2026-04-17 02:53:17.544 [INFO][4185] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.37f7af7420523bec3640e79187ba7400d0a165ad92fecfaf635a87e2b15dcaa6" host="localhost" Apr 17 02:53:19.945664 containerd[1591]: 2026-04-17 02:53:17.712 [INFO][4185] ipam/ipam.go 409: Looking up existing affinities for host host="localhost" Apr 17 02:53:19.945664 containerd[1591]: 2026-04-17 02:53:17.917 [INFO][4185] ipam/ipam.go 558: Ran out of existing affine blocks for host host="localhost" Apr 17 02:53:19.945664 containerd[1591]: 2026-04-17 02:53:18.022 [INFO][4185] ipam/ipam.go 575: Tried all affine blocks. 
Looking for an affine block with space, or a new unclaimed block host="localhost" Apr 17 02:53:19.945664 containerd[1591]: 2026-04-17 02:53:18.179 [INFO][4185] ipam/ipam_block_reader_writer.go 158: Found free block: 192.168.88.128/26 Apr 17 02:53:19.945664 containerd[1591]: 2026-04-17 02:53:18.180 [INFO][4185] ipam/ipam.go 588: Found unclaimed block in 156.17559ms host="localhost" subnet=192.168.88.128/26 Apr 17 02:53:19.946493 containerd[1591]: 2026-04-17 02:53:18.180 [INFO][4185] ipam/ipam_block_reader_writer.go 175: Trying to create affinity in pending state host="localhost" subnet=192.168.88.128/26 Apr 17 02:53:19.946493 containerd[1591]: 2026-04-17 02:53:18.370 [INFO][4185] ipam/ipam_block_reader_writer.go 205: Successfully created pending affinity for block host="localhost" subnet=192.168.88.128/26 Apr 17 02:53:19.946493 containerd[1591]: 2026-04-17 02:53:18.375 [INFO][4185] ipam/ipam.go 160: Attempting to load block cidr=192.168.88.128/26 host="localhost" Apr 17 02:53:19.946493 containerd[1591]: 2026-04-17 02:53:18.424 [INFO][4185] ipam/ipam.go 165: The referenced block doesn't exist, trying to create it cidr=192.168.88.128/26 host="localhost" Apr 17 02:53:19.946493 containerd[1591]: 2026-04-17 02:53:18.494 [INFO][4185] ipam/ipam.go 172: Wrote affinity as pending cidr=192.168.88.128/26 host="localhost" Apr 17 02:53:19.946493 containerd[1591]: 2026-04-17 02:53:18.589 [INFO][4185] ipam/ipam.go 181: Attempting to claim the block cidr=192.168.88.128/26 host="localhost" Apr 17 02:53:19.946493 containerd[1591]: 2026-04-17 02:53:18.591 [INFO][4185] ipam/ipam_block_reader_writer.go 226: Attempting to create a new block affinityType="host" host="localhost" subnet=192.168.88.128/26 Apr 17 02:53:19.946493 containerd[1591]: 2026-04-17 02:53:18.748 [INFO][4185] ipam/ipam_block_reader_writer.go 267: Successfully created block Apr 17 02:53:19.946493 containerd[1591]: 2026-04-17 02:53:18.754 [INFO][4185] ipam/ipam_block_reader_writer.go 283: Confirming affinity 
host="localhost" subnet=192.168.88.128/26 Apr 17 02:53:19.946493 containerd[1591]: 2026-04-17 02:53:18.851 [INFO][4185] ipam/ipam_block_reader_writer.go 298: Successfully confirmed affinity host="localhost" subnet=192.168.88.128/26 Apr 17 02:53:19.946493 containerd[1591]: 2026-04-17 02:53:18.852 [INFO][4185] ipam/ipam.go 623: Block '192.168.88.128/26' has 64 free ips which is more than 1 ips required. host="localhost" subnet=192.168.88.128/26 Apr 17 02:53:19.946493 containerd[1591]: 2026-04-17 02:53:18.852 [INFO][4185] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.37f7af7420523bec3640e79187ba7400d0a165ad92fecfaf635a87e2b15dcaa6" host="localhost" Apr 17 02:53:19.946493 containerd[1591]: 2026-04-17 02:53:18.894 [INFO][4185] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.37f7af7420523bec3640e79187ba7400d0a165ad92fecfaf635a87e2b15dcaa6 Apr 17 02:53:19.950198 containerd[1591]: 2026-04-17 02:53:19.070 [INFO][4185] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.37f7af7420523bec3640e79187ba7400d0a165ad92fecfaf635a87e2b15dcaa6" host="localhost" Apr 17 02:53:19.950198 containerd[1591]: 2026-04-17 02:53:19.150 [INFO][4185] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.88.128/26] block=192.168.88.128/26 handle="k8s-pod-network.37f7af7420523bec3640e79187ba7400d0a165ad92fecfaf635a87e2b15dcaa6" host="localhost" Apr 17 02:53:19.950198 containerd[1591]: 2026-04-17 02:53:19.150 [INFO][4185] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.88.128/26] handle="k8s-pod-network.37f7af7420523bec3640e79187ba7400d0a165ad92fecfaf635a87e2b15dcaa6" host="localhost" Apr 17 02:53:19.950198 containerd[1591]: 2026-04-17 02:53:19.150 [INFO][4185] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. 
Apr 17 02:53:19.950198 containerd[1591]: 2026-04-17 02:53:19.150 [INFO][4185] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.88.128/26] IPv6=[] ContainerID="37f7af7420523bec3640e79187ba7400d0a165ad92fecfaf635a87e2b15dcaa6" HandleID="k8s-pod-network.37f7af7420523bec3640e79187ba7400d0a165ad92fecfaf635a87e2b15dcaa6" Workload="localhost-k8s-whisker--74db899c95--7npzx-eth0" Apr 17 02:53:19.950335 containerd[1591]: 2026-04-17 02:53:19.161 [INFO][4171] cni-plugin/k8s.go 418: Populated endpoint ContainerID="37f7af7420523bec3640e79187ba7400d0a165ad92fecfaf635a87e2b15dcaa6" Namespace="calico-system" Pod="whisker-74db899c95-7npzx" WorkloadEndpoint="localhost-k8s-whisker--74db899c95--7npzx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-whisker--74db899c95--7npzx-eth0", GenerateName:"whisker-74db899c95-", Namespace:"calico-system", SelfLink:"", UID:"5465ee87-9c89-4df3-960a-e3af6690dc58", ResourceVersion:"1116", Generation:0, CreationTimestamp:time.Date(2026, time.April, 17, 2, 53, 15, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"74db899c95", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"whisker-74db899c95-7npzx", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.88.128/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calif181cc1c8a9", MAC:"", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 17 02:53:19.950335 containerd[1591]: 2026-04-17 02:53:19.295 [INFO][4171] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.128/32] ContainerID="37f7af7420523bec3640e79187ba7400d0a165ad92fecfaf635a87e2b15dcaa6" Namespace="calico-system" Pod="whisker-74db899c95-7npzx" WorkloadEndpoint="localhost-k8s-whisker--74db899c95--7npzx-eth0" Apr 17 02:53:19.950471 containerd[1591]: 2026-04-17 02:53:19.347 [INFO][4171] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calif181cc1c8a9 ContainerID="37f7af7420523bec3640e79187ba7400d0a165ad92fecfaf635a87e2b15dcaa6" Namespace="calico-system" Pod="whisker-74db899c95-7npzx" WorkloadEndpoint="localhost-k8s-whisker--74db899c95--7npzx-eth0" Apr 17 02:53:19.950471 containerd[1591]: 2026-04-17 02:53:19.530 [INFO][4171] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="37f7af7420523bec3640e79187ba7400d0a165ad92fecfaf635a87e2b15dcaa6" Namespace="calico-system" Pod="whisker-74db899c95-7npzx" WorkloadEndpoint="localhost-k8s-whisker--74db899c95--7npzx-eth0" Apr 17 02:53:19.950519 containerd[1591]: 2026-04-17 02:53:19.534 [INFO][4171] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="37f7af7420523bec3640e79187ba7400d0a165ad92fecfaf635a87e2b15dcaa6" Namespace="calico-system" Pod="whisker-74db899c95-7npzx" WorkloadEndpoint="localhost-k8s-whisker--74db899c95--7npzx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-whisker--74db899c95--7npzx-eth0", GenerateName:"whisker-74db899c95-", Namespace:"calico-system", SelfLink:"", UID:"5465ee87-9c89-4df3-960a-e3af6690dc58", ResourceVersion:"1116", Generation:0, CreationTimestamp:time.Date(2026, time.April, 17, 2, 53, 15, 0, time.Local), DeletionTimestamp:, 
DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"74db899c95", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"37f7af7420523bec3640e79187ba7400d0a165ad92fecfaf635a87e2b15dcaa6", Pod:"whisker-74db899c95-7npzx", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.88.128/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calif181cc1c8a9", MAC:"6a:0e:f2:21:26:d0", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 17 02:53:19.950598 containerd[1591]: 2026-04-17 02:53:19.929 [INFO][4171] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="37f7af7420523bec3640e79187ba7400d0a165ad92fecfaf635a87e2b15dcaa6" Namespace="calico-system" Pod="whisker-74db899c95-7npzx" WorkloadEndpoint="localhost-k8s-whisker--74db899c95--7npzx-eth0" Apr 17 02:53:20.624254 systemd-networkd[1476]: calif181cc1c8a9: Gained IPv6LL Apr 17 02:53:20.649789 containerd[1591]: time="2026-04-17T02:53:20.648461894Z" level=info msg="connecting to shim 37f7af7420523bec3640e79187ba7400d0a165ad92fecfaf635a87e2b15dcaa6" address="unix:///run/containerd/s/970383b779f78143a07d32fb252c4d40a8cf3e5ad1ee0493300d7635da384e5c" namespace=k8s.io protocol=ttrpc version=3 Apr 17 02:53:21.125064 systemd[1]: Started cri-containerd-37f7af7420523bec3640e79187ba7400d0a165ad92fecfaf635a87e2b15dcaa6.scope - libcontainer container 37f7af7420523bec3640e79187ba7400d0a165ad92fecfaf635a87e2b15dcaa6. 
Apr 17 02:53:21.240795 containerd[1591]: time="2026-04-17T02:53:21.239637066Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6fbccc5db4-vvpvq,Uid:e186d35c-95f3-490a-bcc6-ab6d243181fe,Namespace:calico-system,Attempt:0,}" Apr 17 02:53:21.530205 systemd-resolved[1478]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Apr 17 02:53:21.788068 systemd-networkd[1476]: calid8b30c9adb9: Link UP Apr 17 02:53:21.980263 systemd-networkd[1476]: calid8b30c9adb9: Gained carrier Apr 17 02:53:22.011804 containerd[1591]: time="2026-04-17T02:53:22.011307798Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-74db899c95-7npzx,Uid:5465ee87-9c89-4df3-960a-e3af6690dc58,Namespace:calico-system,Attempt:0,} returns sandbox id \"37f7af7420523bec3640e79187ba7400d0a165ad92fecfaf635a87e2b15dcaa6\"" Apr 17 02:53:22.043300 containerd[1591]: time="2026-04-17T02:53:22.041335033Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.31.4\"" Apr 17 02:53:22.246763 containerd[1591]: time="2026-04-17T02:53:22.246631974Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-58fdb9b8fb-rtk29,Uid:0877677f-504a-4651-a066-b1c4f2fa0fee,Namespace:calico-system,Attempt:0,}" Apr 17 02:53:22.320093 kubelet[2815]: E0417 02:53:22.290584 2815 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Apr 17 02:53:22.329827 containerd[1591]: time="2026-04-17T02:53:22.329378612Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-r6dvt,Uid:9c3706c6-5e02-4b80-b77e-666efd679ffb,Namespace:kube-system,Attempt:0,}" Apr 17 02:53:22.465635 containerd[1591]: time="2026-04-17T02:53:22.464458915Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-cccfbd5cf-7sj6j,Uid:503bddcb-5ae9-4596-93b1-28b74f7a0d4b,Namespace:calico-system,Attempt:0,}" 
Apr 17 02:53:22.578941 containerd[1591]: 2026-04-17 02:53:17.655 [ERROR][4193] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Apr 17 02:53:22.578941 containerd[1591]: 2026-04-17 02:53:17.873 [INFO][4193] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-csi--node--driver--vnr2q-eth0 csi-node-driver- calico-system f2c15a0f-c771-44b2-be28-01bbe4cb1c8b 860 0 2026-04-17 02:52:20 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:98cbb5577 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s localhost csi-node-driver-vnr2q eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] calid8b30c9adb9 [] [] }} ContainerID="b4ed30fafdec2717e6d35b5e65ac3cc512d8bea5f2a5fdc4ffec3a0a897f0120" Namespace="calico-system" Pod="csi-node-driver-vnr2q" WorkloadEndpoint="localhost-k8s-csi--node--driver--vnr2q-" Apr 17 02:53:22.578941 containerd[1591]: 2026-04-17 02:53:17.874 [INFO][4193] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="b4ed30fafdec2717e6d35b5e65ac3cc512d8bea5f2a5fdc4ffec3a0a897f0120" Namespace="calico-system" Pod="csi-node-driver-vnr2q" WorkloadEndpoint="localhost-k8s-csi--node--driver--vnr2q-eth0" Apr 17 02:53:22.578941 containerd[1591]: 2026-04-17 02:53:18.426 [INFO][4207] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="b4ed30fafdec2717e6d35b5e65ac3cc512d8bea5f2a5fdc4ffec3a0a897f0120" HandleID="k8s-pod-network.b4ed30fafdec2717e6d35b5e65ac3cc512d8bea5f2a5fdc4ffec3a0a897f0120" Workload="localhost-k8s-csi--node--driver--vnr2q-eth0" Apr 17 02:53:22.579779 containerd[1591]: 2026-04-17 02:53:18.489 
[INFO][4207] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="b4ed30fafdec2717e6d35b5e65ac3cc512d8bea5f2a5fdc4ffec3a0a897f0120" HandleID="k8s-pod-network.b4ed30fafdec2717e6d35b5e65ac3cc512d8bea5f2a5fdc4ffec3a0a897f0120" Workload="localhost-k8s-csi--node--driver--vnr2q-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000277430), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"csi-node-driver-vnr2q", "timestamp":"2026-04-17 02:53:18.426114061 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc000202420)} Apr 17 02:53:22.579779 containerd[1591]: 2026-04-17 02:53:18.493 [INFO][4207] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 17 02:53:22.579779 containerd[1591]: 2026-04-17 02:53:19.151 [INFO][4207] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Apr 17 02:53:22.579779 containerd[1591]: 2026-04-17 02:53:19.154 [INFO][4207] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Apr 17 02:53:22.579779 containerd[1591]: 2026-04-17 02:53:19.474 [INFO][4207] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.b4ed30fafdec2717e6d35b5e65ac3cc512d8bea5f2a5fdc4ffec3a0a897f0120" host="localhost" Apr 17 02:53:22.579779 containerd[1591]: 2026-04-17 02:53:19.956 [INFO][4207] ipam/ipam.go 409: Looking up existing affinities for host host="localhost" Apr 17 02:53:22.579779 containerd[1591]: 2026-04-17 02:53:20.242 [INFO][4207] ipam/ipam.go 526: Trying affinity for 192.168.88.128/26 host="localhost" Apr 17 02:53:22.579779 containerd[1591]: 2026-04-17 02:53:20.600 [INFO][4207] ipam/ipam.go 160: Attempting to load block cidr=192.168.88.128/26 host="localhost" Apr 17 02:53:22.579779 containerd[1591]: 2026-04-17 02:53:20.827 [INFO][4207] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Apr 17 02:53:22.579779 containerd[1591]: 2026-04-17 02:53:20.882 [INFO][4207] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.b4ed30fafdec2717e6d35b5e65ac3cc512d8bea5f2a5fdc4ffec3a0a897f0120" host="localhost" Apr 17 02:53:22.580169 containerd[1591]: 2026-04-17 02:53:21.014 [INFO][4207] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.b4ed30fafdec2717e6d35b5e65ac3cc512d8bea5f2a5fdc4ffec3a0a897f0120 Apr 17 02:53:22.580169 containerd[1591]: 2026-04-17 02:53:21.158 [INFO][4207] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.b4ed30fafdec2717e6d35b5e65ac3cc512d8bea5f2a5fdc4ffec3a0a897f0120" host="localhost" Apr 17 02:53:22.580169 containerd[1591]: 2026-04-17 02:53:21.399 [INFO][4207] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.88.129/26] block=192.168.88.128/26 
handle="k8s-pod-network.b4ed30fafdec2717e6d35b5e65ac3cc512d8bea5f2a5fdc4ffec3a0a897f0120" host="localhost" Apr 17 02:53:22.580169 containerd[1591]: 2026-04-17 02:53:21.460 [INFO][4207] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.88.129/26] handle="k8s-pod-network.b4ed30fafdec2717e6d35b5e65ac3cc512d8bea5f2a5fdc4ffec3a0a897f0120" host="localhost" Apr 17 02:53:22.580169 containerd[1591]: 2026-04-17 02:53:21.479 [INFO][4207] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 17 02:53:22.580169 containerd[1591]: 2026-04-17 02:53:21.480 [INFO][4207] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.88.129/26] IPv6=[] ContainerID="b4ed30fafdec2717e6d35b5e65ac3cc512d8bea5f2a5fdc4ffec3a0a897f0120" HandleID="k8s-pod-network.b4ed30fafdec2717e6d35b5e65ac3cc512d8bea5f2a5fdc4ffec3a0a897f0120" Workload="localhost-k8s-csi--node--driver--vnr2q-eth0" Apr 17 02:53:22.580373 containerd[1591]: 2026-04-17 02:53:21.511 [INFO][4193] cni-plugin/k8s.go 418: Populated endpoint ContainerID="b4ed30fafdec2717e6d35b5e65ac3cc512d8bea5f2a5fdc4ffec3a0a897f0120" Namespace="calico-system" Pod="csi-node-driver-vnr2q" WorkloadEndpoint="localhost-k8s-csi--node--driver--vnr2q-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--vnr2q-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"f2c15a0f-c771-44b2-be28-01bbe4cb1c8b", ResourceVersion:"860", Generation:0, CreationTimestamp:time.Date(2026, time.April, 17, 2, 52, 20, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"98cbb5577", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", 
"projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"csi-node-driver-vnr2q", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calid8b30c9adb9", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 17 02:53:22.580461 containerd[1591]: 2026-04-17 02:53:21.535 [INFO][4193] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.129/32] ContainerID="b4ed30fafdec2717e6d35b5e65ac3cc512d8bea5f2a5fdc4ffec3a0a897f0120" Namespace="calico-system" Pod="csi-node-driver-vnr2q" WorkloadEndpoint="localhost-k8s-csi--node--driver--vnr2q-eth0" Apr 17 02:53:22.580461 containerd[1591]: 2026-04-17 02:53:21.592 [INFO][4193] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calid8b30c9adb9 ContainerID="b4ed30fafdec2717e6d35b5e65ac3cc512d8bea5f2a5fdc4ffec3a0a897f0120" Namespace="calico-system" Pod="csi-node-driver-vnr2q" WorkloadEndpoint="localhost-k8s-csi--node--driver--vnr2q-eth0" Apr 17 02:53:22.580461 containerd[1591]: 2026-04-17 02:53:21.985 [INFO][4193] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="b4ed30fafdec2717e6d35b5e65ac3cc512d8bea5f2a5fdc4ffec3a0a897f0120" Namespace="calico-system" Pod="csi-node-driver-vnr2q" WorkloadEndpoint="localhost-k8s-csi--node--driver--vnr2q-eth0" Apr 17 02:53:22.580533 containerd[1591]: 2026-04-17 02:53:21.987 [INFO][4193] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="b4ed30fafdec2717e6d35b5e65ac3cc512d8bea5f2a5fdc4ffec3a0a897f0120" 
Namespace="calico-system" Pod="csi-node-driver-vnr2q" WorkloadEndpoint="localhost-k8s-csi--node--driver--vnr2q-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--vnr2q-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"f2c15a0f-c771-44b2-be28-01bbe4cb1c8b", ResourceVersion:"860", Generation:0, CreationTimestamp:time.Date(2026, time.April, 17, 2, 52, 20, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"98cbb5577", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"b4ed30fafdec2717e6d35b5e65ac3cc512d8bea5f2a5fdc4ffec3a0a897f0120", Pod:"csi-node-driver-vnr2q", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calid8b30c9adb9", MAC:"aa:26:e9:f0:37:a6", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 17 02:53:22.580610 containerd[1591]: 2026-04-17 02:53:22.568 [INFO][4193] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="b4ed30fafdec2717e6d35b5e65ac3cc512d8bea5f2a5fdc4ffec3a0a897f0120" Namespace="calico-system" Pod="csi-node-driver-vnr2q" 
WorkloadEndpoint="localhost-k8s-csi--node--driver--vnr2q-eth0" Apr 17 02:53:23.246451 containerd[1591]: time="2026-04-17T02:53:23.246382373Z" level=info msg="connecting to shim b4ed30fafdec2717e6d35b5e65ac3cc512d8bea5f2a5fdc4ffec3a0a897f0120" address="unix:///run/containerd/s/b532ae8565d1bdd48c4efb9fba9cad9475541238e86a7d68e1c2d3386f467f11" namespace=k8s.io protocol=ttrpc version=3 Apr 17 02:53:23.622168 systemd-networkd[1476]: calid8b30c9adb9: Gained IPv6LL Apr 17 02:53:23.658165 systemd[1]: Started cri-containerd-b4ed30fafdec2717e6d35b5e65ac3cc512d8bea5f2a5fdc4ffec3a0a897f0120.scope - libcontainer container b4ed30fafdec2717e6d35b5e65ac3cc512d8bea5f2a5fdc4ffec3a0a897f0120. Apr 17 02:53:23.865992 systemd-networkd[1476]: calid58207f1003: Link UP Apr 17 02:53:23.932322 systemd-networkd[1476]: calid58207f1003: Gained carrier Apr 17 02:53:24.189758 systemd-resolved[1478]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Apr 17 02:53:24.253886 containerd[1591]: 2026-04-17 02:53:20.116 [ERROR][4219] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Apr 17 02:53:24.253886 containerd[1591]: 2026-04-17 02:53:20.598 [INFO][4219] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--66bc5c9577--rjxxk-eth0 coredns-66bc5c9577- kube-system 8336d134-f31e-409c-b432-def12e532bd8 1052 0 2026-04-17 02:51:34 +0000 UTC map[k8s-app:kube-dns pod-template-hash:66bc5c9577 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-66bc5c9577-rjxxk eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calid58207f1003 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 } {liveness-probe TCP 8080 0 } {readiness-probe TCP 8181 0 }] [] 
}} ContainerID="3c0547de3e61125c582eff6753d2335168af0f771513a26576aaa2a6e8737171" Namespace="kube-system" Pod="coredns-66bc5c9577-rjxxk" WorkloadEndpoint="localhost-k8s-coredns--66bc5c9577--rjxxk-" Apr 17 02:53:24.253886 containerd[1591]: 2026-04-17 02:53:20.666 [INFO][4219] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="3c0547de3e61125c582eff6753d2335168af0f771513a26576aaa2a6e8737171" Namespace="kube-system" Pod="coredns-66bc5c9577-rjxxk" WorkloadEndpoint="localhost-k8s-coredns--66bc5c9577--rjxxk-eth0" Apr 17 02:53:24.253886 containerd[1591]: 2026-04-17 02:53:21.830 [INFO][4287] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="3c0547de3e61125c582eff6753d2335168af0f771513a26576aaa2a6e8737171" HandleID="k8s-pod-network.3c0547de3e61125c582eff6753d2335168af0f771513a26576aaa2a6e8737171" Workload="localhost-k8s-coredns--66bc5c9577--rjxxk-eth0" Apr 17 02:53:24.255580 containerd[1591]: 2026-04-17 02:53:22.084 [INFO][4287] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="3c0547de3e61125c582eff6753d2335168af0f771513a26576aaa2a6e8737171" HandleID="k8s-pod-network.3c0547de3e61125c582eff6753d2335168af0f771513a26576aaa2a6e8737171" Workload="localhost-k8s-coredns--66bc5c9577--rjxxk-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000430e20), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-66bc5c9577-rjxxk", "timestamp":"2026-04-17 02:53:21.830147097 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc000330dc0)} Apr 17 02:53:24.255580 containerd[1591]: 2026-04-17 02:53:22.088 [INFO][4287] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. 
Apr 17 02:53:24.255580 containerd[1591]: 2026-04-17 02:53:22.097 [INFO][4287] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 17 02:53:24.255580 containerd[1591]: 2026-04-17 02:53:22.097 [INFO][4287] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Apr 17 02:53:24.255580 containerd[1591]: 2026-04-17 02:53:22.348 [INFO][4287] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.3c0547de3e61125c582eff6753d2335168af0f771513a26576aaa2a6e8737171" host="localhost" Apr 17 02:53:24.255580 containerd[1591]: 2026-04-17 02:53:22.962 [INFO][4287] ipam/ipam.go 409: Looking up existing affinities for host host="localhost" Apr 17 02:53:24.255580 containerd[1591]: 2026-04-17 02:53:23.344 [INFO][4287] ipam/ipam.go 526: Trying affinity for 192.168.88.128/26 host="localhost" Apr 17 02:53:24.255580 containerd[1591]: 2026-04-17 02:53:23.412 [INFO][4287] ipam/ipam.go 160: Attempting to load block cidr=192.168.88.128/26 host="localhost" Apr 17 02:53:24.255580 containerd[1591]: 2026-04-17 02:53:23.424 [INFO][4287] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Apr 17 02:53:24.255580 containerd[1591]: 2026-04-17 02:53:23.424 [INFO][4287] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.3c0547de3e61125c582eff6753d2335168af0f771513a26576aaa2a6e8737171" host="localhost" Apr 17 02:53:24.266387 containerd[1591]: 2026-04-17 02:53:23.439 [INFO][4287] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.3c0547de3e61125c582eff6753d2335168af0f771513a26576aaa2a6e8737171 Apr 17 02:53:24.266387 containerd[1591]: 2026-04-17 02:53:23.471 [INFO][4287] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.3c0547de3e61125c582eff6753d2335168af0f771513a26576aaa2a6e8737171" host="localhost" Apr 17 02:53:24.266387 containerd[1591]: 2026-04-17 02:53:23.699 [INFO][4287] 
ipam/ipam.go 1288: Successfully claimed IPs: [192.168.88.130/26] block=192.168.88.128/26 handle="k8s-pod-network.3c0547de3e61125c582eff6753d2335168af0f771513a26576aaa2a6e8737171" host="localhost" Apr 17 02:53:24.266387 containerd[1591]: 2026-04-17 02:53:23.734 [INFO][4287] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.88.130/26] handle="k8s-pod-network.3c0547de3e61125c582eff6753d2335168af0f771513a26576aaa2a6e8737171" host="localhost" Apr 17 02:53:24.266387 containerd[1591]: 2026-04-17 02:53:23.798 [INFO][4287] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 17 02:53:24.266387 containerd[1591]: 2026-04-17 02:53:23.799 [INFO][4287] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.88.130/26] IPv6=[] ContainerID="3c0547de3e61125c582eff6753d2335168af0f771513a26576aaa2a6e8737171" HandleID="k8s-pod-network.3c0547de3e61125c582eff6753d2335168af0f771513a26576aaa2a6e8737171" Workload="localhost-k8s-coredns--66bc5c9577--rjxxk-eth0" Apr 17 02:53:24.266556 containerd[1591]: 2026-04-17 02:53:23.814 [INFO][4219] cni-plugin/k8s.go 418: Populated endpoint ContainerID="3c0547de3e61125c582eff6753d2335168af0f771513a26576aaa2a6e8737171" Namespace="kube-system" Pod="coredns-66bc5c9577-rjxxk" WorkloadEndpoint="localhost-k8s-coredns--66bc5c9577--rjxxk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--66bc5c9577--rjxxk-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"8336d134-f31e-409c-b432-def12e532bd8", ResourceVersion:"1052", Generation:0, CreationTimestamp:time.Date(2026, time.April, 17, 2, 51, 34, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", 
"projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-66bc5c9577-rjxxk", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calid58207f1003", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 17 02:53:24.266556 containerd[1591]: 2026-04-17 02:53:23.832 [INFO][4219] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.130/32] ContainerID="3c0547de3e61125c582eff6753d2335168af0f771513a26576aaa2a6e8737171" Namespace="kube-system" Pod="coredns-66bc5c9577-rjxxk" WorkloadEndpoint="localhost-k8s-coredns--66bc5c9577--rjxxk-eth0" Apr 17 02:53:24.266556 containerd[1591]: 2026-04-17 02:53:23.847 [INFO][4219] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calid58207f1003 ContainerID="3c0547de3e61125c582eff6753d2335168af0f771513a26576aaa2a6e8737171" 
Namespace="kube-system" Pod="coredns-66bc5c9577-rjxxk" WorkloadEndpoint="localhost-k8s-coredns--66bc5c9577--rjxxk-eth0" Apr 17 02:53:24.266556 containerd[1591]: 2026-04-17 02:53:23.934 [INFO][4219] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="3c0547de3e61125c582eff6753d2335168af0f771513a26576aaa2a6e8737171" Namespace="kube-system" Pod="coredns-66bc5c9577-rjxxk" WorkloadEndpoint="localhost-k8s-coredns--66bc5c9577--rjxxk-eth0" Apr 17 02:53:24.266556 containerd[1591]: 2026-04-17 02:53:23.935 [INFO][4219] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="3c0547de3e61125c582eff6753d2335168af0f771513a26576aaa2a6e8737171" Namespace="kube-system" Pod="coredns-66bc5c9577-rjxxk" WorkloadEndpoint="localhost-k8s-coredns--66bc5c9577--rjxxk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--66bc5c9577--rjxxk-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"8336d134-f31e-409c-b432-def12e532bd8", ResourceVersion:"1052", Generation:0, CreationTimestamp:time.Date(2026, time.April, 17, 2, 51, 34, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"3c0547de3e61125c582eff6753d2335168af0f771513a26576aaa2a6e8737171", Pod:"coredns-66bc5c9577-rjxxk", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", 
IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calid58207f1003", MAC:"ce:a3:61:78:ef:d2", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 17 02:53:24.266556 containerd[1591]: 2026-04-17 02:53:24.167 [INFO][4219] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="3c0547de3e61125c582eff6753d2335168af0f771513a26576aaa2a6e8737171" Namespace="kube-system" Pod="coredns-66bc5c9577-rjxxk" WorkloadEndpoint="localhost-k8s-coredns--66bc5c9577--rjxxk-eth0" Apr 17 02:53:24.554766 containerd[1591]: time="2026-04-17T02:53:24.547458458Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-vnr2q,Uid:f2c15a0f-c771-44b2-be28-01bbe4cb1c8b,Namespace:calico-system,Attempt:0,} returns sandbox id \"b4ed30fafdec2717e6d35b5e65ac3cc512d8bea5f2a5fdc4ffec3a0a897f0120\"" Apr 17 02:53:24.778529 containerd[1591]: time="2026-04-17T02:53:24.777679255Z" level=info msg="connecting to shim 3c0547de3e61125c582eff6753d2335168af0f771513a26576aaa2a6e8737171" address="unix:///run/containerd/s/2e133659af84c9176652cc9a837c4d9d60b0c4aca8cfe91abb2a7b8cadd6e805" namespace=k8s.io protocol=ttrpc version=3 Apr 17 02:53:24.931572 systemd[1]: 
Started cri-containerd-3c0547de3e61125c582eff6753d2335168af0f771513a26576aaa2a6e8737171.scope - libcontainer container 3c0547de3e61125c582eff6753d2335168af0f771513a26576aaa2a6e8737171. Apr 17 02:53:25.055453 systemd-resolved[1478]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Apr 17 02:53:25.343384 containerd[1591]: time="2026-04-17T02:53:25.343303896Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-rjxxk,Uid:8336d134-f31e-409c-b432-def12e532bd8,Namespace:kube-system,Attempt:0,} returns sandbox id \"3c0547de3e61125c582eff6753d2335168af0f771513a26576aaa2a6e8737171\"" Apr 17 02:53:25.357844 kubelet[2815]: E0417 02:53:25.357067 2815 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Apr 17 02:53:25.482762 containerd[1591]: time="2026-04-17T02:53:25.481811827Z" level=info msg="CreateContainer within sandbox \"3c0547de3e61125c582eff6753d2335168af0f771513a26576aaa2a6e8737171\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Apr 17 02:53:25.532804 containerd[1591]: time="2026-04-17T02:53:25.532458421Z" level=info msg="Container 0805987caa92c3406b97c07c809aaa395f59fb3fa49a670f129b63c9bfbb4e31: CDI devices from CRI Config.CDIDevices: []" Apr 17 02:53:25.573490 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4024499064.mount: Deactivated successfully. 
Apr 17 02:53:25.649841 containerd[1591]: time="2026-04-17T02:53:25.649064635Z" level=info msg="CreateContainer within sandbox \"3c0547de3e61125c582eff6753d2335168af0f771513a26576aaa2a6e8737171\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"0805987caa92c3406b97c07c809aaa395f59fb3fa49a670f129b63c9bfbb4e31\"" Apr 17 02:53:25.675121 systemd-networkd[1476]: calid58207f1003: Gained IPv6LL Apr 17 02:53:25.680625 containerd[1591]: time="2026-04-17T02:53:25.680510080Z" level=info msg="StartContainer for \"0805987caa92c3406b97c07c809aaa395f59fb3fa49a670f129b63c9bfbb4e31\"" Apr 17 02:53:25.733098 containerd[1591]: time="2026-04-17T02:53:25.733031288Z" level=info msg="connecting to shim 0805987caa92c3406b97c07c809aaa395f59fb3fa49a670f129b63c9bfbb4e31" address="unix:///run/containerd/s/2e133659af84c9176652cc9a837c4d9d60b0c4aca8cfe91abb2a7b8cadd6e805" protocol=ttrpc version=3 Apr 17 02:53:25.786045 systemd[1]: Started cri-containerd-0805987caa92c3406b97c07c809aaa395f59fb3fa49a670f129b63c9bfbb4e31.scope - libcontainer container 0805987caa92c3406b97c07c809aaa395f59fb3fa49a670f129b63c9bfbb4e31. 
Apr 17 02:53:26.192677 containerd[1591]: time="2026-04-17T02:53:26.192352190Z" level=info msg="StartContainer for \"0805987caa92c3406b97c07c809aaa395f59fb3fa49a670f129b63c9bfbb4e31\" returns successfully" Apr 17 02:53:26.533411 kubelet[2815]: E0417 02:53:26.532984 2815 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Apr 17 02:53:26.888798 containerd[1591]: time="2026-04-17T02:53:26.886323904Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 17 02:53:26.897605 containerd[1591]: time="2026-04-17T02:53:26.897495086Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.31.4: active requests=0, bytes read=6039889" Apr 17 02:53:27.063751 containerd[1591]: time="2026-04-17T02:53:27.055596375Z" level=info msg="ImageCreate event name:\"sha256:c02b0051502f3aa7f0815d838ea93b53dfb6bd13f185d229260e08200daf7cf7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 17 02:53:27.063751 containerd[1591]: time="2026-04-17T02:53:27.061223047Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:9690cd395efad501f2e0c40ce4969d87b736ae2e5ed454644e7b0fd8f756bfbc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 17 02:53:27.063751 containerd[1591]: time="2026-04-17T02:53:27.063689849Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.31.4\" with image id \"sha256:c02b0051502f3aa7f0815d838ea93b53dfb6bd13f185d229260e08200daf7cf7\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:9690cd395efad501f2e0c40ce4969d87b736ae2e5ed454644e7b0fd8f756bfbc\", size \"7595926\" in 5.01964314s" Apr 17 02:53:27.063751 containerd[1591]: time="2026-04-17T02:53:27.063759916Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.31.4\" returns 
image reference \"sha256:c02b0051502f3aa7f0815d838ea93b53dfb6bd13f185d229260e08200daf7cf7\"" Apr 17 02:53:27.072771 containerd[1591]: time="2026-04-17T02:53:27.072094250Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.31.4\"" Apr 17 02:53:27.089009 systemd-networkd[1476]: calibb37156acee: Link UP Apr 17 02:53:27.092806 systemd-networkd[1476]: calibb37156acee: Gained carrier Apr 17 02:53:27.097764 containerd[1591]: time="2026-04-17T02:53:27.097390842Z" level=info msg="CreateContainer within sandbox \"37f7af7420523bec3640e79187ba7400d0a165ad92fecfaf635a87e2b15dcaa6\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Apr 17 02:53:27.135800 containerd[1591]: time="2026-04-17T02:53:27.133010625Z" level=info msg="Container 91013b0a19cb7668c0b5d594d112fcfb795d6ac4f3e0af0af0720b1557c09137: CDI devices from CRI Config.CDIDevices: []" Apr 17 02:53:27.270560 containerd[1591]: time="2026-04-17T02:53:27.270483354Z" level=info msg="CreateContainer within sandbox \"37f7af7420523bec3640e79187ba7400d0a165ad92fecfaf635a87e2b15dcaa6\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"91013b0a19cb7668c0b5d594d112fcfb795d6ac4f3e0af0af0720b1557c09137\"" Apr 17 02:53:27.283699 containerd[1591]: time="2026-04-17T02:53:27.281535311Z" level=info msg="StartContainer for \"91013b0a19cb7668c0b5d594d112fcfb795d6ac4f3e0af0af0720b1557c09137\"" Apr 17 02:53:27.292560 containerd[1591]: time="2026-04-17T02:53:27.292514635Z" level=info msg="connecting to shim 91013b0a19cb7668c0b5d594d112fcfb795d6ac4f3e0af0af0720b1557c09137" address="unix:///run/containerd/s/970383b779f78143a07d32fb252c4d40a8cf3e5ad1ee0493300d7635da384e5c" protocol=ttrpc version=3 Apr 17 02:53:27.381274 kubelet[2815]: I0417 02:53:27.374872 2815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-66bc5c9577-rjxxk" podStartSLOduration=113.374810495 podStartE2EDuration="1m53.374810495s" podCreationTimestamp="2026-04-17 02:51:34 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 02:53:26.963644551 +0000 UTC m=+117.113591655" watchObservedRunningTime="2026-04-17 02:53:27.374810495 +0000 UTC m=+117.524757611" Apr 17 02:53:27.471628 containerd[1591]: 2026-04-17 02:53:20.121 [ERROR][4217] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Apr 17 02:53:27.471628 containerd[1591]: 2026-04-17 02:53:20.836 [INFO][4217] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--58fdb9b8fb--4dckf-eth0 calico-apiserver-58fdb9b8fb- calico-system 359f9669-8513-45dd-ad4a-a69e69851e6f 1043 0 2026-04-17 02:52:13 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:58fdb9b8fb projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-58fdb9b8fb-4dckf eth0 calico-apiserver [] [] [kns.calico-system ksa.calico-system.calico-apiserver] calibb37156acee [] [] }} ContainerID="d3f0847a3fef67f1c14e8a1fb64d4b5e86a104251336072a48f33bc5eca52882" Namespace="calico-system" Pod="calico-apiserver-58fdb9b8fb-4dckf" WorkloadEndpoint="localhost-k8s-calico--apiserver--58fdb9b8fb--4dckf-" Apr 17 02:53:27.471628 containerd[1591]: 2026-04-17 02:53:20.842 [INFO][4217] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="d3f0847a3fef67f1c14e8a1fb64d4b5e86a104251336072a48f33bc5eca52882" Namespace="calico-system" Pod="calico-apiserver-58fdb9b8fb-4dckf" WorkloadEndpoint="localhost-k8s-calico--apiserver--58fdb9b8fb--4dckf-eth0" Apr 17 02:53:27.471628 containerd[1591]: 2026-04-17 02:53:21.988 [INFO][4285] ipam/ipam_plugin.go 235: Calico CNI IPAM request 
count IPv4=1 IPv6=0 ContainerID="d3f0847a3fef67f1c14e8a1fb64d4b5e86a104251336072a48f33bc5eca52882" HandleID="k8s-pod-network.d3f0847a3fef67f1c14e8a1fb64d4b5e86a104251336072a48f33bc5eca52882" Workload="localhost-k8s-calico--apiserver--58fdb9b8fb--4dckf-eth0" Apr 17 02:53:27.471628 containerd[1591]: 2026-04-17 02:53:22.489 [INFO][4285] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="d3f0847a3fef67f1c14e8a1fb64d4b5e86a104251336072a48f33bc5eca52882" HandleID="k8s-pod-network.d3f0847a3fef67f1c14e8a1fb64d4b5e86a104251336072a48f33bc5eca52882" Workload="localhost-k8s-calico--apiserver--58fdb9b8fb--4dckf-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000361d80), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"calico-apiserver-58fdb9b8fb-4dckf", "timestamp":"2026-04-17 02:53:21.988498604 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc0001982c0)} Apr 17 02:53:27.471628 containerd[1591]: 2026-04-17 02:53:22.490 [INFO][4285] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 17 02:53:27.471628 containerd[1591]: 2026-04-17 02:53:23.736 [INFO][4285] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Apr 17 02:53:27.471628 containerd[1591]: 2026-04-17 02:53:23.781 [INFO][4285] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Apr 17 02:53:27.471628 containerd[1591]: 2026-04-17 02:53:24.049 [INFO][4285] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.d3f0847a3fef67f1c14e8a1fb64d4b5e86a104251336072a48f33bc5eca52882" host="localhost" Apr 17 02:53:27.471628 containerd[1591]: 2026-04-17 02:53:24.330 [INFO][4285] ipam/ipam.go 409: Looking up existing affinities for host host="localhost" Apr 17 02:53:27.471628 containerd[1591]: 2026-04-17 02:53:24.700 [INFO][4285] ipam/ipam.go 526: Trying affinity for 192.168.88.128/26 host="localhost" Apr 17 02:53:27.471628 containerd[1591]: 2026-04-17 02:53:24.757 [INFO][4285] ipam/ipam.go 160: Attempting to load block cidr=192.168.88.128/26 host="localhost" Apr 17 02:53:27.471628 containerd[1591]: 2026-04-17 02:53:24.907 [INFO][4285] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Apr 17 02:53:27.471628 containerd[1591]: 2026-04-17 02:53:24.912 [INFO][4285] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.d3f0847a3fef67f1c14e8a1fb64d4b5e86a104251336072a48f33bc5eca52882" host="localhost" Apr 17 02:53:27.471628 containerd[1591]: 2026-04-17 02:53:25.036 [INFO][4285] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.d3f0847a3fef67f1c14e8a1fb64d4b5e86a104251336072a48f33bc5eca52882 Apr 17 02:53:27.471628 containerd[1591]: 2026-04-17 02:53:25.133 [INFO][4285] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.d3f0847a3fef67f1c14e8a1fb64d4b5e86a104251336072a48f33bc5eca52882" host="localhost" Apr 17 02:53:27.471628 containerd[1591]: 2026-04-17 02:53:25.355 [INFO][4285] ipam/ipam.go 1276: Failed to update block block=192.168.88.128/26 error=update conflict: IPAMBlock(192-168-88-128-26) 
handle="k8s-pod-network.d3f0847a3fef67f1c14e8a1fb64d4b5e86a104251336072a48f33bc5eca52882" host="localhost" Apr 17 02:53:27.471628 containerd[1591]: 2026-04-17 02:53:26.110 [INFO][4285] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.d3f0847a3fef67f1c14e8a1fb64d4b5e86a104251336072a48f33bc5eca52882" host="localhost" Apr 17 02:53:27.471628 containerd[1591]: 2026-04-17 02:53:26.163 [INFO][4285] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.d3f0847a3fef67f1c14e8a1fb64d4b5e86a104251336072a48f33bc5eca52882 Apr 17 02:53:27.471628 containerd[1591]: 2026-04-17 02:53:26.379 [INFO][4285] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.d3f0847a3fef67f1c14e8a1fb64d4b5e86a104251336072a48f33bc5eca52882" host="localhost" Apr 17 02:53:27.471628 containerd[1591]: 2026-04-17 02:53:26.699 [INFO][4285] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.88.132/26] block=192.168.88.128/26 handle="k8s-pod-network.d3f0847a3fef67f1c14e8a1fb64d4b5e86a104251336072a48f33bc5eca52882" host="localhost" Apr 17 02:53:27.471628 containerd[1591]: 2026-04-17 02:53:26.700 [INFO][4285] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.88.132/26] handle="k8s-pod-network.d3f0847a3fef67f1c14e8a1fb64d4b5e86a104251336072a48f33bc5eca52882" host="localhost" Apr 17 02:53:27.471628 containerd[1591]: 2026-04-17 02:53:26.768 [INFO][4285] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. 
Apr 17 02:53:27.471628 containerd[1591]: 2026-04-17 02:53:26.788 [INFO][4285] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.88.132/26] IPv6=[] ContainerID="d3f0847a3fef67f1c14e8a1fb64d4b5e86a104251336072a48f33bc5eca52882" HandleID="k8s-pod-network.d3f0847a3fef67f1c14e8a1fb64d4b5e86a104251336072a48f33bc5eca52882" Workload="localhost-k8s-calico--apiserver--58fdb9b8fb--4dckf-eth0" Apr 17 02:53:27.474672 containerd[1591]: 2026-04-17 02:53:26.869 [INFO][4217] cni-plugin/k8s.go 418: Populated endpoint ContainerID="d3f0847a3fef67f1c14e8a1fb64d4b5e86a104251336072a48f33bc5eca52882" Namespace="calico-system" Pod="calico-apiserver-58fdb9b8fb-4dckf" WorkloadEndpoint="localhost-k8s-calico--apiserver--58fdb9b8fb--4dckf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--58fdb9b8fb--4dckf-eth0", GenerateName:"calico-apiserver-58fdb9b8fb-", Namespace:"calico-system", SelfLink:"", UID:"359f9669-8513-45dd-ad4a-a69e69851e6f", ResourceVersion:"1043", Generation:0, CreationTimestamp:time.Date(2026, time.April, 17, 2, 52, 13, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"58fdb9b8fb", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-58fdb9b8fb-4dckf", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", 
Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"calibb37156acee", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 17 02:53:27.474672 containerd[1591]: 2026-04-17 02:53:26.878 [INFO][4217] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.132/32] ContainerID="d3f0847a3fef67f1c14e8a1fb64d4b5e86a104251336072a48f33bc5eca52882" Namespace="calico-system" Pod="calico-apiserver-58fdb9b8fb-4dckf" WorkloadEndpoint="localhost-k8s-calico--apiserver--58fdb9b8fb--4dckf-eth0" Apr 17 02:53:27.474672 containerd[1591]: 2026-04-17 02:53:26.993 [INFO][4217] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calibb37156acee ContainerID="d3f0847a3fef67f1c14e8a1fb64d4b5e86a104251336072a48f33bc5eca52882" Namespace="calico-system" Pod="calico-apiserver-58fdb9b8fb-4dckf" WorkloadEndpoint="localhost-k8s-calico--apiserver--58fdb9b8fb--4dckf-eth0" Apr 17 02:53:27.474672 containerd[1591]: 2026-04-17 02:53:27.112 [INFO][4217] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="d3f0847a3fef67f1c14e8a1fb64d4b5e86a104251336072a48f33bc5eca52882" Namespace="calico-system" Pod="calico-apiserver-58fdb9b8fb-4dckf" WorkloadEndpoint="localhost-k8s-calico--apiserver--58fdb9b8fb--4dckf-eth0" Apr 17 02:53:27.474672 containerd[1591]: 2026-04-17 02:53:27.147 [INFO][4217] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="d3f0847a3fef67f1c14e8a1fb64d4b5e86a104251336072a48f33bc5eca52882" Namespace="calico-system" Pod="calico-apiserver-58fdb9b8fb-4dckf" WorkloadEndpoint="localhost-k8s-calico--apiserver--58fdb9b8fb--4dckf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--58fdb9b8fb--4dckf-eth0", GenerateName:"calico-apiserver-58fdb9b8fb-", Namespace:"calico-system", 
SelfLink:"", UID:"359f9669-8513-45dd-ad4a-a69e69851e6f", ResourceVersion:"1043", Generation:0, CreationTimestamp:time.Date(2026, time.April, 17, 2, 52, 13, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"58fdb9b8fb", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"d3f0847a3fef67f1c14e8a1fb64d4b5e86a104251336072a48f33bc5eca52882", Pod:"calico-apiserver-58fdb9b8fb-4dckf", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"calibb37156acee", MAC:"b2:52:60:01:85:46", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 17 02:53:27.474672 containerd[1591]: 2026-04-17 02:53:27.465 [INFO][4217] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="d3f0847a3fef67f1c14e8a1fb64d4b5e86a104251336072a48f33bc5eca52882" Namespace="calico-system" Pod="calico-apiserver-58fdb9b8fb-4dckf" WorkloadEndpoint="localhost-k8s-calico--apiserver--58fdb9b8fb--4dckf-eth0" Apr 17 02:53:27.514517 systemd[1]: Started cri-containerd-91013b0a19cb7668c0b5d594d112fcfb795d6ac4f3e0af0af0720b1557c09137.scope - libcontainer container 91013b0a19cb7668c0b5d594d112fcfb795d6ac4f3e0af0af0720b1557c09137. 
Apr 17 02:53:27.526535 kubelet[2815]: E0417 02:53:27.526312 2815 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Apr 17 02:53:27.772179 containerd[1591]: time="2026-04-17T02:53:27.772092002Z" level=info msg="connecting to shim d3f0847a3fef67f1c14e8a1fb64d4b5e86a104251336072a48f33bc5eca52882" address="unix:///run/containerd/s/4c79e61c3fda3fda5f220e1876d35c5e6afd6047a4aa020e502ea5ae00f6fd39" namespace=k8s.io protocol=ttrpc version=3 Apr 17 02:53:28.028887 systemd[1]: Started cri-containerd-d3f0847a3fef67f1c14e8a1fb64d4b5e86a104251336072a48f33bc5eca52882.scope - libcontainer container d3f0847a3fef67f1c14e8a1fb64d4b5e86a104251336072a48f33bc5eca52882. Apr 17 02:53:28.170594 systemd-resolved[1478]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Apr 17 02:53:28.488361 systemd-networkd[1476]: calic3fc784aaf3: Link UP Apr 17 02:53:28.493059 systemd-networkd[1476]: calic3fc784aaf3: Gained carrier Apr 17 02:53:28.541787 kubelet[2815]: E0417 02:53:28.541177 2815 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Apr 17 02:53:28.650849 containerd[1591]: time="2026-04-17T02:53:28.650663080Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-58fdb9b8fb-4dckf,Uid:359f9669-8513-45dd-ad4a-a69e69851e6f,Namespace:calico-system,Attempt:0,} returns sandbox id \"d3f0847a3fef67f1c14e8a1fb64d4b5e86a104251336072a48f33bc5eca52882\"" Apr 17 02:53:28.799870 containerd[1591]: time="2026-04-17T02:53:28.794703346Z" level=info msg="StartContainer for \"91013b0a19cb7668c0b5d594d112fcfb795d6ac4f3e0af0af0720b1557c09137\" returns successfully" Apr 17 02:53:28.844015 systemd-networkd[1476]: calibb37156acee: Gained IPv6LL Apr 17 02:53:28.940860 containerd[1591]: 2026-04-17 02:53:22.384 
[ERROR][4302] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Apr 17 02:53:28.940860 containerd[1591]: 2026-04-17 02:53:22.976 [INFO][4302] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--kube--controllers--6fbccc5db4--vvpvq-eth0 calico-kube-controllers-6fbccc5db4- calico-system e186d35c-95f3-490a-bcc6-ab6d243181fe 1059 0 2026-04-17 02:52:20 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:6fbccc5db4 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s localhost calico-kube-controllers-6fbccc5db4-vvpvq eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] calic3fc784aaf3 [] [] }} ContainerID="a5616d5d6bcfaf66dce5cf66a390851af4dfcb576f06b727bccb967701f13386" Namespace="calico-system" Pod="calico-kube-controllers-6fbccc5db4-vvpvq" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--6fbccc5db4--vvpvq-" Apr 17 02:53:28.940860 containerd[1591]: 2026-04-17 02:53:23.082 [INFO][4302] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="a5616d5d6bcfaf66dce5cf66a390851af4dfcb576f06b727bccb967701f13386" Namespace="calico-system" Pod="calico-kube-controllers-6fbccc5db4-vvpvq" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--6fbccc5db4--vvpvq-eth0" Apr 17 02:53:28.940860 containerd[1591]: 2026-04-17 02:53:24.078 [INFO][4410] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="a5616d5d6bcfaf66dce5cf66a390851af4dfcb576f06b727bccb967701f13386" HandleID="k8s-pod-network.a5616d5d6bcfaf66dce5cf66a390851af4dfcb576f06b727bccb967701f13386" 
Workload="localhost-k8s-calico--kube--controllers--6fbccc5db4--vvpvq-eth0" Apr 17 02:53:28.940860 containerd[1591]: 2026-04-17 02:53:24.187 [INFO][4410] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="a5616d5d6bcfaf66dce5cf66a390851af4dfcb576f06b727bccb967701f13386" HandleID="k8s-pod-network.a5616d5d6bcfaf66dce5cf66a390851af4dfcb576f06b727bccb967701f13386" Workload="localhost-k8s-calico--kube--controllers--6fbccc5db4--vvpvq-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0000be580), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"calico-kube-controllers-6fbccc5db4-vvpvq", "timestamp":"2026-04-17 02:53:24.078672485 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc000334000)} Apr 17 02:53:28.940860 containerd[1591]: 2026-04-17 02:53:24.250 [INFO][4410] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 17 02:53:28.940860 containerd[1591]: 2026-04-17 02:53:26.775 [INFO][4410] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Apr 17 02:53:28.940860 containerd[1591]: 2026-04-17 02:53:26.776 [INFO][4410] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Apr 17 02:53:28.940860 containerd[1591]: 2026-04-17 02:53:26.954 [INFO][4410] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.a5616d5d6bcfaf66dce5cf66a390851af4dfcb576f06b727bccb967701f13386" host="localhost" Apr 17 02:53:28.940860 containerd[1591]: 2026-04-17 02:53:27.257 [INFO][4410] ipam/ipam.go 409: Looking up existing affinities for host host="localhost" Apr 17 02:53:28.940860 containerd[1591]: 2026-04-17 02:53:27.503 [INFO][4410] ipam/ipam.go 526: Trying affinity for 192.168.88.128/26 host="localhost" Apr 17 02:53:28.940860 containerd[1591]: 2026-04-17 02:53:27.551 [INFO][4410] ipam/ipam.go 160: Attempting to load block cidr=192.168.88.128/26 host="localhost" Apr 17 02:53:28.940860 containerd[1591]: 2026-04-17 02:53:27.749 [INFO][4410] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Apr 17 02:53:28.940860 containerd[1591]: 2026-04-17 02:53:27.800 [INFO][4410] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.a5616d5d6bcfaf66dce5cf66a390851af4dfcb576f06b727bccb967701f13386" host="localhost" Apr 17 02:53:28.940860 containerd[1591]: 2026-04-17 02:53:27.999 [INFO][4410] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.a5616d5d6bcfaf66dce5cf66a390851af4dfcb576f06b727bccb967701f13386 Apr 17 02:53:28.940860 containerd[1591]: 2026-04-17 02:53:28.168 [INFO][4410] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.a5616d5d6bcfaf66dce5cf66a390851af4dfcb576f06b727bccb967701f13386" host="localhost" Apr 17 02:53:28.940860 containerd[1591]: 2026-04-17 02:53:28.328 [INFO][4410] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.88.133/26] block=192.168.88.128/26 
handle="k8s-pod-network.a5616d5d6bcfaf66dce5cf66a390851af4dfcb576f06b727bccb967701f13386" host="localhost" Apr 17 02:53:28.940860 containerd[1591]: 2026-04-17 02:53:28.329 [INFO][4410] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.88.133/26] handle="k8s-pod-network.a5616d5d6bcfaf66dce5cf66a390851af4dfcb576f06b727bccb967701f13386" host="localhost" Apr 17 02:53:28.940860 containerd[1591]: 2026-04-17 02:53:28.329 [INFO][4410] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 17 02:53:28.940860 containerd[1591]: 2026-04-17 02:53:28.329 [INFO][4410] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.88.133/26] IPv6=[] ContainerID="a5616d5d6bcfaf66dce5cf66a390851af4dfcb576f06b727bccb967701f13386" HandleID="k8s-pod-network.a5616d5d6bcfaf66dce5cf66a390851af4dfcb576f06b727bccb967701f13386" Workload="localhost-k8s-calico--kube--controllers--6fbccc5db4--vvpvq-eth0" Apr 17 02:53:28.941582 containerd[1591]: 2026-04-17 02:53:28.347 [INFO][4302] cni-plugin/k8s.go 418: Populated endpoint ContainerID="a5616d5d6bcfaf66dce5cf66a390851af4dfcb576f06b727bccb967701f13386" Namespace="calico-system" Pod="calico-kube-controllers-6fbccc5db4-vvpvq" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--6fbccc5db4--vvpvq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--6fbccc5db4--vvpvq-eth0", GenerateName:"calico-kube-controllers-6fbccc5db4-", Namespace:"calico-system", SelfLink:"", UID:"e186d35c-95f3-490a-bcc6-ab6d243181fe", ResourceVersion:"1059", Generation:0, CreationTimestamp:time.Date(2026, time.April, 17, 2, 52, 20, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"6fbccc5db4", "projectcalico.org/namespace":"calico-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-kube-controllers-6fbccc5db4-vvpvq", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calic3fc784aaf3", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 17 02:53:28.941582 containerd[1591]: 2026-04-17 02:53:28.355 [INFO][4302] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.133/32] ContainerID="a5616d5d6bcfaf66dce5cf66a390851af4dfcb576f06b727bccb967701f13386" Namespace="calico-system" Pod="calico-kube-controllers-6fbccc5db4-vvpvq" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--6fbccc5db4--vvpvq-eth0" Apr 17 02:53:28.941582 containerd[1591]: 2026-04-17 02:53:28.378 [INFO][4302] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calic3fc784aaf3 ContainerID="a5616d5d6bcfaf66dce5cf66a390851af4dfcb576f06b727bccb967701f13386" Namespace="calico-system" Pod="calico-kube-controllers-6fbccc5db4-vvpvq" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--6fbccc5db4--vvpvq-eth0" Apr 17 02:53:28.941582 containerd[1591]: 2026-04-17 02:53:28.567 [INFO][4302] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="a5616d5d6bcfaf66dce5cf66a390851af4dfcb576f06b727bccb967701f13386" Namespace="calico-system" Pod="calico-kube-controllers-6fbccc5db4-vvpvq" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--6fbccc5db4--vvpvq-eth0" Apr 17 02:53:28.941582 containerd[1591]: 
2026-04-17 02:53:28.584 [INFO][4302] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="a5616d5d6bcfaf66dce5cf66a390851af4dfcb576f06b727bccb967701f13386" Namespace="calico-system" Pod="calico-kube-controllers-6fbccc5db4-vvpvq" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--6fbccc5db4--vvpvq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--6fbccc5db4--vvpvq-eth0", GenerateName:"calico-kube-controllers-6fbccc5db4-", Namespace:"calico-system", SelfLink:"", UID:"e186d35c-95f3-490a-bcc6-ab6d243181fe", ResourceVersion:"1059", Generation:0, CreationTimestamp:time.Date(2026, time.April, 17, 2, 52, 20, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"6fbccc5db4", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"a5616d5d6bcfaf66dce5cf66a390851af4dfcb576f06b727bccb967701f13386", Pod:"calico-kube-controllers-6fbccc5db4-vvpvq", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calic3fc784aaf3", MAC:"f6:a3:e6:4a:4d:59", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 17 02:53:28.941582 containerd[1591]: 
2026-04-17 02:53:28.887 [INFO][4302] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="a5616d5d6bcfaf66dce5cf66a390851af4dfcb576f06b727bccb967701f13386" Namespace="calico-system" Pod="calico-kube-controllers-6fbccc5db4-vvpvq" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--6fbccc5db4--vvpvq-eth0" Apr 17 02:53:29.093006 containerd[1591]: time="2026-04-17T02:53:29.092520764Z" level=info msg="connecting to shim a5616d5d6bcfaf66dce5cf66a390851af4dfcb576f06b727bccb967701f13386" address="unix:///run/containerd/s/7dfe8bc3196bdee86ba37ced9b97342e6ac166348e6087c465addc33eb84bf00" namespace=k8s.io protocol=ttrpc version=3 Apr 17 02:53:29.428176 systemd[1]: Started cri-containerd-a5616d5d6bcfaf66dce5cf66a390851af4dfcb576f06b727bccb967701f13386.scope - libcontainer container a5616d5d6bcfaf66dce5cf66a390851af4dfcb576f06b727bccb967701f13386. Apr 17 02:53:29.697807 kubelet[2815]: E0417 02:53:29.697381 2815 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Apr 17 02:53:29.763303 systemd-resolved[1478]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Apr 17 02:53:30.058512 systemd-networkd[1476]: calic3fc784aaf3: Gained IPv6LL Apr 17 02:53:30.133180 systemd-networkd[1476]: calibc11349f878: Link UP Apr 17 02:53:30.139339 systemd-networkd[1476]: calibc11349f878: Gained carrier Apr 17 02:53:30.650198 containerd[1591]: 2026-04-17 02:53:22.854 [ERROR][4359] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Apr 17 02:53:30.650198 containerd[1591]: 2026-04-17 02:53:23.411 [INFO][4359] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-goldmane--cccfbd5cf--7sj6j-eth0 goldmane-cccfbd5cf- 
calico-system 503bddcb-5ae9-4596-93b1-28b74f7a0d4b 1049 0 2026-04-17 02:52:14 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:cccfbd5cf projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s localhost goldmane-cccfbd5cf-7sj6j eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] calibc11349f878 [] [] }} ContainerID="312e90b1be4c6891c8189148ab7046e3bfb63cc22596af302b0b6e366ffe608a" Namespace="calico-system" Pod="goldmane-cccfbd5cf-7sj6j" WorkloadEndpoint="localhost-k8s-goldmane--cccfbd5cf--7sj6j-" Apr 17 02:53:30.650198 containerd[1591]: 2026-04-17 02:53:23.412 [INFO][4359] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="312e90b1be4c6891c8189148ab7046e3bfb63cc22596af302b0b6e366ffe608a" Namespace="calico-system" Pod="goldmane-cccfbd5cf-7sj6j" WorkloadEndpoint="localhost-k8s-goldmane--cccfbd5cf--7sj6j-eth0" Apr 17 02:53:30.650198 containerd[1591]: 2026-04-17 02:53:24.063 [INFO][4416] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="312e90b1be4c6891c8189148ab7046e3bfb63cc22596af302b0b6e366ffe608a" HandleID="k8s-pod-network.312e90b1be4c6891c8189148ab7046e3bfb63cc22596af302b0b6e366ffe608a" Workload="localhost-k8s-goldmane--cccfbd5cf--7sj6j-eth0" Apr 17 02:53:30.650198 containerd[1591]: 2026-04-17 02:53:24.388 [INFO][4416] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="312e90b1be4c6891c8189148ab7046e3bfb63cc22596af302b0b6e366ffe608a" HandleID="k8s-pod-network.312e90b1be4c6891c8189148ab7046e3bfb63cc22596af302b0b6e366ffe608a" Workload="localhost-k8s-goldmane--cccfbd5cf--7sj6j-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0000bed20), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"goldmane-cccfbd5cf-7sj6j", "timestamp":"2026-04-17 02:53:24.063660467 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, 
IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc000680580)} Apr 17 02:53:30.650198 containerd[1591]: 2026-04-17 02:53:24.496 [INFO][4416] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 17 02:53:30.650198 containerd[1591]: 2026-04-17 02:53:28.342 [INFO][4416] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 17 02:53:30.650198 containerd[1591]: 2026-04-17 02:53:28.345 [INFO][4416] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Apr 17 02:53:30.650198 containerd[1591]: 2026-04-17 02:53:28.589 [INFO][4416] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.312e90b1be4c6891c8189148ab7046e3bfb63cc22596af302b0b6e366ffe608a" host="localhost" Apr 17 02:53:30.650198 containerd[1591]: 2026-04-17 02:53:28.765 [INFO][4416] ipam/ipam.go 409: Looking up existing affinities for host host="localhost" Apr 17 02:53:30.650198 containerd[1591]: 2026-04-17 02:53:29.063 [INFO][4416] ipam/ipam.go 526: Trying affinity for 192.168.88.128/26 host="localhost" Apr 17 02:53:30.650198 containerd[1591]: 2026-04-17 02:53:29.138 [INFO][4416] ipam/ipam.go 160: Attempting to load block cidr=192.168.88.128/26 host="localhost" Apr 17 02:53:30.650198 containerd[1591]: 2026-04-17 02:53:29.247 [INFO][4416] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Apr 17 02:53:30.650198 containerd[1591]: 2026-04-17 02:53:29.248 [INFO][4416] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.312e90b1be4c6891c8189148ab7046e3bfb63cc22596af302b0b6e366ffe608a" host="localhost" Apr 17 02:53:30.650198 containerd[1591]: 2026-04-17 02:53:29.310 [INFO][4416] ipam/ipam.go 1806: Creating new handle: 
k8s-pod-network.312e90b1be4c6891c8189148ab7046e3bfb63cc22596af302b0b6e366ffe608a Apr 17 02:53:30.650198 containerd[1591]: 2026-04-17 02:53:29.563 [INFO][4416] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.312e90b1be4c6891c8189148ab7046e3bfb63cc22596af302b0b6e366ffe608a" host="localhost" Apr 17 02:53:30.650198 containerd[1591]: 2026-04-17 02:53:29.895 [INFO][4416] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.88.134/26] block=192.168.88.128/26 handle="k8s-pod-network.312e90b1be4c6891c8189148ab7046e3bfb63cc22596af302b0b6e366ffe608a" host="localhost" Apr 17 02:53:30.650198 containerd[1591]: 2026-04-17 02:53:29.920 [INFO][4416] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.88.134/26] handle="k8s-pod-network.312e90b1be4c6891c8189148ab7046e3bfb63cc22596af302b0b6e366ffe608a" host="localhost" Apr 17 02:53:30.650198 containerd[1591]: 2026-04-17 02:53:29.921 [INFO][4416] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. 
Apr 17 02:53:30.650198 containerd[1591]: 2026-04-17 02:53:29.927 [INFO][4416] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.88.134/26] IPv6=[] ContainerID="312e90b1be4c6891c8189148ab7046e3bfb63cc22596af302b0b6e366ffe608a" HandleID="k8s-pod-network.312e90b1be4c6891c8189148ab7046e3bfb63cc22596af302b0b6e366ffe608a" Workload="localhost-k8s-goldmane--cccfbd5cf--7sj6j-eth0" Apr 17 02:53:30.665421 containerd[1591]: 2026-04-17 02:53:29.956 [INFO][4359] cni-plugin/k8s.go 418: Populated endpoint ContainerID="312e90b1be4c6891c8189148ab7046e3bfb63cc22596af302b0b6e366ffe608a" Namespace="calico-system" Pod="goldmane-cccfbd5cf-7sj6j" WorkloadEndpoint="localhost-k8s-goldmane--cccfbd5cf--7sj6j-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-goldmane--cccfbd5cf--7sj6j-eth0", GenerateName:"goldmane-cccfbd5cf-", Namespace:"calico-system", SelfLink:"", UID:"503bddcb-5ae9-4596-93b1-28b74f7a0d4b", ResourceVersion:"1049", Generation:0, CreationTimestamp:time.Date(2026, time.April, 17, 2, 52, 14, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"cccfbd5cf", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"goldmane-cccfbd5cf-7sj6j", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calibc11349f878", MAC:"", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 17 02:53:30.665421 containerd[1591]: 2026-04-17 02:53:29.956 [INFO][4359] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.134/32] ContainerID="312e90b1be4c6891c8189148ab7046e3bfb63cc22596af302b0b6e366ffe608a" Namespace="calico-system" Pod="goldmane-cccfbd5cf-7sj6j" WorkloadEndpoint="localhost-k8s-goldmane--cccfbd5cf--7sj6j-eth0" Apr 17 02:53:30.665421 containerd[1591]: 2026-04-17 02:53:29.956 [INFO][4359] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calibc11349f878 ContainerID="312e90b1be4c6891c8189148ab7046e3bfb63cc22596af302b0b6e366ffe608a" Namespace="calico-system" Pod="goldmane-cccfbd5cf-7sj6j" WorkloadEndpoint="localhost-k8s-goldmane--cccfbd5cf--7sj6j-eth0" Apr 17 02:53:30.665421 containerd[1591]: 2026-04-17 02:53:30.140 [INFO][4359] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="312e90b1be4c6891c8189148ab7046e3bfb63cc22596af302b0b6e366ffe608a" Namespace="calico-system" Pod="goldmane-cccfbd5cf-7sj6j" WorkloadEndpoint="localhost-k8s-goldmane--cccfbd5cf--7sj6j-eth0" Apr 17 02:53:30.665421 containerd[1591]: 2026-04-17 02:53:30.147 [INFO][4359] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="312e90b1be4c6891c8189148ab7046e3bfb63cc22596af302b0b6e366ffe608a" Namespace="calico-system" Pod="goldmane-cccfbd5cf-7sj6j" WorkloadEndpoint="localhost-k8s-goldmane--cccfbd5cf--7sj6j-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-goldmane--cccfbd5cf--7sj6j-eth0", GenerateName:"goldmane-cccfbd5cf-", Namespace:"calico-system", SelfLink:"", UID:"503bddcb-5ae9-4596-93b1-28b74f7a0d4b", ResourceVersion:"1049", Generation:0, CreationTimestamp:time.Date(2026, time.April, 17, 2, 52, 14, 0, time.Local), DeletionTimestamp:, 
DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"cccfbd5cf", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"312e90b1be4c6891c8189148ab7046e3bfb63cc22596af302b0b6e366ffe608a", Pod:"goldmane-cccfbd5cf-7sj6j", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calibc11349f878", MAC:"1a:dd:14:ff:f6:3a", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 17 02:53:30.665421 containerd[1591]: 2026-04-17 02:53:30.570 [INFO][4359] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="312e90b1be4c6891c8189148ab7046e3bfb63cc22596af302b0b6e366ffe608a" Namespace="calico-system" Pod="goldmane-cccfbd5cf-7sj6j" WorkloadEndpoint="localhost-k8s-goldmane--cccfbd5cf--7sj6j-eth0" Apr 17 02:53:30.962770 containerd[1591]: time="2026-04-17T02:53:30.962543648Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6fbccc5db4-vvpvq,Uid:e186d35c-95f3-490a-bcc6-ab6d243181fe,Namespace:calico-system,Attempt:0,} returns sandbox id \"a5616d5d6bcfaf66dce5cf66a390851af4dfcb576f06b727bccb967701f13386\"" Apr 17 02:53:31.234599 containerd[1591]: time="2026-04-17T02:53:31.234552686Z" level=info msg="connecting to shim 312e90b1be4c6891c8189148ab7046e3bfb63cc22596af302b0b6e366ffe608a" address="unix:///run/containerd/s/1ca422bf56111f01482b669136e2c4b7faae00baabf7c1977b1b9e774a80444c" 
namespace=k8s.io protocol=ttrpc version=3 Apr 17 02:53:31.460114 systemd[1]: Started cri-containerd-312e90b1be4c6891c8189148ab7046e3bfb63cc22596af302b0b6e366ffe608a.scope - libcontainer container 312e90b1be4c6891c8189148ab7046e3bfb63cc22596af302b0b6e366ffe608a. Apr 17 02:53:31.662546 systemd-resolved[1478]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Apr 17 02:53:31.941255 systemd-networkd[1476]: calibc11349f878: Gained IPv6LL Apr 17 02:53:32.236531 containerd[1591]: time="2026-04-17T02:53:32.235669942Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 17 02:53:32.239255 containerd[1591]: time="2026-04-17T02:53:32.238858156Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.31.4: active requests=0, bytes read=8792502" Apr 17 02:53:32.296031 containerd[1591]: time="2026-04-17T02:53:32.295744618Z" level=info msg="ImageCreate event name:\"sha256:4c8cd7d0b10a4df64a5bd90e9845e9d1edbe0e37c2ebfc171bb28698e07abf72\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 17 02:53:32.331622 containerd[1591]: time="2026-04-17T02:53:32.331298096Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:ab57dd6f8423ef7b3ff382bf4ca5ace6063bdca77d441d852c75ec58847dd280\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 17 02:53:32.368938 containerd[1591]: time="2026-04-17T02:53:32.368784585Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.31.4\" with image id \"sha256:4c8cd7d0b10a4df64a5bd90e9845e9d1edbe0e37c2ebfc171bb28698e07abf72\", repo tag \"ghcr.io/flatcar/calico/csi:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:ab57dd6f8423ef7b3ff382bf4ca5ace6063bdca77d441d852c75ec58847dd280\", size \"10348547\" in 5.29641327s" Apr 17 02:53:32.368938 containerd[1591]: time="2026-04-17T02:53:32.368891109Z" level=info msg="PullImage 
\"ghcr.io/flatcar/calico/csi:v3.31.4\" returns image reference \"sha256:4c8cd7d0b10a4df64a5bd90e9845e9d1edbe0e37c2ebfc171bb28698e07abf72\"" Apr 17 02:53:32.375892 containerd[1591]: time="2026-04-17T02:53:32.375339503Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\"" Apr 17 02:53:32.448774 containerd[1591]: time="2026-04-17T02:53:32.448224888Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-cccfbd5cf-7sj6j,Uid:503bddcb-5ae9-4596-93b1-28b74f7a0d4b,Namespace:calico-system,Attempt:0,} returns sandbox id \"312e90b1be4c6891c8189148ab7046e3bfb63cc22596af302b0b6e366ffe608a\"" Apr 17 02:53:32.474908 containerd[1591]: time="2026-04-17T02:53:32.474706046Z" level=info msg="CreateContainer within sandbox \"b4ed30fafdec2717e6d35b5e65ac3cc512d8bea5f2a5fdc4ffec3a0a897f0120\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Apr 17 02:53:32.647912 containerd[1591]: time="2026-04-17T02:53:32.647030051Z" level=info msg="Container 127766f7e73f9a7ee6b18163f5ffe8054d411a2081ffec76be3d9ed67d04241f: CDI devices from CRI Config.CDIDevices: []" Apr 17 02:53:32.855230 systemd-networkd[1476]: calie3b2846b487: Link UP Apr 17 02:53:32.859867 containerd[1591]: time="2026-04-17T02:53:32.855694805Z" level=info msg="CreateContainer within sandbox \"b4ed30fafdec2717e6d35b5e65ac3cc512d8bea5f2a5fdc4ffec3a0a897f0120\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"127766f7e73f9a7ee6b18163f5ffe8054d411a2081ffec76be3d9ed67d04241f\"" Apr 17 02:53:32.858430 systemd-networkd[1476]: calie3b2846b487: Gained carrier Apr 17 02:53:32.882290 containerd[1591]: time="2026-04-17T02:53:32.881269137Z" level=info msg="StartContainer for \"127766f7e73f9a7ee6b18163f5ffe8054d411a2081ffec76be3d9ed67d04241f\"" Apr 17 02:53:32.957648 containerd[1591]: time="2026-04-17T02:53:32.957172212Z" level=info msg="connecting to shim 127766f7e73f9a7ee6b18163f5ffe8054d411a2081ffec76be3d9ed67d04241f" 
address="unix:///run/containerd/s/b532ae8565d1bdd48c4efb9fba9cad9475541238e86a7d68e1c2d3386f467f11" protocol=ttrpc version=3 Apr 17 02:53:33.064889 systemd[1]: Started cri-containerd-127766f7e73f9a7ee6b18163f5ffe8054d411a2081ffec76be3d9ed67d04241f.scope - libcontainer container 127766f7e73f9a7ee6b18163f5ffe8054d411a2081ffec76be3d9ed67d04241f. Apr 17 02:53:33.162693 containerd[1591]: 2026-04-17 02:53:23.109 [ERROR][4347] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Apr 17 02:53:33.162693 containerd[1591]: 2026-04-17 02:53:23.615 [INFO][4347] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--58fdb9b8fb--rtk29-eth0 calico-apiserver-58fdb9b8fb- calico-system 0877677f-504a-4651-a066-b1c4f2fa0fee 1061 0 2026-04-17 02:52:13 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:58fdb9b8fb projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-58fdb9b8fb-rtk29 eth0 calico-apiserver [] [] [kns.calico-system ksa.calico-system.calico-apiserver] calie3b2846b487 [] [] }} ContainerID="7643edd6669a722af10649e1f4987292a64bbdf753cc7ffccb57bba5660eaa35" Namespace="calico-system" Pod="calico-apiserver-58fdb9b8fb-rtk29" WorkloadEndpoint="localhost-k8s-calico--apiserver--58fdb9b8fb--rtk29-" Apr 17 02:53:33.162693 containerd[1591]: 2026-04-17 02:53:23.617 [INFO][4347] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="7643edd6669a722af10649e1f4987292a64bbdf753cc7ffccb57bba5660eaa35" Namespace="calico-system" Pod="calico-apiserver-58fdb9b8fb-rtk29" WorkloadEndpoint="localhost-k8s-calico--apiserver--58fdb9b8fb--rtk29-eth0" Apr 17 02:53:33.162693 containerd[1591]: 
2026-04-17 02:53:24.522 [INFO][4445] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="7643edd6669a722af10649e1f4987292a64bbdf753cc7ffccb57bba5660eaa35" HandleID="k8s-pod-network.7643edd6669a722af10649e1f4987292a64bbdf753cc7ffccb57bba5660eaa35" Workload="localhost-k8s-calico--apiserver--58fdb9b8fb--rtk29-eth0" Apr 17 02:53:33.162693 containerd[1591]: 2026-04-17 02:53:24.741 [INFO][4445] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="7643edd6669a722af10649e1f4987292a64bbdf753cc7ffccb57bba5660eaa35" HandleID="k8s-pod-network.7643edd6669a722af10649e1f4987292a64bbdf753cc7ffccb57bba5660eaa35" Workload="localhost-k8s-calico--apiserver--58fdb9b8fb--rtk29-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004efc0), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"calico-apiserver-58fdb9b8fb-rtk29", "timestamp":"2026-04-17 02:53:24.522049406 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc00038f4a0)} Apr 17 02:53:33.162693 containerd[1591]: 2026-04-17 02:53:24.742 [INFO][4445] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 17 02:53:33.162693 containerd[1591]: 2026-04-17 02:53:29.921 [INFO][4445] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Apr 17 02:53:33.162693 containerd[1591]: 2026-04-17 02:53:29.922 [INFO][4445] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Apr 17 02:53:33.162693 containerd[1591]: 2026-04-17 02:53:30.140 [INFO][4445] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.7643edd6669a722af10649e1f4987292a64bbdf753cc7ffccb57bba5660eaa35" host="localhost" Apr 17 02:53:33.162693 containerd[1591]: 2026-04-17 02:53:30.956 [INFO][4445] ipam/ipam.go 409: Looking up existing affinities for host host="localhost" Apr 17 02:53:33.162693 containerd[1591]: 2026-04-17 02:53:31.466 [INFO][4445] ipam/ipam.go 526: Trying affinity for 192.168.88.128/26 host="localhost" Apr 17 02:53:33.162693 containerd[1591]: 2026-04-17 02:53:31.595 [INFO][4445] ipam/ipam.go 160: Attempting to load block cidr=192.168.88.128/26 host="localhost" Apr 17 02:53:33.162693 containerd[1591]: 2026-04-17 02:53:31.764 [INFO][4445] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Apr 17 02:53:33.162693 containerd[1591]: 2026-04-17 02:53:31.770 [INFO][4445] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.7643edd6669a722af10649e1f4987292a64bbdf753cc7ffccb57bba5660eaa35" host="localhost" Apr 17 02:53:33.162693 containerd[1591]: 2026-04-17 02:53:31.846 [INFO][4445] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.7643edd6669a722af10649e1f4987292a64bbdf753cc7ffccb57bba5660eaa35 Apr 17 02:53:33.162693 containerd[1591]: 2026-04-17 02:53:32.056 [INFO][4445] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.7643edd6669a722af10649e1f4987292a64bbdf753cc7ffccb57bba5660eaa35" host="localhost" Apr 17 02:53:33.162693 containerd[1591]: 2026-04-17 02:53:32.451 [INFO][4445] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.88.135/26] block=192.168.88.128/26 
handle="k8s-pod-network.7643edd6669a722af10649e1f4987292a64bbdf753cc7ffccb57bba5660eaa35" host="localhost" Apr 17 02:53:33.162693 containerd[1591]: 2026-04-17 02:53:32.455 [INFO][4445] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.88.135/26] handle="k8s-pod-network.7643edd6669a722af10649e1f4987292a64bbdf753cc7ffccb57bba5660eaa35" host="localhost" Apr 17 02:53:33.162693 containerd[1591]: 2026-04-17 02:53:32.457 [INFO][4445] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 17 02:53:33.162693 containerd[1591]: 2026-04-17 02:53:32.457 [INFO][4445] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.88.135/26] IPv6=[] ContainerID="7643edd6669a722af10649e1f4987292a64bbdf753cc7ffccb57bba5660eaa35" HandleID="k8s-pod-network.7643edd6669a722af10649e1f4987292a64bbdf753cc7ffccb57bba5660eaa35" Workload="localhost-k8s-calico--apiserver--58fdb9b8fb--rtk29-eth0" Apr 17 02:53:33.163700 containerd[1591]: 2026-04-17 02:53:32.478 [INFO][4347] cni-plugin/k8s.go 418: Populated endpoint ContainerID="7643edd6669a722af10649e1f4987292a64bbdf753cc7ffccb57bba5660eaa35" Namespace="calico-system" Pod="calico-apiserver-58fdb9b8fb-rtk29" WorkloadEndpoint="localhost-k8s-calico--apiserver--58fdb9b8fb--rtk29-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--58fdb9b8fb--rtk29-eth0", GenerateName:"calico-apiserver-58fdb9b8fb-", Namespace:"calico-system", SelfLink:"", UID:"0877677f-504a-4651-a066-b1c4f2fa0fee", ResourceVersion:"1061", Generation:0, CreationTimestamp:time.Date(2026, time.April, 17, 2, 52, 13, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"58fdb9b8fb", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", 
"projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-58fdb9b8fb-rtk29", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"calie3b2846b487", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 17 02:53:33.163700 containerd[1591]: 2026-04-17 02:53:32.581 [INFO][4347] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.135/32] ContainerID="7643edd6669a722af10649e1f4987292a64bbdf753cc7ffccb57bba5660eaa35" Namespace="calico-system" Pod="calico-apiserver-58fdb9b8fb-rtk29" WorkloadEndpoint="localhost-k8s-calico--apiserver--58fdb9b8fb--rtk29-eth0" Apr 17 02:53:33.163700 containerd[1591]: 2026-04-17 02:53:32.582 [INFO][4347] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calie3b2846b487 ContainerID="7643edd6669a722af10649e1f4987292a64bbdf753cc7ffccb57bba5660eaa35" Namespace="calico-system" Pod="calico-apiserver-58fdb9b8fb-rtk29" WorkloadEndpoint="localhost-k8s-calico--apiserver--58fdb9b8fb--rtk29-eth0" Apr 17 02:53:33.163700 containerd[1591]: 2026-04-17 02:53:32.861 [INFO][4347] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="7643edd6669a722af10649e1f4987292a64bbdf753cc7ffccb57bba5660eaa35" Namespace="calico-system" Pod="calico-apiserver-58fdb9b8fb-rtk29" WorkloadEndpoint="localhost-k8s-calico--apiserver--58fdb9b8fb--rtk29-eth0" Apr 17 02:53:33.163700 containerd[1591]: 2026-04-17 02:53:32.865 [INFO][4347] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to 
endpoint ContainerID="7643edd6669a722af10649e1f4987292a64bbdf753cc7ffccb57bba5660eaa35" Namespace="calico-system" Pod="calico-apiserver-58fdb9b8fb-rtk29" WorkloadEndpoint="localhost-k8s-calico--apiserver--58fdb9b8fb--rtk29-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--58fdb9b8fb--rtk29-eth0", GenerateName:"calico-apiserver-58fdb9b8fb-", Namespace:"calico-system", SelfLink:"", UID:"0877677f-504a-4651-a066-b1c4f2fa0fee", ResourceVersion:"1061", Generation:0, CreationTimestamp:time.Date(2026, time.April, 17, 2, 52, 13, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"58fdb9b8fb", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"7643edd6669a722af10649e1f4987292a64bbdf753cc7ffccb57bba5660eaa35", Pod:"calico-apiserver-58fdb9b8fb-rtk29", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"calie3b2846b487", MAC:"5a:d8:8b:1d:f0:cf", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 17 02:53:33.163700 containerd[1591]: 2026-04-17 02:53:33.152 [INFO][4347] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="7643edd6669a722af10649e1f4987292a64bbdf753cc7ffccb57bba5660eaa35" Namespace="calico-system" Pod="calico-apiserver-58fdb9b8fb-rtk29" WorkloadEndpoint="localhost-k8s-calico--apiserver--58fdb9b8fb--rtk29-eth0" Apr 17 02:53:33.471788 containerd[1591]: time="2026-04-17T02:53:33.471694080Z" level=info msg="connecting to shim 7643edd6669a722af10649e1f4987292a64bbdf753cc7ffccb57bba5660eaa35" address="unix:///run/containerd/s/35208740b07d8d72aa6c45261143899fd07fd6a02b08b4631dbd4136bfccce80" namespace=k8s.io protocol=ttrpc version=3 Apr 17 02:53:33.777622 systemd[1]: Started cri-containerd-7643edd6669a722af10649e1f4987292a64bbdf753cc7ffccb57bba5660eaa35.scope - libcontainer container 7643edd6669a722af10649e1f4987292a64bbdf753cc7ffccb57bba5660eaa35. Apr 17 02:53:33.932307 systemd-resolved[1478]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Apr 17 02:53:34.133921 systemd-networkd[1476]: calie3b2846b487: Gained IPv6LL Apr 17 02:53:34.390750 containerd[1591]: time="2026-04-17T02:53:34.389262696Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-58fdb9b8fb-rtk29,Uid:0877677f-504a-4651-a066-b1c4f2fa0fee,Namespace:calico-system,Attempt:0,} returns sandbox id \"7643edd6669a722af10649e1f4987292a64bbdf753cc7ffccb57bba5660eaa35\"" Apr 17 02:53:34.433213 containerd[1591]: time="2026-04-17T02:53:34.433065648Z" level=info msg="StartContainer for \"127766f7e73f9a7ee6b18163f5ffe8054d411a2081ffec76be3d9ed67d04241f\" returns successfully" Apr 17 02:53:35.019640 systemd-networkd[1476]: calib48e4090ee0: Link UP Apr 17 02:53:35.027393 systemd-networkd[1476]: calib48e4090ee0: Gained carrier Apr 17 02:53:35.479061 containerd[1591]: 2026-04-17 02:53:23.240 [ERROR][4332] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Apr 17 02:53:35.479061 containerd[1591]: 2026-04-17 
02:53:23.565 [INFO][4332] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--66bc5c9577--r6dvt-eth0 coredns-66bc5c9577- kube-system 9c3706c6-5e02-4b80-b77e-666efd679ffb 1064 0 2026-04-17 02:51:34 +0000 UTC map[k8s-app:kube-dns pod-template-hash:66bc5c9577 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-66bc5c9577-r6dvt eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calib48e4090ee0 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 } {liveness-probe TCP 8080 0 } {readiness-probe TCP 8181 0 }] [] }} ContainerID="df5c6874e4d07ca64f01cb2d603b9b48feaa741a6a33c92c884389397a2159e1" Namespace="kube-system" Pod="coredns-66bc5c9577-r6dvt" WorkloadEndpoint="localhost-k8s-coredns--66bc5c9577--r6dvt-" Apr 17 02:53:35.479061 containerd[1591]: 2026-04-17 02:53:23.569 [INFO][4332] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="df5c6874e4d07ca64f01cb2d603b9b48feaa741a6a33c92c884389397a2159e1" Namespace="kube-system" Pod="coredns-66bc5c9577-r6dvt" WorkloadEndpoint="localhost-k8s-coredns--66bc5c9577--r6dvt-eth0" Apr 17 02:53:35.479061 containerd[1591]: 2026-04-17 02:53:24.700 [INFO][4443] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="df5c6874e4d07ca64f01cb2d603b9b48feaa741a6a33c92c884389397a2159e1" HandleID="k8s-pod-network.df5c6874e4d07ca64f01cb2d603b9b48feaa741a6a33c92c884389397a2159e1" Workload="localhost-k8s-coredns--66bc5c9577--r6dvt-eth0" Apr 17 02:53:35.479061 containerd[1591]: 2026-04-17 02:53:24.759 [INFO][4443] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="df5c6874e4d07ca64f01cb2d603b9b48feaa741a6a33c92c884389397a2159e1" HandleID="k8s-pod-network.df5c6874e4d07ca64f01cb2d603b9b48feaa741a6a33c92c884389397a2159e1" Workload="localhost-k8s-coredns--66bc5c9577--r6dvt-eth0" 
assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000587b10), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-66bc5c9577-r6dvt", "timestamp":"2026-04-17 02:53:24.700587019 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc000140c60)} Apr 17 02:53:35.479061 containerd[1591]: 2026-04-17 02:53:24.759 [INFO][4443] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 17 02:53:35.479061 containerd[1591]: 2026-04-17 02:53:32.455 [INFO][4443] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 17 02:53:35.479061 containerd[1591]: 2026-04-17 02:53:32.455 [INFO][4443] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Apr 17 02:53:35.479061 containerd[1591]: 2026-04-17 02:53:32.638 [INFO][4443] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.df5c6874e4d07ca64f01cb2d603b9b48feaa741a6a33c92c884389397a2159e1" host="localhost" Apr 17 02:53:35.479061 containerd[1591]: 2026-04-17 02:53:33.149 [INFO][4443] ipam/ipam.go 409: Looking up existing affinities for host host="localhost" Apr 17 02:53:35.479061 containerd[1591]: 2026-04-17 02:53:33.579 [INFO][4443] ipam/ipam.go 526: Trying affinity for 192.168.88.128/26 host="localhost" Apr 17 02:53:35.479061 containerd[1591]: 2026-04-17 02:53:33.752 [INFO][4443] ipam/ipam.go 160: Attempting to load block cidr=192.168.88.128/26 host="localhost" Apr 17 02:53:35.479061 containerd[1591]: 2026-04-17 02:53:33.945 [INFO][4443] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Apr 17 02:53:35.479061 containerd[1591]: 2026-04-17 02:53:33.945 [INFO][4443] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.88.128/26 
handle="k8s-pod-network.df5c6874e4d07ca64f01cb2d603b9b48feaa741a6a33c92c884389397a2159e1" host="localhost" Apr 17 02:53:35.479061 containerd[1591]: 2026-04-17 02:53:34.081 [INFO][4443] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.df5c6874e4d07ca64f01cb2d603b9b48feaa741a6a33c92c884389397a2159e1 Apr 17 02:53:35.479061 containerd[1591]: 2026-04-17 02:53:34.467 [INFO][4443] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.df5c6874e4d07ca64f01cb2d603b9b48feaa741a6a33c92c884389397a2159e1" host="localhost" Apr 17 02:53:35.479061 containerd[1591]: 2026-04-17 02:53:34.971 [INFO][4443] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.88.136/26] block=192.168.88.128/26 handle="k8s-pod-network.df5c6874e4d07ca64f01cb2d603b9b48feaa741a6a33c92c884389397a2159e1" host="localhost" Apr 17 02:53:35.479061 containerd[1591]: 2026-04-17 02:53:34.973 [INFO][4443] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.88.136/26] handle="k8s-pod-network.df5c6874e4d07ca64f01cb2d603b9b48feaa741a6a33c92c884389397a2159e1" host="localhost" Apr 17 02:53:35.479061 containerd[1591]: 2026-04-17 02:53:34.974 [INFO][4443] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. 
Apr 17 02:53:35.479061 containerd[1591]: 2026-04-17 02:53:34.976 [INFO][4443] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.88.136/26] IPv6=[] ContainerID="df5c6874e4d07ca64f01cb2d603b9b48feaa741a6a33c92c884389397a2159e1" HandleID="k8s-pod-network.df5c6874e4d07ca64f01cb2d603b9b48feaa741a6a33c92c884389397a2159e1" Workload="localhost-k8s-coredns--66bc5c9577--r6dvt-eth0" Apr 17 02:53:35.480391 containerd[1591]: 2026-04-17 02:53:34.989 [INFO][4332] cni-plugin/k8s.go 418: Populated endpoint ContainerID="df5c6874e4d07ca64f01cb2d603b9b48feaa741a6a33c92c884389397a2159e1" Namespace="kube-system" Pod="coredns-66bc5c9577-r6dvt" WorkloadEndpoint="localhost-k8s-coredns--66bc5c9577--r6dvt-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--66bc5c9577--r6dvt-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"9c3706c6-5e02-4b80-b77e-666efd679ffb", ResourceVersion:"1064", Generation:0, CreationTimestamp:time.Date(2026, time.April, 17, 2, 51, 34, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-66bc5c9577-r6dvt", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calib48e4090ee0", MAC:"", 
Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 17 02:53:35.480391 containerd[1591]: 2026-04-17 02:53:34.990 [INFO][4332] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.136/32] ContainerID="df5c6874e4d07ca64f01cb2d603b9b48feaa741a6a33c92c884389397a2159e1" Namespace="kube-system" Pod="coredns-66bc5c9577-r6dvt" WorkloadEndpoint="localhost-k8s-coredns--66bc5c9577--r6dvt-eth0" Apr 17 02:53:35.480391 containerd[1591]: 2026-04-17 02:53:34.994 [INFO][4332] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calib48e4090ee0 ContainerID="df5c6874e4d07ca64f01cb2d603b9b48feaa741a6a33c92c884389397a2159e1" Namespace="kube-system" Pod="coredns-66bc5c9577-r6dvt" WorkloadEndpoint="localhost-k8s-coredns--66bc5c9577--r6dvt-eth0" Apr 17 02:53:35.480391 containerd[1591]: 2026-04-17 02:53:35.041 [INFO][4332] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="df5c6874e4d07ca64f01cb2d603b9b48feaa741a6a33c92c884389397a2159e1" Namespace="kube-system" Pod="coredns-66bc5c9577-r6dvt" WorkloadEndpoint="localhost-k8s-coredns--66bc5c9577--r6dvt-eth0" Apr 17 02:53:35.480391 containerd[1591]: 2026-04-17 02:53:35.043 [INFO][4332] cni-plugin/k8s.go 446: Added Mac, 
interface name, and active container ID to endpoint ContainerID="df5c6874e4d07ca64f01cb2d603b9b48feaa741a6a33c92c884389397a2159e1" Namespace="kube-system" Pod="coredns-66bc5c9577-r6dvt" WorkloadEndpoint="localhost-k8s-coredns--66bc5c9577--r6dvt-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--66bc5c9577--r6dvt-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"9c3706c6-5e02-4b80-b77e-666efd679ffb", ResourceVersion:"1064", Generation:0, CreationTimestamp:time.Date(2026, time.April, 17, 2, 51, 34, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"df5c6874e4d07ca64f01cb2d603b9b48feaa741a6a33c92c884389397a2159e1", Pod:"coredns-66bc5c9577-r6dvt", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calib48e4090ee0", MAC:"e6:79:31:53:63:df", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, 
HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 17 02:53:35.480391 containerd[1591]: 2026-04-17 02:53:35.467 [INFO][4332] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="df5c6874e4d07ca64f01cb2d603b9b48feaa741a6a33c92c884389397a2159e1" Namespace="kube-system" Pod="coredns-66bc5c9577-r6dvt" WorkloadEndpoint="localhost-k8s-coredns--66bc5c9577--r6dvt-eth0" Apr 17 02:53:35.580763 containerd[1591]: time="2026-04-17T02:53:35.580330473Z" level=info msg="connecting to shim df5c6874e4d07ca64f01cb2d603b9b48feaa741a6a33c92c884389397a2159e1" address="unix:///run/containerd/s/b8daad36c035c58c349b38349f02f62814647022dbb7294e1cf3215f7a95fceb" namespace=k8s.io protocol=ttrpc version=3 Apr 17 02:53:35.795302 systemd[1]: Started cri-containerd-df5c6874e4d07ca64f01cb2d603b9b48feaa741a6a33c92c884389397a2159e1.scope - libcontainer container df5c6874e4d07ca64f01cb2d603b9b48feaa741a6a33c92c884389397a2159e1. 
Apr 17 02:53:35.999374 systemd-resolved[1478]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Apr 17 02:53:36.248397 containerd[1591]: time="2026-04-17T02:53:36.248293050Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-r6dvt,Uid:9c3706c6-5e02-4b80-b77e-666efd679ffb,Namespace:kube-system,Attempt:0,} returns sandbox id \"df5c6874e4d07ca64f01cb2d603b9b48feaa741a6a33c92c884389397a2159e1\"" Apr 17 02:53:36.296210 systemd-networkd[1476]: calib48e4090ee0: Gained IPv6LL Apr 17 02:53:36.319827 kubelet[2815]: E0417 02:53:36.318498 2815 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Apr 17 02:53:36.533122 containerd[1591]: time="2026-04-17T02:53:36.532542921Z" level=info msg="CreateContainer within sandbox \"df5c6874e4d07ca64f01cb2d603b9b48feaa741a6a33c92c884389397a2159e1\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Apr 17 02:53:36.661857 containerd[1591]: time="2026-04-17T02:53:36.660479932Z" level=info msg="Container 63758c5993624e434898bf078e42ae9179353479b9a8eb53cd8fd43a331cc3b9: CDI devices from CRI Config.CDIDevices: []" Apr 17 02:53:36.724839 containerd[1591]: time="2026-04-17T02:53:36.724478990Z" level=info msg="CreateContainer within sandbox \"df5c6874e4d07ca64f01cb2d603b9b48feaa741a6a33c92c884389397a2159e1\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"63758c5993624e434898bf078e42ae9179353479b9a8eb53cd8fd43a331cc3b9\"" Apr 17 02:53:36.755946 containerd[1591]: time="2026-04-17T02:53:36.755505505Z" level=info msg="StartContainer for \"63758c5993624e434898bf078e42ae9179353479b9a8eb53cd8fd43a331cc3b9\"" Apr 17 02:53:36.856363 containerd[1591]: time="2026-04-17T02:53:36.855641612Z" level=info msg="connecting to shim 63758c5993624e434898bf078e42ae9179353479b9a8eb53cd8fd43a331cc3b9" 
address="unix:///run/containerd/s/b8daad36c035c58c349b38349f02f62814647022dbb7294e1cf3215f7a95fceb" protocol=ttrpc version=3 Apr 17 02:53:36.954670 systemd[1]: Started cri-containerd-63758c5993624e434898bf078e42ae9179353479b9a8eb53cd8fd43a331cc3b9.scope - libcontainer container 63758c5993624e434898bf078e42ae9179353479b9a8eb53cd8fd43a331cc3b9. Apr 17 02:53:37.287797 containerd[1591]: time="2026-04-17T02:53:37.284475424Z" level=info msg="StartContainer for \"63758c5993624e434898bf078e42ae9179353479b9a8eb53cd8fd43a331cc3b9\" returns successfully" Apr 17 02:53:37.861858 kubelet[2815]: E0417 02:53:37.859423 2815 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Apr 17 02:53:38.278423 kubelet[2815]: I0417 02:53:38.251656 2815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-66bc5c9577-r6dvt" podStartSLOduration=124.251591525 podStartE2EDuration="2m4.251591525s" podCreationTimestamp="2026-04-17 02:51:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 02:53:38.25025843 +0000 UTC m=+128.400205542" watchObservedRunningTime="2026-04-17 02:53:38.251591525 +0000 UTC m=+128.401538645" Apr 17 02:53:38.332363 kubelet[2815]: E0417 02:53:38.331519 2815 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Apr 17 02:53:38.879952 kubelet[2815]: E0417 02:53:38.879834 2815 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Apr 17 02:53:39.956020 kubelet[2815]: E0417 02:53:39.955992 2815 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been 
omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Apr 17 02:53:41.934076 containerd[1591]: time="2026-04-17T02:53:41.933931105Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 17 02:53:41.939695 containerd[1591]: time="2026-04-17T02:53:41.939635301Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.31.4: active requests=0, bytes read=48415780" Apr 17 02:53:41.948390 containerd[1591]: time="2026-04-17T02:53:41.948334259Z" level=info msg="ImageCreate event name:\"sha256:f7ff80340b9b4973ceda29859065985831588b2898f2b4009f742b5789010898\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 17 02:53:41.985855 containerd[1591]: time="2026-04-17T02:53:41.985379874Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:d212af1da3dd52a633bc9e36653a7d901d95a570f8d51d1968a837dcf6879730\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 17 02:53:41.989619 containerd[1591]: time="2026-04-17T02:53:41.989508175Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" with image id \"sha256:f7ff80340b9b4973ceda29859065985831588b2898f2b4009f742b5789010898\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:d212af1da3dd52a633bc9e36653a7d901d95a570f8d51d1968a837dcf6879730\", size \"49971841\" in 9.614000271s" Apr 17 02:53:41.990214 containerd[1591]: time="2026-04-17T02:53:41.989628094Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" returns image reference \"sha256:f7ff80340b9b4973ceda29859065985831588b2898f2b4009f742b5789010898\"" Apr 17 02:53:41.995384 containerd[1591]: time="2026-04-17T02:53:41.995273271Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\"" Apr 17 02:53:42.079795 containerd[1591]: time="2026-04-17T02:53:42.079677197Z" level=info 
msg="CreateContainer within sandbox \"d3f0847a3fef67f1c14e8a1fb64d4b5e86a104251336072a48f33bc5eca52882\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Apr 17 02:53:42.146222 containerd[1591]: time="2026-04-17T02:53:42.146157880Z" level=info msg="Container 1ac3f8e05ce19396356f3e05035ba22fd7eaa5bb47a498e2720dd45da6561f82: CDI devices from CRI Config.CDIDevices: []" Apr 17 02:53:42.230785 containerd[1591]: time="2026-04-17T02:53:42.230677905Z" level=info msg="CreateContainer within sandbox \"d3f0847a3fef67f1c14e8a1fb64d4b5e86a104251336072a48f33bc5eca52882\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"1ac3f8e05ce19396356f3e05035ba22fd7eaa5bb47a498e2720dd45da6561f82\"" Apr 17 02:53:42.249538 containerd[1591]: time="2026-04-17T02:53:42.248023148Z" level=info msg="StartContainer for \"1ac3f8e05ce19396356f3e05035ba22fd7eaa5bb47a498e2720dd45da6561f82\"" Apr 17 02:53:42.341131 containerd[1591]: time="2026-04-17T02:53:42.340854685Z" level=info msg="connecting to shim 1ac3f8e05ce19396356f3e05035ba22fd7eaa5bb47a498e2720dd45da6561f82" address="unix:///run/containerd/s/4c79e61c3fda3fda5f220e1876d35c5e6afd6047a4aa020e502ea5ae00f6fd39" protocol=ttrpc version=3 Apr 17 02:53:42.759910 systemd[1]: Started cri-containerd-1ac3f8e05ce19396356f3e05035ba22fd7eaa5bb47a498e2720dd45da6561f82.scope - libcontainer container 1ac3f8e05ce19396356f3e05035ba22fd7eaa5bb47a498e2720dd45da6561f82. 
Apr 17 02:53:43.177211 containerd[1591]: time="2026-04-17T02:53:43.175623403Z" level=info msg="StartContainer for \"1ac3f8e05ce19396356f3e05035ba22fd7eaa5bb47a498e2720dd45da6561f82\" returns successfully" Apr 17 02:53:43.656408 kubelet[2815]: I0417 02:53:43.655942 2815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-apiserver-58fdb9b8fb-4dckf" podStartSLOduration=77.378722198 podStartE2EDuration="1m30.655870551s" podCreationTimestamp="2026-04-17 02:52:13 +0000 UTC" firstStartedPulling="2026-04-17 02:53:28.716307722 +0000 UTC m=+118.866254835" lastFinishedPulling="2026-04-17 02:53:41.993456061 +0000 UTC m=+132.143403188" observedRunningTime="2026-04-17 02:53:43.650097642 +0000 UTC m=+133.800044755" watchObservedRunningTime="2026-04-17 02:53:43.655870551 +0000 UTC m=+133.805817674" Apr 17 02:53:45.468455 systemd-networkd[1476]: vxlan.calico: Link UP Apr 17 02:53:45.471250 systemd-networkd[1476]: vxlan.calico: Gained carrier Apr 17 02:53:46.759254 systemd-networkd[1476]: vxlan.calico: Gained IPv6LL Apr 17 02:53:48.740481 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount215178230.mount: Deactivated successfully. 
Apr 17 02:53:48.977665 containerd[1591]: time="2026-04-17T02:53:48.977429550Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 17 02:53:49.079251 containerd[1591]: time="2026-04-17T02:53:49.067521074Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.31.4: active requests=0, bytes read=17609475" Apr 17 02:53:49.117928 containerd[1591]: time="2026-04-17T02:53:49.117538814Z" level=info msg="ImageCreate event name:\"sha256:0749e3da0398e8402eb119f09acf145e5dd9759adb6eb3802ad6dc1b9bbedf1c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 17 02:53:49.192872 containerd[1591]: time="2026-04-17T02:53:49.192363911Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:d252061aa298c4b17cf092517b5126af97cf95e0f56b21281b95a5f8702f15fc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 17 02:53:49.248833 containerd[1591]: time="2026-04-17T02:53:49.247690790Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\" with image id \"sha256:0749e3da0398e8402eb119f09acf145e5dd9759adb6eb3802ad6dc1b9bbedf1c\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:d252061aa298c4b17cf092517b5126af97cf95e0f56b21281b95a5f8702f15fc\", size \"17609305\" in 7.252237939s" Apr 17 02:53:49.250007 containerd[1591]: time="2026-04-17T02:53:49.249229478Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\" returns image reference \"sha256:0749e3da0398e8402eb119f09acf145e5dd9759adb6eb3802ad6dc1b9bbedf1c\"" Apr 17 02:53:49.353330 containerd[1591]: time="2026-04-17T02:53:49.350438469Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\"" Apr 17 02:53:49.529509 containerd[1591]: time="2026-04-17T02:53:49.528830407Z" level=info msg="CreateContainer within sandbox 
\"37f7af7420523bec3640e79187ba7400d0a165ad92fecfaf635a87e2b15dcaa6\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Apr 17 02:53:49.861882 containerd[1591]: time="2026-04-17T02:53:49.859667982Z" level=info msg="Container a13add0a9e28934abdc4369d868cb4871e7a13c0d5324a4c52f8fcd02b153718: CDI devices from CRI Config.CDIDevices: []" Apr 17 02:53:49.925289 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3845147551.mount: Deactivated successfully. Apr 17 02:53:50.098246 containerd[1591]: time="2026-04-17T02:53:50.097494126Z" level=info msg="CreateContainer within sandbox \"37f7af7420523bec3640e79187ba7400d0a165ad92fecfaf635a87e2b15dcaa6\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"a13add0a9e28934abdc4369d868cb4871e7a13c0d5324a4c52f8fcd02b153718\"" Apr 17 02:53:50.111486 containerd[1591]: time="2026-04-17T02:53:50.111410568Z" level=info msg="StartContainer for \"a13add0a9e28934abdc4369d868cb4871e7a13c0d5324a4c52f8fcd02b153718\"" Apr 17 02:53:50.145464 containerd[1591]: time="2026-04-17T02:53:50.144367794Z" level=info msg="connecting to shim a13add0a9e28934abdc4369d868cb4871e7a13c0d5324a4c52f8fcd02b153718" address="unix:///run/containerd/s/970383b779f78143a07d32fb252c4d40a8cf3e5ad1ee0493300d7635da384e5c" protocol=ttrpc version=3 Apr 17 02:53:50.533976 systemd[1]: Started cri-containerd-a13add0a9e28934abdc4369d868cb4871e7a13c0d5324a4c52f8fcd02b153718.scope - libcontainer container a13add0a9e28934abdc4369d868cb4871e7a13c0d5324a4c52f8fcd02b153718. 
Apr 17 02:53:51.741747 containerd[1591]: time="2026-04-17T02:53:51.741192957Z" level=info msg="StartContainer for \"a13add0a9e28934abdc4369d868cb4871e7a13c0d5324a4c52f8fcd02b153718\" returns successfully" Apr 17 02:53:53.173483 kubelet[2815]: E0417 02:53:53.172253 2815 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Apr 17 02:53:55.245585 kubelet[2815]: E0417 02:53:55.245410 2815 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Apr 17 02:54:04.239536 kubelet[2815]: E0417 02:54:04.238523 2815 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Apr 17 02:54:07.180667 containerd[1591]: time="2026-04-17T02:54:07.172484363Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 17 02:54:07.180667 containerd[1591]: time="2026-04-17T02:54:07.174466374Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.31.4: active requests=0, bytes read=52406348" Apr 17 02:54:07.248770 containerd[1591]: time="2026-04-17T02:54:07.248647080Z" level=info msg="ImageCreate event name:\"sha256:ff033cc89dab51090bfa1b04e155a5ce1e3b1f324f74b7b2be0dd6f0b6b10e89\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 17 02:54:07.250390 containerd[1591]: time="2026-04-17T02:54:07.250137563Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:99b8bb50141ca55b4b6ddfcf2f2fbde838265508ab2ac96ed08e72cd39800713\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 17 02:54:07.256615 containerd[1591]: time="2026-04-17T02:54:07.256501979Z" level=info msg="Pulled 
image \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\" with image id \"sha256:ff033cc89dab51090bfa1b04e155a5ce1e3b1f324f74b7b2be0dd6f0b6b10e89\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:99b8bb50141ca55b4b6ddfcf2f2fbde838265508ab2ac96ed08e72cd39800713\", size \"53962361\" in 17.904032778s" Apr 17 02:54:07.256615 containerd[1591]: time="2026-04-17T02:54:07.256602045Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\" returns image reference \"sha256:ff033cc89dab51090bfa1b04e155a5ce1e3b1f324f74b7b2be0dd6f0b6b10e89\"" Apr 17 02:54:07.458216 containerd[1591]: time="2026-04-17T02:54:07.456476438Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.31.4\"" Apr 17 02:54:07.690665 containerd[1591]: time="2026-04-17T02:54:07.690541213Z" level=info msg="CreateContainer within sandbox \"a5616d5d6bcfaf66dce5cf66a390851af4dfcb576f06b727bccb967701f13386\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Apr 17 02:54:07.839436 containerd[1591]: time="2026-04-17T02:54:07.836254541Z" level=info msg="Container db4b93ada77d3f1ac23aba59e9bcab205acc0855a4fd106372a32d98ddd362e4: CDI devices from CRI Config.CDIDevices: []" Apr 17 02:54:07.953770 containerd[1591]: time="2026-04-17T02:54:07.953561029Z" level=info msg="CreateContainer within sandbox \"a5616d5d6bcfaf66dce5cf66a390851af4dfcb576f06b727bccb967701f13386\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"db4b93ada77d3f1ac23aba59e9bcab205acc0855a4fd106372a32d98ddd362e4\"" Apr 17 02:54:07.959232 containerd[1591]: time="2026-04-17T02:54:07.959140545Z" level=info msg="StartContainer for \"db4b93ada77d3f1ac23aba59e9bcab205acc0855a4fd106372a32d98ddd362e4\"" Apr 17 02:54:07.992226 containerd[1591]: time="2026-04-17T02:54:07.992140723Z" level=info msg="connecting to shim db4b93ada77d3f1ac23aba59e9bcab205acc0855a4fd106372a32d98ddd362e4" 
address="unix:///run/containerd/s/7dfe8bc3196bdee86ba37ced9b97342e6ac166348e6087c465addc33eb84bf00" protocol=ttrpc version=3 Apr 17 02:54:08.366441 systemd[1]: Started cri-containerd-db4b93ada77d3f1ac23aba59e9bcab205acc0855a4fd106372a32d98ddd362e4.scope - libcontainer container db4b93ada77d3f1ac23aba59e9bcab205acc0855a4fd106372a32d98ddd362e4. Apr 17 02:54:08.866038 containerd[1591]: time="2026-04-17T02:54:08.865792029Z" level=info msg="StartContainer for \"db4b93ada77d3f1ac23aba59e9bcab205acc0855a4fd106372a32d98ddd362e4\" returns successfully" Apr 17 02:54:11.382775 kubelet[2815]: E0417 02:54:11.379553 2815 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Apr 17 02:54:11.450002 kubelet[2815]: I0417 02:54:11.435399 2815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-74db899c95-7npzx" podStartSLOduration=29.132685643 podStartE2EDuration="56.434965729s" podCreationTimestamp="2026-04-17 02:53:15 +0000 UTC" firstStartedPulling="2026-04-17 02:53:22.028482905 +0000 UTC m=+112.178430015" lastFinishedPulling="2026-04-17 02:53:49.330762995 +0000 UTC m=+139.480710101" observedRunningTime="2026-04-17 02:53:52.740225708 +0000 UTC m=+142.890172832" watchObservedRunningTime="2026-04-17 02:54:11.434965729 +0000 UTC m=+161.584912847" Apr 17 02:54:11.544364 kubelet[2815]: I0417 02:54:11.477580 2815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-6fbccc5db4-vvpvq" podStartSLOduration=75.188188261 podStartE2EDuration="1m51.477149372s" podCreationTimestamp="2026-04-17 02:52:20 +0000 UTC" firstStartedPulling="2026-04-17 02:53:31.088968991 +0000 UTC m=+121.238916100" lastFinishedPulling="2026-04-17 02:54:07.377930098 +0000 UTC m=+157.527877211" observedRunningTime="2026-04-17 02:54:11.426340273 +0000 UTC m=+161.576287395" watchObservedRunningTime="2026-04-17 
02:54:11.477149372 +0000 UTC m=+161.627096485" Apr 17 02:54:16.813947 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount234202755.mount: Deactivated successfully. Apr 17 02:54:23.014255 containerd[1591]: time="2026-04-17T02:54:23.013874596Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 17 02:54:23.017883 containerd[1591]: time="2026-04-17T02:54:23.017786183Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.31.4: active requests=0, bytes read=55623386" Apr 17 02:54:23.028960 containerd[1591]: time="2026-04-17T02:54:23.028887924Z" level=info msg="ImageCreate event name:\"sha256:714983e5e920bbe810fab04d9f06bd16ef4e552b0d2deffd7ab2b2c4a001acbb\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 17 02:54:23.157735 containerd[1591]: time="2026-04-17T02:54:23.157509360Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:44395ca5ebfe88f21ed51acfbec5fc0f31d2762966e2007a0a2eb9b30e35fc4d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 17 02:54:23.167910 containerd[1591]: time="2026-04-17T02:54:23.167542294Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.31.4\" with image id \"sha256:714983e5e920bbe810fab04d9f06bd16ef4e552b0d2deffd7ab2b2c4a001acbb\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:44395ca5ebfe88f21ed51acfbec5fc0f31d2762966e2007a0a2eb9b30e35fc4d\", size \"55623232\" in 15.710494787s" Apr 17 02:54:23.167910 containerd[1591]: time="2026-04-17T02:54:23.167599024Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.31.4\" returns image reference \"sha256:714983e5e920bbe810fab04d9f06bd16ef4e552b0d2deffd7ab2b2c4a001acbb\"" Apr 17 02:54:23.274682 containerd[1591]: time="2026-04-17T02:54:23.273955542Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\"" Apr 17 
02:54:23.359819 containerd[1591]: time="2026-04-17T02:54:23.358441057Z" level=info msg="CreateContainer within sandbox \"312e90b1be4c6891c8189148ab7046e3bfb63cc22596af302b0b6e366ffe608a\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Apr 17 02:54:23.464101 containerd[1591]: time="2026-04-17T02:54:23.462666232Z" level=info msg="Container c654cc1f37cbe3d38b15e00ba2f8fb6808cef0bb106cfe98f3fa859742ad5ca2: CDI devices from CRI Config.CDIDevices: []" Apr 17 02:54:23.524596 containerd[1591]: time="2026-04-17T02:54:23.524457977Z" level=info msg="CreateContainer within sandbox \"312e90b1be4c6891c8189148ab7046e3bfb63cc22596af302b0b6e366ffe608a\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"c654cc1f37cbe3d38b15e00ba2f8fb6808cef0bb106cfe98f3fa859742ad5ca2\"" Apr 17 02:54:23.564795 containerd[1591]: time="2026-04-17T02:54:23.561696900Z" level=info msg="StartContainer for \"c654cc1f37cbe3d38b15e00ba2f8fb6808cef0bb106cfe98f3fa859742ad5ca2\"" Apr 17 02:54:23.601089 containerd[1591]: time="2026-04-17T02:54:23.600688137Z" level=info msg="connecting to shim c654cc1f37cbe3d38b15e00ba2f8fb6808cef0bb106cfe98f3fa859742ad5ca2" address="unix:///run/containerd/s/1ca422bf56111f01482b669136e2c4b7faae00baabf7c1977b1b9e774a80444c" protocol=ttrpc version=3 Apr 17 02:54:23.863828 systemd[1]: Started cri-containerd-c654cc1f37cbe3d38b15e00ba2f8fb6808cef0bb106cfe98f3fa859742ad5ca2.scope - libcontainer container c654cc1f37cbe3d38b15e00ba2f8fb6808cef0bb106cfe98f3fa859742ad5ca2. 
Apr 17 02:54:24.093386 containerd[1591]: time="2026-04-17T02:54:24.093234484Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 17 02:54:24.095997 containerd[1591]: time="2026-04-17T02:54:24.093393991Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.31.4: active requests=0, bytes read=77" Apr 17 02:54:24.142773 containerd[1591]: time="2026-04-17T02:54:24.142179629Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" with image id \"sha256:f7ff80340b9b4973ceda29859065985831588b2898f2b4009f742b5789010898\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:d212af1da3dd52a633bc9e36653a7d901d95a570f8d51d1968a837dcf6879730\", size \"49971841\" in 867.82762ms" Apr 17 02:54:24.143676 containerd[1591]: time="2026-04-17T02:54:24.142812588Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" returns image reference \"sha256:f7ff80340b9b4973ceda29859065985831588b2898f2b4009f742b5789010898\"" Apr 17 02:54:24.191108 containerd[1591]: time="2026-04-17T02:54:24.190931972Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\"" Apr 17 02:54:24.327767 containerd[1591]: time="2026-04-17T02:54:24.323489382Z" level=info msg="CreateContainer within sandbox \"7643edd6669a722af10649e1f4987292a64bbdf753cc7ffccb57bba5660eaa35\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Apr 17 02:54:24.530883 containerd[1591]: time="2026-04-17T02:54:24.530227551Z" level=info msg="StartContainer for \"c654cc1f37cbe3d38b15e00ba2f8fb6808cef0bb106cfe98f3fa859742ad5ca2\" returns successfully" Apr 17 02:54:24.688764 containerd[1591]: time="2026-04-17T02:54:24.672132447Z" level=info msg="Container 23aed4307e3ea6b3718b99b2c56ecc67b1d4c5c822c2ae30a66ecb68d49b61ec: CDI devices from CRI Config.CDIDevices: []" Apr 17 02:54:24.773959 
containerd[1591]: time="2026-04-17T02:54:24.770353148Z" level=info msg="CreateContainer within sandbox \"7643edd6669a722af10649e1f4987292a64bbdf753cc7ffccb57bba5660eaa35\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"23aed4307e3ea6b3718b99b2c56ecc67b1d4c5c822c2ae30a66ecb68d49b61ec\"" Apr 17 02:54:24.848567 containerd[1591]: time="2026-04-17T02:54:24.848176620Z" level=info msg="StartContainer for \"23aed4307e3ea6b3718b99b2c56ecc67b1d4c5c822c2ae30a66ecb68d49b61ec\"" Apr 17 02:54:24.869205 containerd[1591]: time="2026-04-17T02:54:24.869076228Z" level=info msg="connecting to shim 23aed4307e3ea6b3718b99b2c56ecc67b1d4c5c822c2ae30a66ecb68d49b61ec" address="unix:///run/containerd/s/35208740b07d8d72aa6c45261143899fd07fd6a02b08b4631dbd4136bfccce80" protocol=ttrpc version=3 Apr 17 02:54:25.227617 systemd[1]: Started cri-containerd-23aed4307e3ea6b3718b99b2c56ecc67b1d4c5c822c2ae30a66ecb68d49b61ec.scope - libcontainer container 23aed4307e3ea6b3718b99b2c56ecc67b1d4c5c822c2ae30a66ecb68d49b61ec. 
Apr 17 02:54:25.813541 kubelet[2815]: I0417 02:54:25.790545 2815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-cccfbd5cf-7sj6j" podStartSLOduration=81.089677648 podStartE2EDuration="2m11.789646101s" podCreationTimestamp="2026-04-17 02:52:14 +0000 UTC" firstStartedPulling="2026-04-17 02:53:32.495615394 +0000 UTC m=+122.645562508" lastFinishedPulling="2026-04-17 02:54:23.195583846 +0000 UTC m=+173.345530961" observedRunningTime="2026-04-17 02:54:25.754199404 +0000 UTC m=+175.904146516" watchObservedRunningTime="2026-04-17 02:54:25.789646101 +0000 UTC m=+175.939593246" Apr 17 02:54:26.647136 containerd[1591]: time="2026-04-17T02:54:26.647061569Z" level=info msg="StartContainer for \"23aed4307e3ea6b3718b99b2c56ecc67b1d4c5c822c2ae30a66ecb68d49b61ec\" returns successfully" Apr 17 02:54:27.698850 kubelet[2815]: I0417 02:54:27.686544 2815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-apiserver-58fdb9b8fb-rtk29" podStartSLOduration=84.943948214 podStartE2EDuration="2m14.670600942s" podCreationTimestamp="2026-04-17 02:52:13 +0000 UTC" firstStartedPulling="2026-04-17 02:53:34.445366105 +0000 UTC m=+124.595313220" lastFinishedPulling="2026-04-17 02:54:24.172018834 +0000 UTC m=+174.321965948" observedRunningTime="2026-04-17 02:54:27.643704976 +0000 UTC m=+177.793652089" watchObservedRunningTime="2026-04-17 02:54:27.670600942 +0000 UTC m=+177.820548055" Apr 17 02:54:29.832692 containerd[1591]: time="2026-04-17T02:54:29.832345609Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 17 02:54:29.836487 containerd[1591]: time="2026-04-17T02:54:29.834340602Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4: active requests=0, bytes read=14704317" Apr 17 02:54:29.852209 containerd[1591]: time="2026-04-17T02:54:29.851401446Z" level=info 
msg="ImageCreate event name:\"sha256:d7aeb99114cbb6499e9048f43d3faa5f199d1a05ed44165e5974d0368ac32771\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 17 02:54:29.945998 containerd[1591]: time="2026-04-17T02:54:29.943296468Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:e41c0d73bcd33ff28ae2f2983cf781a4509d212e102d53883dbbf436ab3cd97d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 17 02:54:29.974164 containerd[1591]: time="2026-04-17T02:54:29.973875891Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\" with image id \"sha256:d7aeb99114cbb6499e9048f43d3faa5f199d1a05ed44165e5974d0368ac32771\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:e41c0d73bcd33ff28ae2f2983cf781a4509d212e102d53883dbbf436ab3cd97d\", size \"16260314\" in 5.782205826s" Apr 17 02:54:29.976860 containerd[1591]: time="2026-04-17T02:54:29.974155859Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\" returns image reference \"sha256:d7aeb99114cbb6499e9048f43d3faa5f199d1a05ed44165e5974d0368ac32771\"" Apr 17 02:54:30.023776 containerd[1591]: time="2026-04-17T02:54:30.023057001Z" level=info msg="CreateContainer within sandbox \"b4ed30fafdec2717e6d35b5e65ac3cc512d8bea5f2a5fdc4ffec3a0a897f0120\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Apr 17 02:54:30.156845 containerd[1591]: time="2026-04-17T02:54:30.146559490Z" level=info msg="Container 9ab0b258ca76415d10f27744bfe4936fd254cfc8106c2b998897aea40ef665f9: CDI devices from CRI Config.CDIDevices: []" Apr 17 02:54:30.274100 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount355639475.mount: Deactivated successfully. 
Apr 17 02:54:30.575527 containerd[1591]: time="2026-04-17T02:54:30.575013920Z" level=info msg="CreateContainer within sandbox \"b4ed30fafdec2717e6d35b5e65ac3cc512d8bea5f2a5fdc4ffec3a0a897f0120\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"9ab0b258ca76415d10f27744bfe4936fd254cfc8106c2b998897aea40ef665f9\"" Apr 17 02:54:30.599873 containerd[1591]: time="2026-04-17T02:54:30.599104189Z" level=info msg="StartContainer for \"9ab0b258ca76415d10f27744bfe4936fd254cfc8106c2b998897aea40ef665f9\"" Apr 17 02:54:30.732794 containerd[1591]: time="2026-04-17T02:54:30.724607927Z" level=info msg="connecting to shim 9ab0b258ca76415d10f27744bfe4936fd254cfc8106c2b998897aea40ef665f9" address="unix:///run/containerd/s/b532ae8565d1bdd48c4efb9fba9cad9475541238e86a7d68e1c2d3386f467f11" protocol=ttrpc version=3 Apr 17 02:54:31.154988 systemd[1]: Started cri-containerd-9ab0b258ca76415d10f27744bfe4936fd254cfc8106c2b998897aea40ef665f9.scope - libcontainer container 9ab0b258ca76415d10f27744bfe4936fd254cfc8106c2b998897aea40ef665f9. 
Apr 17 02:54:32.158682 containerd[1591]: time="2026-04-17T02:54:32.156444157Z" level=info msg="StartContainer for \"9ab0b258ca76415d10f27744bfe4936fd254cfc8106c2b998897aea40ef665f9\" returns successfully" Apr 17 02:54:33.475901 kubelet[2815]: I0417 02:54:33.470134 2815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-vnr2q" podStartSLOduration=68.147506685 podStartE2EDuration="2m13.470106795s" podCreationTimestamp="2026-04-17 02:52:20 +0000 UTC" firstStartedPulling="2026-04-17 02:53:24.663759764 +0000 UTC m=+114.813706882" lastFinishedPulling="2026-04-17 02:54:29.986359882 +0000 UTC m=+180.136306992" observedRunningTime="2026-04-17 02:54:33.457351092 +0000 UTC m=+183.607298205" watchObservedRunningTime="2026-04-17 02:54:33.470106795 +0000 UTC m=+183.620053919" Apr 17 02:54:33.957852 kubelet[2815]: I0417 02:54:33.957222 2815 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Apr 17 02:54:33.961435 kubelet[2815]: I0417 02:54:33.960658 2815 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Apr 17 02:54:49.216806 kubelet[2815]: E0417 02:54:49.213956 2815 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Apr 17 02:54:54.591572 kubelet[2815]: E0417 02:54:54.578408 2815 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Apr 17 02:54:56.671582 containerd[1591]: time="2026-04-17T02:54:56.669115578Z" level=error msg="ExecSync for \"817d2d727bd5978bb30e080d429a071dedf866acf3e576b7803f527897950a8e\" failed" error="rpc error: code = DeadlineExceeded desc = failed to exec in 
container: timeout 10s exceeded: context deadline exceeded" Apr 17 02:54:56.716877 kubelet[2815]: E0417 02:54:56.698646 2815 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = DeadlineExceeded desc = failed to exec in container: timeout 10s exceeded: context deadline exceeded" containerID="817d2d727bd5978bb30e080d429a071dedf866acf3e576b7803f527897950a8e" cmd=["/bin/calico-node","-bird-ready","-felix-ready"] Apr 17 02:55:04.156092 kubelet[2815]: E0417 02:55:04.155843 2815 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Apr 17 02:55:07.167454 kubelet[2815]: E0417 02:55:07.167252 2815 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Apr 17 02:55:21.181407 kubelet[2815]: E0417 02:55:21.147282 2815 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Apr 17 02:55:22.163911 kubelet[2815]: E0417 02:55:22.163061 2815 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Apr 17 02:55:31.323408 kubelet[2815]: E0417 02:55:31.321645 2815 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Apr 17 02:56:10.164755 kubelet[2815]: E0417 02:56:10.164642 2815 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Apr 17 02:56:12.167864 kubelet[2815]: E0417 02:56:12.167223 2815 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were 
exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Apr 17 02:56:12.174793 kubelet[2815]: E0417 02:56:12.167300 2815 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Apr 17 02:56:17.087160 containerd[1591]: time="2026-04-17T02:56:17.056382443Z" level=warning msg="container event discarded" container=e13303455056b094dffcae933e74ebc857d5fddda90b6c6410d8594467b93bf2 type=CONTAINER_CREATED_EVENT Apr 17 02:56:17.117304 containerd[1591]: time="2026-04-17T02:56:17.090785470Z" level=warning msg="container event discarded" container=e13303455056b094dffcae933e74ebc857d5fddda90b6c6410d8594467b93bf2 type=CONTAINER_STARTED_EVENT Apr 17 02:56:17.150529 containerd[1591]: time="2026-04-17T02:56:17.140422656Z" level=warning msg="container event discarded" container=34db2668ce07c2adde60359433cc9d1636cd629e27a322f34a52e98210ef9454 type=CONTAINER_CREATED_EVENT Apr 17 02:56:17.152194 containerd[1591]: time="2026-04-17T02:56:17.150461109Z" level=warning msg="container event discarded" container=34db2668ce07c2adde60359433cc9d1636cd629e27a322f34a52e98210ef9454 type=CONTAINER_STARTED_EVENT Apr 17 02:56:17.152517 containerd[1591]: time="2026-04-17T02:56:17.151918086Z" level=warning msg="container event discarded" container=0dda3cd6f119ebba1ff1d2be0acac79d200a87607faabc2f2409103ba508cb72 type=CONTAINER_CREATED_EVENT Apr 17 02:56:17.152517 containerd[1591]: time="2026-04-17T02:56:17.152301510Z" level=warning msg="container event discarded" container=0dda3cd6f119ebba1ff1d2be0acac79d200a87607faabc2f2409103ba508cb72 type=CONTAINER_STARTED_EVENT Apr 17 02:56:17.152517 containerd[1591]: time="2026-04-17T02:56:17.152333279Z" level=warning msg="container event discarded" container=f5db9cb312778702aea06d6cb98afb8b5ca3480886b541ef5bfbd31af1cc01e4 type=CONTAINER_CREATED_EVENT Apr 17 02:56:17.152517 containerd[1591]: 
time="2026-04-17T02:56:17.152344962Z" level=warning msg="container event discarded" container=0078b4ad2ea05b6c8341c641459daae476d9c724f52e5f45af1d51b6c1dbc049 type=CONTAINER_CREATED_EVENT Apr 17 02:56:17.152517 containerd[1591]: time="2026-04-17T02:56:17.152431159Z" level=warning msg="container event discarded" container=224b84b170a43575538ea399b6c024e7f6e7a7911065418d1f5e8a0a3bcc9ae7 type=CONTAINER_CREATED_EVENT Apr 17 02:56:17.362396 containerd[1591]: time="2026-04-17T02:56:17.359529901Z" level=warning msg="container event discarded" container=f5db9cb312778702aea06d6cb98afb8b5ca3480886b541ef5bfbd31af1cc01e4 type=CONTAINER_STARTED_EVENT Apr 17 02:56:17.399799 containerd[1591]: time="2026-04-17T02:56:17.399293067Z" level=warning msg="container event discarded" container=0078b4ad2ea05b6c8341c641459daae476d9c724f52e5f45af1d51b6c1dbc049 type=CONTAINER_STARTED_EVENT Apr 17 02:56:17.550518 containerd[1591]: time="2026-04-17T02:56:17.550007544Z" level=warning msg="container event discarded" container=224b84b170a43575538ea399b6c024e7f6e7a7911065418d1f5e8a0a3bcc9ae7 type=CONTAINER_STARTED_EVENT Apr 17 02:56:32.222396 kubelet[2815]: E0417 02:56:32.222083 2815 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Apr 17 02:56:33.210003 kubelet[2815]: E0417 02:56:33.209043 2815 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Apr 17 02:56:35.713820 containerd[1591]: time="2026-04-17T02:56:35.710201997Z" level=warning msg="container event discarded" container=aeea2a0df949f00ae4bade436ddf0addb4cbbcdd0ba49e8ca74926306125cdd7 type=CONTAINER_CREATED_EVENT Apr 17 02:56:35.713820 containerd[1591]: time="2026-04-17T02:56:35.711007647Z" level=warning msg="container event discarded" 
container=aeea2a0df949f00ae4bade436ddf0addb4cbbcdd0ba49e8ca74926306125cdd7 type=CONTAINER_STARTED_EVENT Apr 17 02:56:35.866994 containerd[1591]: time="2026-04-17T02:56:35.865577386Z" level=warning msg="container event discarded" container=14e373a6470e868db7c912aea094d0912af5c2d59009a662c1f4632012bef17c type=CONTAINER_CREATED_EVENT Apr 17 02:56:36.183630 containerd[1591]: time="2026-04-17T02:56:36.183354213Z" level=warning msg="container event discarded" container=14e373a6470e868db7c912aea094d0912af5c2d59009a662c1f4632012bef17c type=CONTAINER_STARTED_EVENT Apr 17 02:56:36.315978 containerd[1591]: time="2026-04-17T02:56:36.303217048Z" level=warning msg="container event discarded" container=d26af431b516b6454db73520e2f5db7b95217135bbc5f81553184ea0392e7514 type=CONTAINER_CREATED_EVENT Apr 17 02:56:36.321134 containerd[1591]: time="2026-04-17T02:56:36.320199729Z" level=warning msg="container event discarded" container=d26af431b516b6454db73520e2f5db7b95217135bbc5f81553184ea0392e7514 type=CONTAINER_STARTED_EVENT Apr 17 02:56:36.487105 kubelet[2815]: E0417 02:56:36.483290 2815 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Apr 17 02:56:40.193777 kubelet[2815]: E0417 02:56:40.193610 2815 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Apr 17 02:56:41.750998 containerd[1591]: time="2026-04-17T02:56:41.750175187Z" level=warning msg="container event discarded" container=f8696890ae0bab14f2ee30a53c344df75b5bbb8795fc62b28f67780f3655bc52 type=CONTAINER_CREATED_EVENT Apr 17 02:56:41.960463 containerd[1591]: time="2026-04-17T02:56:41.959895883Z" level=warning msg="container event discarded" container=f8696890ae0bab14f2ee30a53c344df75b5bbb8795fc62b28f67780f3655bc52 type=CONTAINER_STARTED_EVENT Apr 17 02:57:16.273371 kubelet[2815]: E0417 
02:57:16.272993 2815 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Apr 17 02:57:19.270146 containerd[1591]: time="2026-04-17T02:57:19.267040161Z" level=warning msg="container event discarded" container=97c4d1dc104e340a01504a628a7ab8b669f00fcc78e14419699a72e4bc8c3128 type=CONTAINER_CREATED_EVENT Apr 17 02:57:19.276770 containerd[1591]: time="2026-04-17T02:57:19.273328275Z" level=warning msg="container event discarded" container=97c4d1dc104e340a01504a628a7ab8b669f00fcc78e14419699a72e4bc8c3128 type=CONTAINER_STARTED_EVENT Apr 17 02:57:20.374910 kubelet[2815]: E0417 02:57:20.374796 2815 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Apr 17 02:57:21.195652 containerd[1591]: time="2026-04-17T02:57:21.195278638Z" level=warning msg="container event discarded" container=596c48fb2b72db55d746dd322e977311aaeba1a2a22c59d5edec09ee8e4f0a58 type=CONTAINER_CREATED_EVENT Apr 17 02:57:21.198277 containerd[1591]: time="2026-04-17T02:57:21.197464402Z" level=warning msg="container event discarded" container=596c48fb2b72db55d746dd322e977311aaeba1a2a22c59d5edec09ee8e4f0a58 type=CONTAINER_STARTED_EVENT Apr 17 02:57:24.679765 containerd[1591]: time="2026-04-17T02:57:24.678161273Z" level=warning msg="container event discarded" container=51388955b0b26766fb6f2f0d4320423050b5c5a6d2fcfac9c4e50ebdf3f65345 type=CONTAINER_CREATED_EVENT Apr 17 02:57:24.980345 containerd[1591]: time="2026-04-17T02:57:24.977173598Z" level=warning msg="container event discarded" container=51388955b0b26766fb6f2f0d4320423050b5c5a6d2fcfac9c4e50ebdf3f65345 type=CONTAINER_STARTED_EVENT Apr 17 02:57:27.105442 containerd[1591]: time="2026-04-17T02:57:27.104294018Z" level=warning msg="container event discarded" 
container=be3e3b4ec5f9974c1fb865ab923aac83d2558d139da0065c781849fb1f6c340e type=CONTAINER_CREATED_EVENT Apr 17 02:57:27.947359 containerd[1591]: time="2026-04-17T02:57:27.945593794Z" level=warning msg="container event discarded" container=be3e3b4ec5f9974c1fb865ab923aac83d2558d139da0065c781849fb1f6c340e type=CONTAINER_STARTED_EVENT Apr 17 02:57:28.392505 containerd[1591]: time="2026-04-17T02:57:28.390189469Z" level=warning msg="container event discarded" container=be3e3b4ec5f9974c1fb865ab923aac83d2558d139da0065c781849fb1f6c340e type=CONTAINER_STOPPED_EVENT Apr 17 02:57:30.433665 kubelet[2815]: E0417 02:57:30.433357 2815 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Apr 17 02:57:45.151533 kubelet[2815]: E0417 02:57:45.151483 2815 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Apr 17 02:57:47.154251 kubelet[2815]: E0417 02:57:47.154191 2815 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Apr 17 02:57:51.181376 kubelet[2815]: E0417 02:57:51.181216 2815 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Apr 17 02:57:52.698462 containerd[1591]: time="2026-04-17T02:57:52.696681313Z" level=warning msg="container event discarded" container=5a89b90594254cab3b424d4f30c24552a1235c6b760b138b3e5e5e6738150316 type=CONTAINER_CREATED_EVENT Apr 17 02:57:53.093414 containerd[1591]: time="2026-04-17T02:57:53.092912334Z" level=warning msg="container event discarded" container=5a89b90594254cab3b424d4f30c24552a1235c6b760b138b3e5e5e6738150316 type=CONTAINER_STARTED_EVENT Apr 17 02:57:53.720299 
containerd[1591]: time="2026-04-17T02:57:53.719812019Z" level=warning msg="container event discarded" container=5a89b90594254cab3b424d4f30c24552a1235c6b760b138b3e5e5e6738150316 type=CONTAINER_STOPPED_EVENT Apr 17 02:57:55.156923 kubelet[2815]: E0417 02:57:55.156513 2815 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Apr 17 02:58:00.070702 containerd[1591]: time="2026-04-17T02:58:00.070375474Z" level=warning msg="container event discarded" container=77eb3f9503c53852756ce5e4559df523aac6b7260481d02bee3f3ffb9590e4bd type=CONTAINER_CREATED_EVENT Apr 17 02:58:00.447679 containerd[1591]: time="2026-04-17T02:58:00.444540482Z" level=warning msg="container event discarded" container=77eb3f9503c53852756ce5e4559df523aac6b7260481d02bee3f3ffb9590e4bd type=CONTAINER_STARTED_EVENT Apr 17 02:58:04.436761 containerd[1591]: time="2026-04-17T02:58:04.435464233Z" level=warning msg="container event discarded" container=77eb3f9503c53852756ce5e4559df523aac6b7260481d02bee3f3ffb9590e4bd type=CONTAINER_STOPPED_EVENT Apr 17 02:58:06.456411 containerd[1591]: time="2026-04-17T02:58:06.453496480Z" level=warning msg="container event discarded" container=817d2d727bd5978bb30e080d429a071dedf866acf3e576b7803f527897950a8e type=CONTAINER_CREATED_EVENT Apr 17 02:58:08.972150 containerd[1591]: time="2026-04-17T02:58:08.971399867Z" level=warning msg="container event discarded" container=817d2d727bd5978bb30e080d429a071dedf866acf3e576b7803f527897950a8e type=CONTAINER_STARTED_EVENT Apr 17 02:58:19.386472 systemd[1]: Started sshd@9-10.0.0.21:22-10.0.0.1:56112.service - OpenSSH per-connection server daemon (10.0.0.1:56112). 
Apr 17 02:58:19.729359 sshd[6614]: Accepted publickey for core from 10.0.0.1 port 56112 ssh2: RSA SHA256:NqGHB9rSHtMUQkJqVjWjGfsIhZEnp/FTWIhF91T30gc Apr 17 02:58:19.742217 sshd-session[6614]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 17 02:58:19.849157 systemd-logind[1579]: New session 10 of user core. Apr 17 02:58:19.861796 systemd[1]: Started session-10.scope - Session 10 of User core. Apr 17 02:58:20.158772 kubelet[2815]: E0417 02:58:20.158569 2815 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Apr 17 02:58:21.626354 sshd[6617]: Connection closed by 10.0.0.1 port 56112 Apr 17 02:58:21.635112 sshd-session[6614]: pam_unix(sshd:session): session closed for user core Apr 17 02:58:21.682927 systemd[1]: sshd@9-10.0.0.21:22-10.0.0.1:56112.service: Deactivated successfully. Apr 17 02:58:21.747466 systemd[1]: session-10.scope: Deactivated successfully. Apr 17 02:58:21.747954 systemd[1]: session-10.scope: Consumed 1.084s CPU time, 51.1M memory peak. Apr 17 02:58:21.766565 systemd-logind[1579]: Session 10 logged out. Waiting for processes to exit. Apr 17 02:58:21.769700 systemd-logind[1579]: Removed session 10. 
Apr 17 02:58:22.027768 containerd[1591]: time="2026-04-17T02:58:22.022696825Z" level=warning msg="container event discarded" container=37f7af7420523bec3640e79187ba7400d0a165ad92fecfaf635a87e2b15dcaa6 type=CONTAINER_CREATED_EVENT Apr 17 02:58:22.027768 containerd[1591]: time="2026-04-17T02:58:22.027803268Z" level=warning msg="container event discarded" container=37f7af7420523bec3640e79187ba7400d0a165ad92fecfaf635a87e2b15dcaa6 type=CONTAINER_STARTED_EVENT Apr 17 02:58:24.571241 containerd[1591]: time="2026-04-17T02:58:24.567763191Z" level=warning msg="container event discarded" container=b4ed30fafdec2717e6d35b5e65ac3cc512d8bea5f2a5fdc4ffec3a0a897f0120 type=CONTAINER_CREATED_EVENT Apr 17 02:58:24.575563 containerd[1591]: time="2026-04-17T02:58:24.571802040Z" level=warning msg="container event discarded" container=b4ed30fafdec2717e6d35b5e65ac3cc512d8bea5f2a5fdc4ffec3a0a897f0120 type=CONTAINER_STARTED_EVENT Apr 17 02:58:25.351642 containerd[1591]: time="2026-04-17T02:58:25.351099219Z" level=warning msg="container event discarded" container=3c0547de3e61125c582eff6753d2335168af0f771513a26576aaa2a6e8737171 type=CONTAINER_CREATED_EVENT Apr 17 02:58:25.351642 containerd[1591]: time="2026-04-17T02:58:25.351618213Z" level=warning msg="container event discarded" container=3c0547de3e61125c582eff6753d2335168af0f771513a26576aaa2a6e8737171 type=CONTAINER_STARTED_EVENT Apr 17 02:58:25.611002 containerd[1591]: time="2026-04-17T02:58:25.608521834Z" level=warning msg="container event discarded" container=0805987caa92c3406b97c07c809aaa395f59fb3fa49a670f129b63c9bfbb4e31 type=CONTAINER_CREATED_EVENT Apr 17 02:58:26.169876 containerd[1591]: time="2026-04-17T02:58:26.167620569Z" level=warning msg="container event discarded" container=0805987caa92c3406b97c07c809aaa395f59fb3fa49a670f129b63c9bfbb4e31 type=CONTAINER_STARTED_EVENT Apr 17 02:58:26.766145 systemd[1]: Started sshd@10-10.0.0.21:22-10.0.0.1:54514.service - OpenSSH per-connection server daemon (10.0.0.1:54514). 
Apr 17 02:58:27.017636 sshd[6659]: Accepted publickey for core from 10.0.0.1 port 54514 ssh2: RSA SHA256:NqGHB9rSHtMUQkJqVjWjGfsIhZEnp/FTWIhF91T30gc Apr 17 02:58:27.024158 sshd-session[6659]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 17 02:58:27.051498 systemd-logind[1579]: New session 11 of user core. Apr 17 02:58:27.065590 systemd[1]: Started session-11.scope - Session 11 of User core. Apr 17 02:58:27.273375 containerd[1591]: time="2026-04-17T02:58:27.272801323Z" level=warning msg="container event discarded" container=91013b0a19cb7668c0b5d594d112fcfb795d6ac4f3e0af0af0720b1557c09137 type=CONTAINER_CREATED_EVENT Apr 17 02:58:28.447643 sshd[6662]: Connection closed by 10.0.0.1 port 54514 Apr 17 02:58:28.451176 sshd-session[6659]: pam_unix(sshd:session): session closed for user core Apr 17 02:58:28.468195 systemd[1]: sshd@10-10.0.0.21:22-10.0.0.1:54514.service: Deactivated successfully. Apr 17 02:58:28.481072 systemd[1]: session-11.scope: Deactivated successfully. Apr 17 02:58:28.492739 systemd-logind[1579]: Session 11 logged out. Waiting for processes to exit. Apr 17 02:58:28.552946 systemd-logind[1579]: Removed session 11. 
Apr 17 02:58:28.656438 containerd[1591]: time="2026-04-17T02:58:28.656004481Z" level=warning msg="container event discarded" container=d3f0847a3fef67f1c14e8a1fb64d4b5e86a104251336072a48f33bc5eca52882 type=CONTAINER_CREATED_EVENT Apr 17 02:58:28.672886 containerd[1591]: time="2026-04-17T02:58:28.671631057Z" level=warning msg="container event discarded" container=d3f0847a3fef67f1c14e8a1fb64d4b5e86a104251336072a48f33bc5eca52882 type=CONTAINER_STARTED_EVENT Apr 17 02:58:28.748061 containerd[1591]: time="2026-04-17T02:58:28.747326083Z" level=warning msg="container event discarded" container=91013b0a19cb7668c0b5d594d112fcfb795d6ac4f3e0af0af0720b1557c09137 type=CONTAINER_STARTED_EVENT Apr 17 02:58:30.974425 containerd[1591]: time="2026-04-17T02:58:30.974037647Z" level=warning msg="container event discarded" container=a5616d5d6bcfaf66dce5cf66a390851af4dfcb576f06b727bccb967701f13386 type=CONTAINER_CREATED_EVENT Apr 17 02:58:30.974425 containerd[1591]: time="2026-04-17T02:58:30.974402724Z" level=warning msg="container event discarded" container=a5616d5d6bcfaf66dce5cf66a390851af4dfcb576f06b727bccb967701f13386 type=CONTAINER_STARTED_EVENT Apr 17 02:58:32.463266 containerd[1591]: time="2026-04-17T02:58:32.462844189Z" level=warning msg="container event discarded" container=312e90b1be4c6891c8189148ab7046e3bfb63cc22596af302b0b6e366ffe608a type=CONTAINER_CREATED_EVENT Apr 17 02:58:32.466259 containerd[1591]: time="2026-04-17T02:58:32.463683657Z" level=warning msg="container event discarded" container=312e90b1be4c6891c8189148ab7046e3bfb63cc22596af302b0b6e366ffe608a type=CONTAINER_STARTED_EVENT Apr 17 02:58:32.862975 containerd[1591]: time="2026-04-17T02:58:32.862623376Z" level=warning msg="container event discarded" container=127766f7e73f9a7ee6b18163f5ffe8054d411a2081ffec76be3d9ed67d04241f type=CONTAINER_CREATED_EVENT Apr 17 02:58:33.499641 systemd[1]: Started sshd@11-10.0.0.21:22-10.0.0.1:42522.service - OpenSSH per-connection server daemon (10.0.0.1:42522). 
Apr 17 02:58:33.777492 sshd[6720]: Accepted publickey for core from 10.0.0.1 port 42522 ssh2: RSA SHA256:NqGHB9rSHtMUQkJqVjWjGfsIhZEnp/FTWIhF91T30gc Apr 17 02:58:33.786942 sshd-session[6720]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 17 02:58:33.828889 systemd-logind[1579]: New session 12 of user core. Apr 17 02:58:33.841427 systemd[1]: Started session-12.scope - Session 12 of User core. Apr 17 02:58:34.387641 containerd[1591]: time="2026-04-17T02:58:34.387086707Z" level=warning msg="container event discarded" container=127766f7e73f9a7ee6b18163f5ffe8054d411a2081ffec76be3d9ed67d04241f type=CONTAINER_STARTED_EVENT Apr 17 02:58:34.400099 containerd[1591]: time="2026-04-17T02:58:34.399902960Z" level=warning msg="container event discarded" container=7643edd6669a722af10649e1f4987292a64bbdf753cc7ffccb57bba5660eaa35 type=CONTAINER_CREATED_EVENT Apr 17 02:58:34.400099 containerd[1591]: time="2026-04-17T02:58:34.400093057Z" level=warning msg="container event discarded" container=7643edd6669a722af10649e1f4987292a64bbdf753cc7ffccb57bba5660eaa35 type=CONTAINER_STARTED_EVENT Apr 17 02:58:34.464690 sshd[6723]: Connection closed by 10.0.0.1 port 42522 Apr 17 02:58:34.466809 sshd-session[6720]: pam_unix(sshd:session): session closed for user core Apr 17 02:58:34.482305 systemd[1]: sshd@11-10.0.0.21:22-10.0.0.1:42522.service: Deactivated successfully. Apr 17 02:58:34.490070 systemd[1]: session-12.scope: Deactivated successfully. Apr 17 02:58:34.498338 systemd-logind[1579]: Session 12 logged out. Waiting for processes to exit. Apr 17 02:58:34.552162 systemd-logind[1579]: Removed session 12. 
Apr 17 02:58:36.260220 containerd[1591]: time="2026-04-17T02:58:36.259970766Z" level=warning msg="container event discarded" container=df5c6874e4d07ca64f01cb2d603b9b48feaa741a6a33c92c884389397a2159e1 type=CONTAINER_CREATED_EVENT Apr 17 02:58:36.260220 containerd[1591]: time="2026-04-17T02:58:36.260202552Z" level=warning msg="container event discarded" container=df5c6874e4d07ca64f01cb2d603b9b48feaa741a6a33c92c884389397a2159e1 type=CONTAINER_STARTED_EVENT Apr 17 02:58:36.716388 containerd[1591]: time="2026-04-17T02:58:36.711291799Z" level=warning msg="container event discarded" container=63758c5993624e434898bf078e42ae9179353479b9a8eb53cd8fd43a331cc3b9 type=CONTAINER_CREATED_EVENT Apr 17 02:58:37.166666 kubelet[2815]: E0417 02:58:37.165633 2815 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Apr 17 02:58:37.269898 containerd[1591]: time="2026-04-17T02:58:37.269765824Z" level=warning msg="container event discarded" container=63758c5993624e434898bf078e42ae9179353479b9a8eb53cd8fd43a331cc3b9 type=CONTAINER_STARTED_EVENT Apr 17 02:58:39.501752 systemd[1]: Started sshd@12-10.0.0.21:22-10.0.0.1:56072.service - OpenSSH per-connection server daemon (10.0.0.1:56072). Apr 17 02:58:39.686560 sshd[6739]: Accepted publickey for core from 10.0.0.1 port 56072 ssh2: RSA SHA256:NqGHB9rSHtMUQkJqVjWjGfsIhZEnp/FTWIhF91T30gc Apr 17 02:58:39.689126 sshd-session[6739]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 17 02:58:39.719641 systemd-logind[1579]: New session 13 of user core. Apr 17 02:58:39.732662 systemd[1]: Started session-13.scope - Session 13 of User core. 
Apr 17 02:58:40.683110 sshd[6742]: Connection closed by 10.0.0.1 port 56072 Apr 17 02:58:40.683680 sshd-session[6739]: pam_unix(sshd:session): session closed for user core Apr 17 02:58:40.725057 systemd[1]: sshd@12-10.0.0.21:22-10.0.0.1:56072.service: Deactivated successfully. Apr 17 02:58:40.738112 systemd[1]: session-13.scope: Deactivated successfully. Apr 17 02:58:40.748771 systemd-logind[1579]: Session 13 logged out. Waiting for processes to exit. Apr 17 02:58:40.755980 systemd-logind[1579]: Removed session 13. Apr 17 02:58:42.239480 containerd[1591]: time="2026-04-17T02:58:42.238823450Z" level=warning msg="container event discarded" container=1ac3f8e05ce19396356f3e05035ba22fd7eaa5bb47a498e2720dd45da6561f82 type=CONTAINER_CREATED_EVENT Apr 17 02:58:43.173090 containerd[1591]: time="2026-04-17T02:58:43.169816098Z" level=warning msg="container event discarded" container=1ac3f8e05ce19396356f3e05035ba22fd7eaa5bb47a498e2720dd45da6561f82 type=CONTAINER_STARTED_EVENT Apr 17 02:58:45.792532 systemd[1]: Started sshd@13-10.0.0.21:22-10.0.0.1:56076.service - OpenSSH per-connection server daemon (10.0.0.1:56076). Apr 17 02:58:46.073051 sshd[6780]: Accepted publickey for core from 10.0.0.1 port 56076 ssh2: RSA SHA256:NqGHB9rSHtMUQkJqVjWjGfsIhZEnp/FTWIhF91T30gc Apr 17 02:58:46.078336 sshd-session[6780]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 17 02:58:46.120986 systemd-logind[1579]: New session 14 of user core. Apr 17 02:58:46.127054 systemd[1]: Started session-14.scope - Session 14 of User core. Apr 17 02:58:46.880611 sshd[6807]: Connection closed by 10.0.0.1 port 56076 Apr 17 02:58:46.881927 sshd-session[6780]: pam_unix(sshd:session): session closed for user core Apr 17 02:58:46.962801 systemd[1]: sshd@13-10.0.0.21:22-10.0.0.1:56076.service: Deactivated successfully. Apr 17 02:58:46.975430 systemd[1]: session-14.scope: Deactivated successfully. Apr 17 02:58:46.997320 systemd-logind[1579]: Session 14 logged out. 
Waiting for processes to exit. Apr 17 02:58:47.011683 systemd-logind[1579]: Removed session 14. Apr 17 02:58:50.088998 containerd[1591]: time="2026-04-17T02:58:50.088236824Z" level=warning msg="container event discarded" container=a13add0a9e28934abdc4369d868cb4871e7a13c0d5324a4c52f8fcd02b153718 type=CONTAINER_CREATED_EVENT Apr 17 02:58:51.149578 kubelet[2815]: E0417 02:58:51.149497 2815 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Apr 17 02:58:51.667894 containerd[1591]: time="2026-04-17T02:58:51.665199622Z" level=warning msg="container event discarded" container=a13add0a9e28934abdc4369d868cb4871e7a13c0d5324a4c52f8fcd02b153718 type=CONTAINER_STARTED_EVENT Apr 17 02:58:51.962788 systemd[1]: Started sshd@14-10.0.0.21:22-10.0.0.1:32794.service - OpenSSH per-connection server daemon (10.0.0.1:32794). Apr 17 02:58:52.057760 sshd[6823]: Accepted publickey for core from 10.0.0.1 port 32794 ssh2: RSA SHA256:NqGHB9rSHtMUQkJqVjWjGfsIhZEnp/FTWIhF91T30gc Apr 17 02:58:52.059481 sshd-session[6823]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 17 02:58:52.076985 systemd-logind[1579]: New session 15 of user core. Apr 17 02:58:52.086447 systemd[1]: Started session-15.scope - Session 15 of User core. Apr 17 02:58:52.286787 sshd[6826]: Connection closed by 10.0.0.1 port 32794 Apr 17 02:58:52.287563 sshd-session[6823]: pam_unix(sshd:session): session closed for user core Apr 17 02:58:52.294179 systemd[1]: sshd@14-10.0.0.21:22-10.0.0.1:32794.service: Deactivated successfully. Apr 17 02:58:52.297145 systemd[1]: session-15.scope: Deactivated successfully. Apr 17 02:58:52.302243 systemd-logind[1579]: Session 15 logged out. Waiting for processes to exit. Apr 17 02:58:52.315429 systemd-logind[1579]: Removed session 15. 
Apr 17 02:58:53.149931 kubelet[2815]: E0417 02:58:53.149877 2815 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Apr 17 02:58:57.343927 systemd[1]: Started sshd@15-10.0.0.21:22-10.0.0.1:32820.service - OpenSSH per-connection server daemon (10.0.0.1:32820). Apr 17 02:58:57.521586 sshd[6840]: Accepted publickey for core from 10.0.0.1 port 32820 ssh2: RSA SHA256:NqGHB9rSHtMUQkJqVjWjGfsIhZEnp/FTWIhF91T30gc Apr 17 02:58:57.532331 sshd-session[6840]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 17 02:58:57.546699 systemd-logind[1579]: New session 16 of user core. Apr 17 02:58:57.566278 systemd[1]: Started session-16.scope - Session 16 of User core. Apr 17 02:58:58.249984 sshd[6843]: Connection closed by 10.0.0.1 port 32820 Apr 17 02:58:58.251562 sshd-session[6840]: pam_unix(sshd:session): session closed for user core Apr 17 02:58:58.275684 systemd[1]: sshd@15-10.0.0.21:22-10.0.0.1:32820.service: Deactivated successfully. Apr 17 02:58:58.293960 systemd[1]: session-16.scope: Deactivated successfully. Apr 17 02:58:58.348020 systemd-logind[1579]: Session 16 logged out. Waiting for processes to exit. Apr 17 02:58:58.387923 systemd-logind[1579]: Removed session 16. Apr 17 02:58:59.173117 kubelet[2815]: E0417 02:58:59.173079 2815 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Apr 17 02:59:02.153368 kubelet[2815]: E0417 02:59:02.153260 2815 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Apr 17 02:59:03.286308 systemd[1]: Started sshd@16-10.0.0.21:22-10.0.0.1:39580.service - OpenSSH per-connection server daemon (10.0.0.1:39580). 
Apr 17 02:59:03.482625 sshd[6882]: Accepted publickey for core from 10.0.0.1 port 39580 ssh2: RSA SHA256:NqGHB9rSHtMUQkJqVjWjGfsIhZEnp/FTWIhF91T30gc Apr 17 02:59:03.494551 sshd-session[6882]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 17 02:59:03.542992 systemd-logind[1579]: New session 17 of user core. Apr 17 02:59:03.554069 systemd[1]: Started session-17.scope - Session 17 of User core. Apr 17 02:59:04.577400 sshd[6885]: Connection closed by 10.0.0.1 port 39580 Apr 17 02:59:04.579550 sshd-session[6882]: pam_unix(sshd:session): session closed for user core Apr 17 02:59:04.694257 systemd[1]: sshd@16-10.0.0.21:22-10.0.0.1:39580.service: Deactivated successfully. Apr 17 02:59:04.717593 systemd[1]: session-17.scope: Deactivated successfully. Apr 17 02:59:04.719769 systemd-logind[1579]: Session 17 logged out. Waiting for processes to exit. Apr 17 02:59:04.729287 systemd-logind[1579]: Removed session 17. Apr 17 02:59:06.167402 kubelet[2815]: E0417 02:59:06.167116 2815 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Apr 17 02:59:07.905820 containerd[1591]: time="2026-04-17T02:59:07.905188438Z" level=warning msg="container event discarded" container=db4b93ada77d3f1ac23aba59e9bcab205acc0855a4fd106372a32d98ddd362e4 type=CONTAINER_CREATED_EVENT Apr 17 02:59:08.852566 containerd[1591]: time="2026-04-17T02:59:08.852154862Z" level=warning msg="container event discarded" container=db4b93ada77d3f1ac23aba59e9bcab205acc0855a4fd106372a32d98ddd362e4 type=CONTAINER_STARTED_EVENT Apr 17 02:59:09.601059 systemd[1]: Started sshd@17-10.0.0.21:22-10.0.0.1:34206.service - OpenSSH per-connection server daemon (10.0.0.1:34206). 
Apr 17 02:59:09.674422 sshd[6949]: Accepted publickey for core from 10.0.0.1 port 34206 ssh2: RSA SHA256:NqGHB9rSHtMUQkJqVjWjGfsIhZEnp/FTWIhF91T30gc Apr 17 02:59:09.675522 sshd-session[6949]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 17 02:59:09.680023 systemd-logind[1579]: New session 18 of user core. Apr 17 02:59:09.691076 systemd[1]: Started session-18.scope - Session 18 of User core. Apr 17 02:59:09.842842 sshd[6952]: Connection closed by 10.0.0.1 port 34206 Apr 17 02:59:09.843138 sshd-session[6949]: pam_unix(sshd:session): session closed for user core Apr 17 02:59:09.846600 systemd[1]: sshd@17-10.0.0.21:22-10.0.0.1:34206.service: Deactivated successfully. Apr 17 02:59:09.848177 systemd[1]: session-18.scope: Deactivated successfully. Apr 17 02:59:09.848961 systemd-logind[1579]: Session 18 logged out. Waiting for processes to exit. Apr 17 02:59:09.850007 systemd-logind[1579]: Removed session 18. Apr 17 02:59:14.865151 systemd[1]: Started sshd@18-10.0.0.21:22-10.0.0.1:34264.service - OpenSSH per-connection server daemon (10.0.0.1:34264). Apr 17 02:59:14.959962 sshd[7011]: Accepted publickey for core from 10.0.0.1 port 34264 ssh2: RSA SHA256:NqGHB9rSHtMUQkJqVjWjGfsIhZEnp/FTWIhF91T30gc Apr 17 02:59:14.961607 sshd-session[7011]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 17 02:59:14.967129 systemd-logind[1579]: New session 19 of user core. Apr 17 02:59:14.975888 systemd[1]: Started session-19.scope - Session 19 of User core. Apr 17 02:59:15.137374 sshd[7014]: Connection closed by 10.0.0.1 port 34264 Apr 17 02:59:15.138218 sshd-session[7011]: pam_unix(sshd:session): session closed for user core Apr 17 02:59:15.146247 systemd[1]: sshd@18-10.0.0.21:22-10.0.0.1:34264.service: Deactivated successfully. Apr 17 02:59:15.150759 systemd[1]: session-19.scope: Deactivated successfully. Apr 17 02:59:15.155324 systemd-logind[1579]: Session 19 logged out. Waiting for processes to exit. 
Apr 17 02:59:15.156371 systemd-logind[1579]: Removed session 19. Apr 17 02:59:20.165972 systemd[1]: Started sshd@19-10.0.0.21:22-10.0.0.1:59150.service - OpenSSH per-connection server daemon (10.0.0.1:59150). Apr 17 02:59:20.273652 sshd[7070]: Accepted publickey for core from 10.0.0.1 port 59150 ssh2: RSA SHA256:NqGHB9rSHtMUQkJqVjWjGfsIhZEnp/FTWIhF91T30gc Apr 17 02:59:20.276642 sshd-session[7070]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 17 02:59:20.373959 systemd-logind[1579]: New session 20 of user core. Apr 17 02:59:20.386129 systemd[1]: Started session-20.scope - Session 20 of User core. Apr 17 02:59:20.619593 sshd[7073]: Connection closed by 10.0.0.1 port 59150 Apr 17 02:59:20.620523 sshd-session[7070]: pam_unix(sshd:session): session closed for user core Apr 17 02:59:20.630964 systemd[1]: sshd@19-10.0.0.21:22-10.0.0.1:59150.service: Deactivated successfully. Apr 17 02:59:20.636555 systemd[1]: session-20.scope: Deactivated successfully. Apr 17 02:59:20.637335 systemd-logind[1579]: Session 20 logged out. Waiting for processes to exit. Apr 17 02:59:20.642400 systemd[1]: Started sshd@20-10.0.0.21:22-10.0.0.1:59158.service - OpenSSH per-connection server daemon (10.0.0.1:59158). Apr 17 02:59:20.643443 systemd-logind[1579]: Removed session 20. Apr 17 02:59:20.790988 sshd[7087]: Accepted publickey for core from 10.0.0.1 port 59158 ssh2: RSA SHA256:NqGHB9rSHtMUQkJqVjWjGfsIhZEnp/FTWIhF91T30gc Apr 17 02:59:20.795961 sshd-session[7087]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 17 02:59:20.812780 systemd-logind[1579]: New session 21 of user core. Apr 17 02:59:20.827958 systemd[1]: Started session-21.scope - Session 21 of User core. 
Apr 17 02:59:21.197590 sshd[7090]: Connection closed by 10.0.0.1 port 59158 Apr 17 02:59:21.200255 sshd-session[7087]: pam_unix(sshd:session): session closed for user core Apr 17 02:59:21.219049 systemd[1]: Started sshd@21-10.0.0.21:22-10.0.0.1:59168.service - OpenSSH per-connection server daemon (10.0.0.1:59168). Apr 17 02:59:21.220482 systemd[1]: sshd@20-10.0.0.21:22-10.0.0.1:59158.service: Deactivated successfully. Apr 17 02:59:21.224149 systemd[1]: session-21.scope: Deactivated successfully. Apr 17 02:59:21.226256 systemd-logind[1579]: Session 21 logged out. Waiting for processes to exit. Apr 17 02:59:21.232253 systemd-logind[1579]: Removed session 21. Apr 17 02:59:21.297075 sshd[7098]: Accepted publickey for core from 10.0.0.1 port 59168 ssh2: RSA SHA256:NqGHB9rSHtMUQkJqVjWjGfsIhZEnp/FTWIhF91T30gc Apr 17 02:59:21.300469 sshd-session[7098]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 17 02:59:21.305994 systemd-logind[1579]: New session 22 of user core. Apr 17 02:59:21.313416 systemd[1]: Started session-22.scope - Session 22 of User core. Apr 17 02:59:21.432112 sshd[7106]: Connection closed by 10.0.0.1 port 59168 Apr 17 02:59:21.432547 sshd-session[7098]: pam_unix(sshd:session): session closed for user core Apr 17 02:59:21.436166 systemd[1]: sshd@21-10.0.0.21:22-10.0.0.1:59168.service: Deactivated successfully. Apr 17 02:59:21.438203 systemd[1]: session-22.scope: Deactivated successfully. Apr 17 02:59:21.438988 systemd-logind[1579]: Session 22 logged out. Waiting for processes to exit. Apr 17 02:59:21.439945 systemd-logind[1579]: Removed session 22. 
Apr 17 02:59:23.531445 containerd[1591]: time="2026-04-17T02:59:23.530914642Z" level=warning msg="container event discarded" container=c654cc1f37cbe3d38b15e00ba2f8fb6808cef0bb106cfe98f3fa859742ad5ca2 type=CONTAINER_CREATED_EVENT Apr 17 02:59:24.534308 containerd[1591]: time="2026-04-17T02:59:24.534031253Z" level=warning msg="container event discarded" container=c654cc1f37cbe3d38b15e00ba2f8fb6808cef0bb106cfe98f3fa859742ad5ca2 type=CONTAINER_STARTED_EVENT Apr 17 02:59:24.769374 containerd[1591]: time="2026-04-17T02:59:24.768944497Z" level=warning msg="container event discarded" container=23aed4307e3ea6b3718b99b2c56ecc67b1d4c5c822c2ae30a66ecb68d49b61ec type=CONTAINER_CREATED_EVENT Apr 17 02:59:26.324444 update_engine[1580]: I20260417 02:59:26.323534 1580 prefs.cc:52] certificate-report-to-send-update not present in /var/lib/update_engine/prefs Apr 17 02:59:26.324444 update_engine[1580]: I20260417 02:59:26.324265 1580 prefs.cc:52] certificate-report-to-send-download not present in /var/lib/update_engine/prefs Apr 17 02:59:26.327140 update_engine[1580]: I20260417 02:59:26.327068 1580 prefs.cc:52] aleph-version not present in /var/lib/update_engine/prefs Apr 17 02:59:26.327497 update_engine[1580]: I20260417 02:59:26.327467 1580 omaha_request_params.cc:62] Current group set to stable Apr 17 02:59:26.331747 update_engine[1580]: I20260417 02:59:26.329373 1580 update_attempter.cc:499] Already updated boot flags. Skipping. Apr 17 02:59:26.331747 update_engine[1580]: I20260417 02:59:26.329691 1580 update_attempter.cc:643] Scheduling an action processor start. 
Apr 17 02:59:26.331747 update_engine[1580]: I20260417 02:59:26.330165 1580 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction Apr 17 02:59:26.331747 update_engine[1580]: I20260417 02:59:26.330567 1580 prefs.cc:52] previous-version not present in /var/lib/update_engine/prefs Apr 17 02:59:26.331747 update_engine[1580]: I20260417 02:59:26.330660 1580 omaha_request_action.cc:271] Posting an Omaha request to disabled Apr 17 02:59:26.331747 update_engine[1580]: I20260417 02:59:26.330667 1580 omaha_request_action.cc:272] Request: Apr 17 02:59:26.331747 update_engine[1580]: Apr 17 02:59:26.331747 update_engine[1580]: Apr 17 02:59:26.331747 update_engine[1580]: Apr 17 02:59:26.331747 update_engine[1580]: Apr 17 02:59:26.331747 update_engine[1580]: Apr 17 02:59:26.331747 update_engine[1580]: Apr 17 02:59:26.331747 update_engine[1580]: Apr 17 02:59:26.331747 update_engine[1580]: Apr 17 02:59:26.331747 update_engine[1580]: I20260417 02:59:26.330672 1580 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Apr 17 02:59:26.341162 update_engine[1580]: I20260417 02:59:26.341094 1580 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Apr 17 02:59:26.346863 update_engine[1580]: I20260417 02:59:26.346696 1580 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Apr 17 02:59:26.355573 locksmithd[1626]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_CHECKING_FOR_UPDATE" NewVersion=0.0.0 NewSize=0 Apr 17 02:59:26.356494 update_engine[1580]: E20260417 02:59:26.355850 1580 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Apr 17 02:59:26.356494 update_engine[1580]: I20260417 02:59:26.355983 1580 libcurl_http_fetcher.cc:283] No HTTP response, retry 1 Apr 17 02:59:26.480132 systemd[1]: Started sshd@22-10.0.0.21:22-10.0.0.1:59184.service - OpenSSH per-connection server daemon (10.0.0.1:59184). 
Apr 17 02:59:26.627810 containerd[1591]: time="2026-04-17T02:59:26.627264865Z" level=warning msg="container event discarded" container=23aed4307e3ea6b3718b99b2c56ecc67b1d4c5c822c2ae30a66ecb68d49b61ec type=CONTAINER_STARTED_EVENT Apr 17 02:59:26.693153 sshd[7120]: Accepted publickey for core from 10.0.0.1 port 59184 ssh2: RSA SHA256:NqGHB9rSHtMUQkJqVjWjGfsIhZEnp/FTWIhF91T30gc Apr 17 02:59:26.699007 sshd-session[7120]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 17 02:59:26.741803 systemd-logind[1579]: New session 23 of user core. Apr 17 02:59:26.758474 systemd[1]: Started session-23.scope - Session 23 of User core. Apr 17 02:59:27.120995 sshd[7123]: Connection closed by 10.0.0.1 port 59184 Apr 17 02:59:27.123478 sshd-session[7120]: pam_unix(sshd:session): session closed for user core Apr 17 02:59:27.162862 systemd[1]: sshd@22-10.0.0.21:22-10.0.0.1:59184.service: Deactivated successfully. Apr 17 02:59:27.174601 systemd[1]: session-23.scope: Deactivated successfully. Apr 17 02:59:27.176018 systemd-logind[1579]: Session 23 logged out. Waiting for processes to exit. Apr 17 02:59:27.179408 systemd-logind[1579]: Removed session 23. Apr 17 02:59:30.540417 containerd[1591]: time="2026-04-17T02:59:30.539830621Z" level=warning msg="container event discarded" container=9ab0b258ca76415d10f27744bfe4936fd254cfc8106c2b998897aea40ef665f9 type=CONTAINER_CREATED_EVENT Apr 17 02:59:32.141516 containerd[1591]: time="2026-04-17T02:59:32.141224096Z" level=warning msg="container event discarded" container=9ab0b258ca76415d10f27744bfe4936fd254cfc8106c2b998897aea40ef665f9 type=CONTAINER_STARTED_EVENT Apr 17 02:59:32.145936 systemd[1]: Started sshd@23-10.0.0.21:22-10.0.0.1:58312.service - OpenSSH per-connection server daemon (10.0.0.1:58312). 
Apr 17 02:59:32.240209 sshd[7163]: Accepted publickey for core from 10.0.0.1 port 58312 ssh2: RSA SHA256:NqGHB9rSHtMUQkJqVjWjGfsIhZEnp/FTWIhF91T30gc Apr 17 02:59:32.243015 sshd-session[7163]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 17 02:59:32.249028 systemd-logind[1579]: New session 24 of user core. Apr 17 02:59:32.256491 systemd[1]: Started session-24.scope - Session 24 of User core. Apr 17 02:59:32.523445 sshd[7166]: Connection closed by 10.0.0.1 port 58312 Apr 17 02:59:32.525042 sshd-session[7163]: pam_unix(sshd:session): session closed for user core Apr 17 02:59:32.534984 systemd[1]: sshd@23-10.0.0.21:22-10.0.0.1:58312.service: Deactivated successfully. Apr 17 02:59:32.539332 systemd[1]: session-24.scope: Deactivated successfully. Apr 17 02:59:32.541954 systemd-logind[1579]: Session 24 logged out. Waiting for processes to exit. Apr 17 02:59:32.544084 systemd-logind[1579]: Removed session 24. Apr 17 02:59:36.262777 update_engine[1580]: I20260417 02:59:36.261396 1580 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Apr 17 02:59:36.263951 update_engine[1580]: I20260417 02:59:36.262683 1580 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Apr 17 02:59:36.270593 update_engine[1580]: I20260417 02:59:36.268626 1580 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Apr 17 02:59:36.272734 update_engine[1580]: E20260417 02:59:36.272679 1580 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Apr 17 02:59:36.272848 update_engine[1580]: I20260417 02:59:36.272825 1580 libcurl_http_fetcher.cc:283] No HTTP response, retry 2 Apr 17 02:59:37.637961 systemd[1]: Started sshd@24-10.0.0.21:22-10.0.0.1:58316.service - OpenSSH per-connection server daemon (10.0.0.1:58316). 
Apr 17 02:59:37.756365 sshd[7181]: Accepted publickey for core from 10.0.0.1 port 58316 ssh2: RSA SHA256:NqGHB9rSHtMUQkJqVjWjGfsIhZEnp/FTWIhF91T30gc Apr 17 02:59:37.760294 sshd-session[7181]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 17 02:59:37.801309 systemd-logind[1579]: New session 25 of user core. Apr 17 02:59:37.811288 systemd[1]: Started session-25.scope - Session 25 of User core. Apr 17 02:59:37.981523 sshd[7184]: Connection closed by 10.0.0.1 port 58316 Apr 17 02:59:37.984356 sshd-session[7181]: pam_unix(sshd:session): session closed for user core Apr 17 02:59:37.995937 systemd[1]: sshd@24-10.0.0.21:22-10.0.0.1:58316.service: Deactivated successfully. Apr 17 02:59:38.000089 systemd[1]: session-25.scope: Deactivated successfully. Apr 17 02:59:38.006606 systemd-logind[1579]: Session 25 logged out. Waiting for processes to exit. Apr 17 02:59:38.029037 systemd[1]: Started sshd@25-10.0.0.21:22-10.0.0.1:58328.service - OpenSSH per-connection server daemon (10.0.0.1:58328). Apr 17 02:59:38.030422 systemd-logind[1579]: Removed session 25. Apr 17 02:59:38.177224 sshd[7197]: Accepted publickey for core from 10.0.0.1 port 58328 ssh2: RSA SHA256:NqGHB9rSHtMUQkJqVjWjGfsIhZEnp/FTWIhF91T30gc Apr 17 02:59:38.180324 sshd-session[7197]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 17 02:59:38.197172 systemd-logind[1579]: New session 26 of user core. Apr 17 02:59:38.209433 systemd[1]: Started session-26.scope - Session 26 of User core. Apr 17 02:59:38.678681 sshd[7200]: Connection closed by 10.0.0.1 port 58328 Apr 17 02:59:38.679123 sshd-session[7197]: pam_unix(sshd:session): session closed for user core Apr 17 02:59:38.694826 systemd[1]: sshd@25-10.0.0.21:22-10.0.0.1:58328.service: Deactivated successfully. Apr 17 02:59:38.698588 systemd[1]: session-26.scope: Deactivated successfully. Apr 17 02:59:38.699904 systemd-logind[1579]: Session 26 logged out. Waiting for processes to exit. 
Apr 17 02:59:38.707052 systemd[1]: Started sshd@26-10.0.0.21:22-10.0.0.1:58336.service - OpenSSH per-connection server daemon (10.0.0.1:58336). Apr 17 02:59:38.712583 systemd-logind[1579]: Removed session 26. Apr 17 02:59:38.933746 sshd[7212]: Accepted publickey for core from 10.0.0.1 port 58336 ssh2: RSA SHA256:NqGHB9rSHtMUQkJqVjWjGfsIhZEnp/FTWIhF91T30gc Apr 17 02:59:38.935317 sshd-session[7212]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 17 02:59:38.953767 systemd-logind[1579]: New session 27 of user core. Apr 17 02:59:38.966989 systemd[1]: Started session-27.scope - Session 27 of User core. Apr 17 02:59:40.250745 sshd[7215]: Connection closed by 10.0.0.1 port 58336 Apr 17 02:59:40.254501 sshd-session[7212]: pam_unix(sshd:session): session closed for user core Apr 17 02:59:40.279270 systemd[1]: Started sshd@27-10.0.0.21:22-10.0.0.1:36866.service - OpenSSH per-connection server daemon (10.0.0.1:36866). Apr 17 02:59:40.344341 systemd[1]: sshd@26-10.0.0.21:22-10.0.0.1:58336.service: Deactivated successfully. Apr 17 02:59:40.354561 systemd[1]: session-27.scope: Deactivated successfully. Apr 17 02:59:40.354867 systemd[1]: session-27.scope: Consumed 1.070s CPU time, 43.1M memory peak. Apr 17 02:59:40.355662 systemd-logind[1579]: Session 27 logged out. Waiting for processes to exit. Apr 17 02:59:40.367583 systemd-logind[1579]: Removed session 27. Apr 17 02:59:40.480961 sshd[7237]: Accepted publickey for core from 10.0.0.1 port 36866 ssh2: RSA SHA256:NqGHB9rSHtMUQkJqVjWjGfsIhZEnp/FTWIhF91T30gc Apr 17 02:59:40.487597 sshd-session[7237]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 17 02:59:40.512480 systemd-logind[1579]: New session 28 of user core. Apr 17 02:59:40.525209 systemd[1]: Started session-28.scope - Session 28 of User core. 
Apr 17 02:59:41.243273 sshd[7244]: Connection closed by 10.0.0.1 port 36866 Apr 17 02:59:41.248075 sshd-session[7237]: pam_unix(sshd:session): session closed for user core Apr 17 02:59:41.262279 systemd[1]: sshd@27-10.0.0.21:22-10.0.0.1:36866.service: Deactivated successfully. Apr 17 02:59:41.267002 systemd[1]: session-28.scope: Deactivated successfully. Apr 17 02:59:41.271494 systemd-logind[1579]: Session 28 logged out. Waiting for processes to exit. Apr 17 02:59:41.277058 systemd[1]: Started sshd@28-10.0.0.21:22-10.0.0.1:36882.service - OpenSSH per-connection server daemon (10.0.0.1:36882). Apr 17 02:59:41.278575 systemd-logind[1579]: Removed session 28. Apr 17 02:59:41.381772 sshd[7258]: Accepted publickey for core from 10.0.0.1 port 36882 ssh2: RSA SHA256:NqGHB9rSHtMUQkJqVjWjGfsIhZEnp/FTWIhF91T30gc Apr 17 02:59:41.388511 sshd-session[7258]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 17 02:59:41.425934 systemd-logind[1579]: New session 29 of user core. Apr 17 02:59:41.436111 systemd[1]: Started session-29.scope - Session 29 of User core. Apr 17 02:59:41.880343 sshd[7261]: Connection closed by 10.0.0.1 port 36882 Apr 17 02:59:41.880981 sshd-session[7258]: pam_unix(sshd:session): session closed for user core Apr 17 02:59:41.894297 systemd[1]: sshd@28-10.0.0.21:22-10.0.0.1:36882.service: Deactivated successfully. Apr 17 02:59:41.897432 systemd[1]: session-29.scope: Deactivated successfully. Apr 17 02:59:41.898074 systemd-logind[1579]: Session 29 logged out. Waiting for processes to exit. Apr 17 02:59:41.901690 systemd-logind[1579]: Removed session 29. 
Apr 17 02:59:46.260666 update_engine[1580]: I20260417 02:59:46.260288 1580 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Apr 17 02:59:46.263370 update_engine[1580]: I20260417 02:59:46.261276 1580 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Apr 17 02:59:46.263998 update_engine[1580]: I20260417 02:59:46.263916 1580 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Apr 17 02:59:46.275019 update_engine[1580]: E20260417 02:59:46.274682 1580 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Apr 17 02:59:46.275019 update_engine[1580]: I20260417 02:59:46.274951 1580 libcurl_http_fetcher.cc:283] No HTTP response, retry 3 Apr 17 02:59:46.914403 systemd[1]: Started sshd@29-10.0.0.21:22-10.0.0.1:36890.service - OpenSSH per-connection server daemon (10.0.0.1:36890). Apr 17 02:59:47.138870 sshd[7321]: Accepted publickey for core from 10.0.0.1 port 36890 ssh2: RSA SHA256:NqGHB9rSHtMUQkJqVjWjGfsIhZEnp/FTWIhF91T30gc Apr 17 02:59:47.142640 sshd-session[7321]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 17 02:59:47.147300 kubelet[2815]: E0417 02:59:47.147268 2815 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Apr 17 02:59:47.149379 systemd-logind[1579]: New session 30 of user core. Apr 17 02:59:47.162951 systemd[1]: Started session-30.scope - Session 30 of User core. Apr 17 02:59:47.689943 sshd[7324]: Connection closed by 10.0.0.1 port 36890 Apr 17 02:59:47.691271 sshd-session[7321]: pam_unix(sshd:session): session closed for user core Apr 17 02:59:47.718394 systemd[1]: sshd@29-10.0.0.21:22-10.0.0.1:36890.service: Deactivated successfully. Apr 17 02:59:47.737818 systemd[1]: session-30.scope: Deactivated successfully. Apr 17 02:59:47.751768 systemd-logind[1579]: Session 30 logged out. Waiting for processes to exit. 
Apr 17 02:59:47.773255 systemd-logind[1579]: Removed session 30. Apr 17 02:59:48.199332 kubelet[2815]: E0417 02:59:48.199014 2815 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Apr 17 02:59:52.736261 systemd[1]: Started sshd@30-10.0.0.21:22-10.0.0.1:45084.service - OpenSSH per-connection server daemon (10.0.0.1:45084). Apr 17 02:59:52.978247 sshd[7338]: Accepted publickey for core from 10.0.0.1 port 45084 ssh2: RSA SHA256:NqGHB9rSHtMUQkJqVjWjGfsIhZEnp/FTWIhF91T30gc Apr 17 02:59:53.070890 sshd-session[7338]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 17 02:59:53.119157 systemd-logind[1579]: New session 31 of user core. Apr 17 02:59:53.122127 systemd[1]: Started session-31.scope - Session 31 of User core. Apr 17 02:59:53.565501 sshd[7341]: Connection closed by 10.0.0.1 port 45084 Apr 17 02:59:53.578418 sshd-session[7338]: pam_unix(sshd:session): session closed for user core Apr 17 02:59:53.668861 systemd[1]: sshd@30-10.0.0.21:22-10.0.0.1:45084.service: Deactivated successfully. Apr 17 02:59:53.678885 systemd[1]: session-31.scope: Deactivated successfully. Apr 17 02:59:53.696414 systemd-logind[1579]: Session 31 logged out. Waiting for processes to exit. Apr 17 02:59:53.723903 systemd-logind[1579]: Removed session 31. 
Apr 17 02:59:54.151166 kubelet[2815]: E0417 02:59:54.151083 2815 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Apr 17 02:59:56.260284 update_engine[1580]: I20260417 02:59:56.260129 1580 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Apr 17 02:59:56.261308 update_engine[1580]: I20260417 02:59:56.260767 1580 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Apr 17 02:59:56.262350 update_engine[1580]: I20260417 02:59:56.261476 1580 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Apr 17 02:59:56.271614 update_engine[1580]: E20260417 02:59:56.271437 1580 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Apr 17 02:59:56.277785 update_engine[1580]: I20260417 02:59:56.276027 1580 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded Apr 17 02:59:56.279902 update_engine[1580]: I20260417 02:59:56.279125 1580 omaha_request_action.cc:617] Omaha request response: Apr 17 02:59:56.285623 update_engine[1580]: E20260417 02:59:56.283690 1580 omaha_request_action.cc:636] Omaha request network transfer failed. Apr 17 02:59:56.291800 update_engine[1580]: I20260417 02:59:56.290252 1580 action_processor.cc:68] ActionProcessor::ActionComplete: OmahaRequestAction action failed. Aborting processing. Apr 17 02:59:56.291800 update_engine[1580]: I20260417 02:59:56.290452 1580 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Apr 17 02:59:56.291800 update_engine[1580]: I20260417 02:59:56.290460 1580 update_attempter.cc:306] Processing Done. Apr 17 02:59:56.291800 update_engine[1580]: E20260417 02:59:56.290540 1580 update_attempter.cc:619] Update failed. 
Apr 17 02:59:56.291800 update_engine[1580]: I20260417 02:59:56.290634 1580 utils.cc:600] Converting error code 2000 to kActionCodeOmahaErrorInHTTPResponse
Apr 17 02:59:56.291800 update_engine[1580]: I20260417 02:59:56.290642 1580 payload_state.cc:97] Updating payload state for error code: 37 (kActionCodeOmahaErrorInHTTPResponse)
Apr 17 02:59:56.291800 update_engine[1580]: I20260417 02:59:56.290688 1580 payload_state.cc:103] Ignoring failures until we get a valid Omaha response.
Apr 17 02:59:56.296336 update_engine[1580]: I20260417 02:59:56.296007 1580 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction
Apr 17 02:59:56.296776 update_engine[1580]: I20260417 02:59:56.296669 1580 omaha_request_action.cc:271] Posting an Omaha request to disabled
Apr 17 02:59:56.296776 update_engine[1580]: I20260417 02:59:56.296685 1580 omaha_request_action.cc:272] Request:
Apr 17 02:59:56.296776 update_engine[1580]:
Apr 17 02:59:56.296776 update_engine[1580]:
Apr 17 02:59:56.296776 update_engine[1580]:
Apr 17 02:59:56.296776 update_engine[1580]:
Apr 17 02:59:56.296776 update_engine[1580]:
Apr 17 02:59:56.296776 update_engine[1580]:
Apr 17 02:59:56.296776 update_engine[1580]: I20260417 02:59:56.296695 1580 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
Apr 17 02:59:56.300377 update_engine[1580]: I20260417 02:59:56.296901 1580 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
Apr 17 02:59:56.310728 locksmithd[1626]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_REPORTING_ERROR_EVENT" NewVersion=0.0.0 NewSize=0
Apr 17 02:59:56.314364 update_engine[1580]: I20260417 02:59:56.311363 1580 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
Apr 17 02:59:56.317407 update_engine[1580]: E20260417 02:59:56.316241 1580 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
Apr 17 02:59:56.320930 update_engine[1580]: I20260417 02:59:56.318885 1580 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded
Apr 17 02:59:56.320930 update_engine[1580]: I20260417 02:59:56.319675 1580 omaha_request_action.cc:617] Omaha request response:
Apr 17 02:59:56.320930 update_engine[1580]: I20260417 02:59:56.319768 1580 action_processor.cc:65] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction
Apr 17 02:59:56.320930 update_engine[1580]: I20260417 02:59:56.319772 1580 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction
Apr 17 02:59:56.320930 update_engine[1580]: I20260417 02:59:56.319777 1580 update_attempter.cc:306] Processing Done.
Apr 17 02:59:56.320930 update_engine[1580]: I20260417 02:59:56.319796 1580 update_attempter.cc:310] Error event sent.
Apr 17 02:59:56.320930 update_engine[1580]: I20260417 02:59:56.319892 1580 update_check_scheduler.cc:74] Next update check in 47m24s
Apr 17 02:59:56.324476 locksmithd[1626]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_IDLE" NewVersion=0.0.0 NewSize=0
Apr 17 02:59:58.587928 systemd[1]: Started sshd@31-10.0.0.21:22-10.0.0.1:45088.service - OpenSSH per-connection server daemon (10.0.0.1:45088).
Apr 17 02:59:58.711285 sshd[7354]: Accepted publickey for core from 10.0.0.1 port 45088 ssh2: RSA SHA256:NqGHB9rSHtMUQkJqVjWjGfsIhZEnp/FTWIhF91T30gc
Apr 17 02:59:58.715202 sshd-session[7354]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Apr 17 02:59:58.728112 systemd-logind[1579]: New session 32 of user core.
Apr 17 02:59:58.739779 systemd[1]: Started session-32.scope - Session 32 of User core.
Apr 17 02:59:59.251231 sshd[7357]: Connection closed by 10.0.0.1 port 45088
Apr 17 02:59:59.252624 sshd-session[7354]: pam_unix(sshd:session): session closed for user core
Apr 17 02:59:59.259519 systemd[1]: sshd@31-10.0.0.21:22-10.0.0.1:45088.service: Deactivated successfully.
Apr 17 02:59:59.266675 systemd[1]: session-32.scope: Deactivated successfully.
Apr 17 02:59:59.268175 systemd-logind[1579]: Session 32 logged out. Waiting for processes to exit.
Apr 17 02:59:59.269629 systemd-logind[1579]: Removed session 32.
Apr 17 03:00:04.284531 systemd[1]: Started sshd@32-10.0.0.21:22-10.0.0.1:45584.service - OpenSSH per-connection server daemon (10.0.0.1:45584).
Apr 17 03:00:04.363083 sshd[7412]: Accepted publickey for core from 10.0.0.1 port 45584 ssh2: RSA SHA256:NqGHB9rSHtMUQkJqVjWjGfsIhZEnp/FTWIhF91T30gc
Apr 17 03:00:04.368842 sshd-session[7412]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Apr 17 03:00:04.461488 systemd-logind[1579]: New session 33 of user core.
Apr 17 03:00:04.465509 systemd[1]: Started session-33.scope - Session 33 of User core.
Apr 17 03:00:04.757789 sshd[7417]: Connection closed by 10.0.0.1 port 45584
Apr 17 03:00:04.759125 sshd-session[7412]: pam_unix(sshd:session): session closed for user core
Apr 17 03:00:04.814346 systemd[1]: sshd@32-10.0.0.21:22-10.0.0.1:45584.service: Deactivated successfully.
Apr 17 03:00:04.827346 systemd[1]: session-33.scope: Deactivated successfully.
Apr 17 03:00:04.850532 systemd-logind[1579]: Session 33 logged out. Waiting for processes to exit.
Apr 17 03:00:04.853240 systemd-logind[1579]: Removed session 33.
Apr 17 03:00:09.724967 systemd[1]: Started sshd@33-10.0.0.21:22-10.0.0.1:58460.service - OpenSSH per-connection server daemon (10.0.0.1:58460).
Apr 17 03:00:09.817329 sshd[7498]: Accepted publickey for core from 10.0.0.1 port 58460 ssh2: RSA SHA256:NqGHB9rSHtMUQkJqVjWjGfsIhZEnp/FTWIhF91T30gc
Apr 17 03:00:09.820466 sshd-session[7498]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Apr 17 03:00:09.839941 systemd-logind[1579]: New session 34 of user core.
Apr 17 03:00:09.849683 systemd[1]: Started session-34.scope - Session 34 of User core.
Apr 17 03:00:10.065697 sshd[7501]: Connection closed by 10.0.0.1 port 58460
Apr 17 03:00:10.066691 sshd-session[7498]: pam_unix(sshd:session): session closed for user core
Apr 17 03:00:10.081022 systemd[1]: sshd@33-10.0.0.21:22-10.0.0.1:58460.service: Deactivated successfully.
Apr 17 03:00:10.086537 systemd[1]: session-34.scope: Deactivated successfully.
Apr 17 03:00:10.087856 systemd-logind[1579]: Session 34 logged out. Waiting for processes to exit.
Apr 17 03:00:10.088934 systemd-logind[1579]: Removed session 34.
Apr 17 03:00:12.180174 kubelet[2815]: E0417 03:00:12.180011 2815 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Apr 17 03:00:13.149460 kubelet[2815]: E0417 03:00:13.149393 2815 dns.go:154] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Apr 17 03:00:15.081503 systemd[1]: Started sshd@34-10.0.0.21:22-10.0.0.1:58468.service - OpenSSH per-connection server daemon (10.0.0.1:58468).
Apr 17 03:00:15.142155 sshd[7537]: Accepted publickey for core from 10.0.0.1 port 58468 ssh2: RSA SHA256:NqGHB9rSHtMUQkJqVjWjGfsIhZEnp/FTWIhF91T30gc
Apr 17 03:00:15.144357 sshd-session[7537]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Apr 17 03:00:15.155343 systemd-logind[1579]: New session 35 of user core.
Apr 17 03:00:15.164912 systemd[1]: Started session-35.scope - Session 35 of User core.
Apr 17 03:00:15.356073 sshd[7540]: Connection closed by 10.0.0.1 port 58468
Apr 17 03:00:15.357107 sshd-session[7537]: pam_unix(sshd:session): session closed for user core
Apr 17 03:00:15.370045 systemd[1]: sshd@34-10.0.0.21:22-10.0.0.1:58468.service: Deactivated successfully.
Apr 17 03:00:15.380411 systemd[1]: session-35.scope: Deactivated successfully.
Apr 17 03:00:15.381411 systemd-logind[1579]: Session 35 logged out. Waiting for processes to exit.
Apr 17 03:00:15.383935 systemd-logind[1579]: Removed session 35.