Apr 24 23:58:27.938141 kernel: Linux version 6.6.127-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 13.3.1_p20240614 p17) 13.3.1 20240614, GNU ld (Gentoo 2.42 p3) 2.42.0) #1 SMP PREEMPT_DYNAMIC Fri Apr 24 22:11:38 -00 2026
Apr 24 23:58:27.938178 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=ec2 modprobe.blacklist=xen_fbfront net.ifnames=0 nvme_core.io_timeout=4294967295 verity.usrhash=c8442747465ed99a522e07b8746f6a7817fb39c2025d7438698e3b90e9c0defb
Apr 24 23:58:27.938198 kernel: BIOS-provided physical RAM map:
Apr 24 23:58:27.938209 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009ffff] usable
Apr 24 23:58:27.938220 kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000786cdfff] usable
Apr 24 23:58:27.938231 kernel: BIOS-e820: [mem 0x00000000786ce000-0x00000000787cdfff] type 20
Apr 24 23:58:27.938894 kernel: BIOS-e820: [mem 0x00000000787ce000-0x000000007894dfff] reserved
Apr 24 23:58:27.938912 kernel: BIOS-e820: [mem 0x000000007894e000-0x000000007895dfff] ACPI data
Apr 24 23:58:27.938925 kernel: BIOS-e820: [mem 0x000000007895e000-0x00000000789ddfff] ACPI NVS
Apr 24 23:58:27.938943 kernel: BIOS-e820: [mem 0x00000000789de000-0x000000007c97bfff] usable
Apr 24 23:58:27.938956 kernel: BIOS-e820: [mem 0x000000007c97c000-0x000000007c9fffff] reserved
Apr 24 23:58:27.938966 kernel: NX (Execute Disable) protection: active
Apr 24 23:58:27.938977 kernel: APIC: Static calls initialized
Apr 24 23:58:27.938990 kernel: efi: EFI v2.7 by EDK II
Apr 24 23:58:27.939005 kernel: efi: SMBIOS=0x7886a000 ACPI=0x7895d000 ACPI 2.0=0x7895d014 MEMATTR=0x7701a018
Apr 24 23:58:27.939022 kernel: SMBIOS 2.7 present.
Apr 24 23:58:27.939035 kernel: DMI: Amazon EC2 t3.small/, BIOS 1.0 10/16/2017
Apr 24 23:58:27.939139 kernel: Hypervisor detected: KVM
Apr 24 23:58:27.939153 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Apr 24 23:58:27.939167 kernel: kvm-clock: using sched offset of 3811094043 cycles
Apr 24 23:58:27.939181 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Apr 24 23:58:27.939195 kernel: tsc: Detected 2499.996 MHz processor
Apr 24 23:58:27.939209 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Apr 24 23:58:27.939223 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Apr 24 23:58:27.939237 kernel: last_pfn = 0x7c97c max_arch_pfn = 0x400000000
Apr 24 23:58:27.939255 kernel: MTRR map: 4 entries (2 fixed + 2 variable; max 18), built from 8 variable MTRRs
Apr 24 23:58:27.939269 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Apr 24 23:58:27.939284 kernel: Using GB pages for direct mapping
Apr 24 23:58:27.939297 kernel: Secure boot disabled
Apr 24 23:58:27.939311 kernel: ACPI: Early table checksum verification disabled
Apr 24 23:58:27.939324 kernel: ACPI: RSDP 0x000000007895D014 000024 (v02 AMAZON)
Apr 24 23:58:27.939338 kernel: ACPI: XSDT 0x000000007895C0E8 00006C (v01 AMAZON AMZNFACP 00000001 01000013)
Apr 24 23:58:27.939352 kernel: ACPI: FACP 0x0000000078955000 000114 (v01 AMAZON AMZNFACP 00000001 AMZN 00000001)
Apr 24 23:58:27.939366 kernel: ACPI: DSDT 0x0000000078956000 00115A (v01 AMAZON AMZNDSDT 00000001 AMZN 00000001)
Apr 24 23:58:27.939383 kernel: ACPI: FACS 0x00000000789D0000 000040
Apr 24 23:58:27.939398 kernel: ACPI: WAET 0x000000007895B000 000028 (v01 AMAZON AMZNWAET 00000001 AMZN 00000001)
Apr 24 23:58:27.939411 kernel: ACPI: SLIT 0x000000007895A000 00006C (v01 AMAZON AMZNSLIT 00000001 AMZN 00000001)
Apr 24 23:58:27.939425 kernel: ACPI: APIC 0x0000000078959000 000076 (v01 AMAZON AMZNAPIC 00000001 AMZN 00000001)
Apr 24 23:58:27.939438 kernel: ACPI: SRAT 0x0000000078958000 0000A0 (v01 AMAZON AMZNSRAT 00000001 AMZN 00000001)
Apr 24 23:58:27.939452 kernel: ACPI: HPET 0x0000000078954000 000038 (v01 AMAZON AMZNHPET 00000001 AMZN 00000001)
Apr 24 23:58:27.939472 kernel: ACPI: SSDT 0x0000000078953000 000759 (v01 AMAZON AMZNSSDT 00000001 AMZN 00000001)
Apr 24 23:58:27.939490 kernel: ACPI: SSDT 0x0000000078952000 0000D1 (v01 AMAZON AMZNSSDT 00000001 AMZN 00000001)
Apr 24 23:58:27.939504 kernel: ACPI: BGRT 0x0000000078951000 000038 (v01 AMAZON AMAZON 00000002 01000013)
Apr 24 23:58:27.939519 kernel: ACPI: Reserving FACP table memory at [mem 0x78955000-0x78955113]
Apr 24 23:58:27.939534 kernel: ACPI: Reserving DSDT table memory at [mem 0x78956000-0x78957159]
Apr 24 23:58:27.939548 kernel: ACPI: Reserving FACS table memory at [mem 0x789d0000-0x789d003f]
Apr 24 23:58:27.939563 kernel: ACPI: Reserving WAET table memory at [mem 0x7895b000-0x7895b027]
Apr 24 23:58:27.939577 kernel: ACPI: Reserving SLIT table memory at [mem 0x7895a000-0x7895a06b]
Apr 24 23:58:27.939595 kernel: ACPI: Reserving APIC table memory at [mem 0x78959000-0x78959075]
Apr 24 23:58:27.939609 kernel: ACPI: Reserving SRAT table memory at [mem 0x78958000-0x7895809f]
Apr 24 23:58:27.939624 kernel: ACPI: Reserving HPET table memory at [mem 0x78954000-0x78954037]
Apr 24 23:58:27.939637 kernel: ACPI: Reserving SSDT table memory at [mem 0x78953000-0x78953758]
Apr 24 23:58:27.939650 kernel: ACPI: Reserving SSDT table memory at [mem 0x78952000-0x789520d0]
Apr 24 23:58:27.939661 kernel: ACPI: Reserving BGRT table memory at [mem 0x78951000-0x78951037]
Apr 24 23:58:27.939673 kernel: SRAT: PXM 0 -> APIC 0x00 -> Node 0
Apr 24 23:58:27.939686 kernel: SRAT: PXM 0 -> APIC 0x01 -> Node 0
Apr 24 23:58:27.939699 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00000000-0x7fffffff]
Apr 24 23:58:27.939716 kernel: NUMA: Initialized distance table, cnt=1
Apr 24 23:58:27.939729 kernel: NODE_DATA(0) allocated [mem 0x7a8f0000-0x7a8f5fff]
Apr 24 23:58:27.939743 kernel: Zone ranges:
Apr 24 23:58:27.939757 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Apr 24 23:58:27.939770 kernel: DMA32 [mem 0x0000000001000000-0x000000007c97bfff]
Apr 24 23:58:27.939803 kernel: Normal empty
Apr 24 23:58:27.939817 kernel: Movable zone start for each node
Apr 24 23:58:27.939831 kernel: Early memory node ranges
Apr 24 23:58:27.939844 kernel: node 0: [mem 0x0000000000001000-0x000000000009ffff]
Apr 24 23:58:27.939857 kernel: node 0: [mem 0x0000000000100000-0x00000000786cdfff]
Apr 24 23:58:27.939874 kernel: node 0: [mem 0x00000000789de000-0x000000007c97bfff]
Apr 24 23:58:27.939888 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000007c97bfff]
Apr 24 23:58:27.939902 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Apr 24 23:58:27.939915 kernel: On node 0, zone DMA: 96 pages in unavailable ranges
Apr 24 23:58:27.939929 kernel: On node 0, zone DMA32: 784 pages in unavailable ranges
Apr 24 23:58:27.939943 kernel: On node 0, zone DMA32: 13956 pages in unavailable ranges
Apr 24 23:58:27.939956 kernel: ACPI: PM-Timer IO Port: 0xb008
Apr 24 23:58:27.939970 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Apr 24 23:58:27.939983 kernel: IOAPIC[0]: apic_id 0, version 32, address 0xfec00000, GSI 0-23
Apr 24 23:58:27.939999 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Apr 24 23:58:27.940013 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Apr 24 23:58:27.940027 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Apr 24 23:58:27.940040 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Apr 24 23:58:27.940054 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Apr 24 23:58:27.940068 kernel: ACPI: HPET id: 0x8086a201 base: 0xfed00000
Apr 24 23:58:27.940081 kernel: TSC deadline timer available
Apr 24 23:58:27.940094 kernel: smpboot: Allowing 2 CPUs, 0 hotplug CPUs
Apr 24 23:58:27.940108 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Apr 24 23:58:27.940124 kernel: [mem 0x7ca00000-0xffffffff] available for PCI devices
Apr 24 23:58:27.940138 kernel: Booting paravirtualized kernel on KVM
Apr 24 23:58:27.940152 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Apr 24 23:58:27.940165 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:2 nr_cpu_ids:2 nr_node_ids:1
Apr 24 23:58:27.940179 kernel: percpu: Embedded 57 pages/cpu s196328 r8192 d28952 u1048576
Apr 24 23:58:27.940192 kernel: pcpu-alloc: s196328 r8192 d28952 u1048576 alloc=1*2097152
Apr 24 23:58:27.940204 kernel: pcpu-alloc: [0] 0 1
Apr 24 23:58:27.940217 kernel: kvm-guest: PV spinlocks enabled
Apr 24 23:58:27.940231 kernel: PV qspinlock hash table entries: 256 (order: 0, 4096 bytes, linear)
Apr 24 23:58:27.940250 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=ec2 modprobe.blacklist=xen_fbfront net.ifnames=0 nvme_core.io_timeout=4294967295 verity.usrhash=c8442747465ed99a522e07b8746f6a7817fb39c2025d7438698e3b90e9c0defb
Apr 24 23:58:27.940264 kernel: random: crng init done
Apr 24 23:58:27.940277 kernel: Dentry cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Apr 24 23:58:27.940291 kernel: Inode-cache hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Apr 24 23:58:27.940304 kernel: Fallback order for Node 0: 0
Apr 24 23:58:27.940317 kernel: Built 1 zonelists, mobility grouping on. Total pages: 501318
Apr 24 23:58:27.940330 kernel: Policy zone: DMA32
Apr 24 23:58:27.940343 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Apr 24 23:58:27.940360 kernel: Memory: 1874640K/2037804K available (12288K kernel code, 2288K rwdata, 22752K rodata, 42896K init, 2300K bss, 162904K reserved, 0K cma-reserved)
Apr 24 23:58:27.940373 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1
Apr 24 23:58:27.940386 kernel: Kernel/User page tables isolation: enabled
Apr 24 23:58:27.940400 kernel: ftrace: allocating 37996 entries in 149 pages
Apr 24 23:58:27.940413 kernel: ftrace: allocated 149 pages with 4 groups
Apr 24 23:58:27.940426 kernel: Dynamic Preempt: voluntary
Apr 24 23:58:27.940440 kernel: rcu: Preemptible hierarchical RCU implementation.
Apr 24 23:58:27.940454 kernel: rcu: RCU event tracing is enabled.
Apr 24 23:58:27.940468 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2.
Apr 24 23:58:27.940485 kernel: Trampoline variant of Tasks RCU enabled.
Apr 24 23:58:27.940499 kernel: Rude variant of Tasks RCU enabled.
Apr 24 23:58:27.940512 kernel: Tracing variant of Tasks RCU enabled.
Apr 24 23:58:27.940526 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Apr 24 23:58:27.940540 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2
Apr 24 23:58:27.940553 kernel: NR_IRQS: 33024, nr_irqs: 440, preallocated irqs: 16
Apr 24 23:58:27.940567 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Apr 24 23:58:27.940594 kernel: Console: colour dummy device 80x25
Apr 24 23:58:27.940608 kernel: printk: console [tty0] enabled
Apr 24 23:58:27.940622 kernel: printk: console [ttyS0] enabled
Apr 24 23:58:27.940636 kernel: ACPI: Core revision 20230628
Apr 24 23:58:27.940650 kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 30580167144 ns
Apr 24 23:58:27.940669 kernel: APIC: Switch to symmetric I/O mode setup
Apr 24 23:58:27.940683 kernel: x2apic enabled
Apr 24 23:58:27.940697 kernel: APIC: Switched APIC routing to: physical x2apic
Apr 24 23:58:27.940712 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x24093623c91, max_idle_ns: 440795291220 ns
Apr 24 23:58:27.940727 kernel: Calibrating delay loop (skipped) preset value.. 4999.99 BogoMIPS (lpj=2499996)
Apr 24 23:58:27.940744 kernel: Last level iTLB entries: 4KB 64, 2MB 8, 4MB 8
Apr 24 23:58:27.940758 kernel: Last level dTLB entries: 4KB 64, 2MB 32, 4MB 32, 1GB 4
Apr 24 23:58:27.940772 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Apr 24 23:58:27.940884 kernel: Spectre V2 : Mitigation: Retpolines
Apr 24 23:58:27.940899 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Apr 24 23:58:27.940913 kernel: RETBleed: WARNING: Spectre v2 mitigation leaves CPU vulnerable to RETBleed attacks, data leaks possible!
Apr 24 23:58:27.940928 kernel: RETBleed: Vulnerable
Apr 24 23:58:27.940942 kernel: Speculative Store Bypass: Vulnerable
Apr 24 23:58:27.940956 kernel: MDS: Vulnerable: Clear CPU buffers attempted, no microcode
Apr 24 23:58:27.940970 kernel: MMIO Stale Data: Vulnerable: Clear CPU buffers attempted, no microcode
Apr 24 23:58:27.940987 kernel: GDS: Unknown: Dependent on hypervisor status
Apr 24 23:58:27.941001 kernel: active return thunk: its_return_thunk
Apr 24 23:58:27.941016 kernel: ITS: Mitigation: Aligned branch/return thunks
Apr 24 23:58:27.941030 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Apr 24 23:58:27.941044 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Apr 24 23:58:27.941058 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Apr 24 23:58:27.941073 kernel: x86/fpu: Supporting XSAVE feature 0x008: 'MPX bounds registers'
Apr 24 23:58:27.941087 kernel: x86/fpu: Supporting XSAVE feature 0x010: 'MPX CSR'
Apr 24 23:58:27.941102 kernel: x86/fpu: Supporting XSAVE feature 0x020: 'AVX-512 opmask'
Apr 24 23:58:27.941116 kernel: x86/fpu: Supporting XSAVE feature 0x040: 'AVX-512 Hi256'
Apr 24 23:58:27.941131 kernel: x86/fpu: Supporting XSAVE feature 0x080: 'AVX-512 ZMM_Hi256'
Apr 24 23:58:27.941148 kernel: x86/fpu: Supporting XSAVE feature 0x200: 'Protection Keys User registers'
Apr 24 23:58:27.941162 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
Apr 24 23:58:27.941176 kernel: x86/fpu: xstate_offset[3]: 832, xstate_sizes[3]: 64
Apr 24 23:58:27.941191 kernel: x86/fpu: xstate_offset[4]: 896, xstate_sizes[4]: 64
Apr 24 23:58:27.941205 kernel: x86/fpu: xstate_offset[5]: 960, xstate_sizes[5]: 64
Apr 24 23:58:27.941219 kernel: x86/fpu: xstate_offset[6]: 1024, xstate_sizes[6]: 512
Apr 24 23:58:27.941233 kernel: x86/fpu: xstate_offset[7]: 1536, xstate_sizes[7]: 1024
Apr 24 23:58:27.941247 kernel: x86/fpu: xstate_offset[9]: 2560, xstate_sizes[9]: 8
Apr 24 23:58:27.941261 kernel: x86/fpu: Enabled xstate features 0x2ff, context size is 2568 bytes, using 'compacted' format.
Apr 24 23:58:27.941287 kernel: Freeing SMP alternatives memory: 32K
Apr 24 23:58:27.941301 kernel: pid_max: default: 32768 minimum: 301
Apr 24 23:58:27.941320 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,integrity
Apr 24 23:58:27.941335 kernel: landlock: Up and running.
Apr 24 23:58:27.941350 kernel: SELinux: Initializing.
Apr 24 23:58:27.941366 kernel: Mount-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Apr 24 23:58:27.941381 kernel: Mountpoint-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Apr 24 23:58:27.941396 kernel: smpboot: CPU0: Intel(R) Xeon(R) Platinum 8259CL CPU @ 2.50GHz (family: 0x6, model: 0x55, stepping: 0x7)
Apr 24 23:58:27.941412 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Apr 24 23:58:27.941427 kernel: RCU Tasks Rude: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Apr 24 23:58:27.941443 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Apr 24 23:58:27.941459 kernel: Performance Events: unsupported p6 CPU model 85 no PMU driver, software events only.
Apr 24 23:58:27.941478 kernel: signal: max sigframe size: 3632
Apr 24 23:58:27.941494 kernel: rcu: Hierarchical SRCU implementation.
Apr 24 23:58:27.941509 kernel: rcu: Max phase no-delay instances is 400.
Apr 24 23:58:27.941525 kernel: NMI watchdog: Perf NMI watchdog permanently disabled
Apr 24 23:58:27.941541 kernel: smp: Bringing up secondary CPUs ...
Apr 24 23:58:27.941556 kernel: smpboot: x86: Booting SMP configuration:
Apr 24 23:58:27.941571 kernel: .... node #0, CPUs: #1
Apr 24 23:58:27.941587 kernel: MDS CPU bug present and SMT on, data leak possible. See https://www.kernel.org/doc/html/latest/admin-guide/hw-vuln/mds.html for more details.
Apr 24 23:58:27.941603 kernel: MMIO Stale Data CPU bug present and SMT on, data leak possible. See https://www.kernel.org/doc/html/latest/admin-guide/hw-vuln/processor_mmio_stale_data.html for more details.
Apr 24 23:58:27.941622 kernel: smp: Brought up 1 node, 2 CPUs
Apr 24 23:58:27.941637 kernel: smpboot: Max logical packages: 1
Apr 24 23:58:27.941653 kernel: smpboot: Total of 2 processors activated (9999.98 BogoMIPS)
Apr 24 23:58:27.941669 kernel: devtmpfs: initialized
Apr 24 23:58:27.941684 kernel: x86/mm: Memory block size: 128MB
Apr 24 23:58:27.941700 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x7895e000-0x789ddfff] (524288 bytes)
Apr 24 23:58:27.941716 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Apr 24 23:58:27.941731 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear)
Apr 24 23:58:27.941747 kernel: pinctrl core: initialized pinctrl subsystem
Apr 24 23:58:27.941765 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Apr 24 23:58:27.941781 kernel: audit: initializing netlink subsys (disabled)
Apr 24 23:58:27.941823 kernel: audit: type=2000 audit(1777075106.924:1): state=initialized audit_enabled=0 res=1
Apr 24 23:58:27.941838 kernel: thermal_sys: Registered thermal governor 'step_wise'
Apr 24 23:58:27.941854 kernel: thermal_sys: Registered thermal governor 'user_space'
Apr 24 23:58:27.941869 kernel: cpuidle: using governor menu
Apr 24 23:58:27.941885 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Apr 24 23:58:27.941900 kernel: dca service started, version 1.12.1
Apr 24 23:58:27.941916 kernel: PCI: Using configuration type 1 for base access
Apr 24 23:58:27.941935 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Apr 24 23:58:27.941950 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Apr 24 23:58:27.941966 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Apr 24 23:58:27.941982 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Apr 24 23:58:27.941997 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Apr 24 23:58:27.942013 kernel: ACPI: Added _OSI(Module Device)
Apr 24 23:58:27.942028 kernel: ACPI: Added _OSI(Processor Device)
Apr 24 23:58:27.942043 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Apr 24 23:58:27.942057 kernel: ACPI: 3 ACPI AML tables successfully acquired and loaded
Apr 24 23:58:27.942076 kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC
Apr 24 23:58:27.942092 kernel: ACPI: Interpreter enabled
Apr 24 23:58:27.942107 kernel: ACPI: PM: (supports S0 S5)
Apr 24 23:58:27.942122 kernel: ACPI: Using IOAPIC for interrupt routing
Apr 24 23:58:27.942139 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Apr 24 23:58:27.942153 kernel: PCI: Using E820 reservations for host bridge windows
Apr 24 23:58:27.942166 kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Apr 24 23:58:27.942182 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Apr 24 23:58:27.942409 kernel: acpi PNP0A03:00: _OSC: OS supports [ASPM ClockPM Segments MSI HPX-Type3]
Apr 24 23:58:27.942562 kernel: acpi PNP0A03:00: _OSC: not requesting OS control; OS requires [ExtendedConfig ASPM ClockPM MSI]
Apr 24 23:58:27.942694 kernel: acpi PNP0A03:00: fail to add MMCONFIG information, can't access extended configuration space under this bridge
Apr 24 23:58:27.942713 kernel: acpiphp: Slot [3] registered
Apr 24 23:58:27.942730 kernel: acpiphp: Slot [4] registered
Apr 24 23:58:27.942746 kernel: acpiphp: Slot [5] registered
Apr 24 23:58:27.942763 kernel: acpiphp: Slot [6] registered
Apr 24 23:58:27.942779 kernel: acpiphp: Slot [7] registered
Apr 24 23:58:27.942817 kernel: acpiphp: Slot [8] registered
Apr 24 23:58:27.942833 kernel: acpiphp: Slot [9] registered
Apr 24 23:58:27.942850 kernel: acpiphp: Slot [10] registered
Apr 24 23:58:27.942866 kernel: acpiphp: Slot [11] registered
Apr 24 23:58:27.942882 kernel: acpiphp: Slot [12] registered
Apr 24 23:58:27.942898 kernel: acpiphp: Slot [13] registered
Apr 24 23:58:27.942913 kernel: acpiphp: Slot [14] registered
Apr 24 23:58:27.942929 kernel: acpiphp: Slot [15] registered
Apr 24 23:58:27.942944 kernel: acpiphp: Slot [16] registered
Apr 24 23:58:27.942959 kernel: acpiphp: Slot [17] registered
Apr 24 23:58:27.942977 kernel: acpiphp: Slot [18] registered
Apr 24 23:58:27.942993 kernel: acpiphp: Slot [19] registered
Apr 24 23:58:27.943008 kernel: acpiphp: Slot [20] registered
Apr 24 23:58:27.943023 kernel: acpiphp: Slot [21] registered
Apr 24 23:58:27.943039 kernel: acpiphp: Slot [22] registered
Apr 24 23:58:27.943062 kernel: acpiphp: Slot [23] registered
Apr 24 23:58:27.943078 kernel: acpiphp: Slot [24] registered
Apr 24 23:58:27.943093 kernel: acpiphp: Slot [25] registered
Apr 24 23:58:27.943108 kernel: acpiphp: Slot [26] registered
Apr 24 23:58:27.943127 kernel: acpiphp: Slot [27] registered
Apr 24 23:58:27.943142 kernel: acpiphp: Slot [28] registered
Apr 24 23:58:27.943157 kernel: acpiphp: Slot [29] registered
Apr 24 23:58:27.943172 kernel: acpiphp: Slot [30] registered
Apr 24 23:58:27.943187 kernel: acpiphp: Slot [31] registered
Apr 24 23:58:27.943202 kernel: PCI host bridge to bus 0000:00
Apr 24 23:58:27.943345 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
Apr 24 23:58:27.943462 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window]
Apr 24 23:58:27.943580 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Apr 24 23:58:27.943692 kernel: pci_bus 0000:00: root bus resource [mem 0x80000000-0xfebfffff window]
Apr 24 23:58:27.943817 kernel: pci_bus 0000:00: root bus resource [mem 0x100000000-0x2000ffffffff window]
Apr 24 23:58:27.943931 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Apr 24 23:58:27.944075 kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000
Apr 24 23:58:27.944212 kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100
Apr 24 23:58:27.944355 kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x000000
Apr 24 23:58:27.944502 kernel: pci 0000:00:01.3: quirk: [io 0xb000-0xb03f] claimed by PIIX4 ACPI
Apr 24 23:58:27.944658 kernel: pci 0000:00:01.3: PIIX4 devres E PIO at fff0-ffff
Apr 24 23:58:27.944806 kernel: pci 0000:00:01.3: PIIX4 devres F MMIO at ffc00000-ffffffff
Apr 24 23:58:27.944944 kernel: pci 0000:00:01.3: PIIX4 devres G PIO at fff0-ffff
Apr 24 23:58:27.945079 kernel: pci 0000:00:01.3: PIIX4 devres H MMIO at ffc00000-ffffffff
Apr 24 23:58:27.945211 kernel: pci 0000:00:01.3: PIIX4 devres I PIO at fff0-ffff
Apr 24 23:58:27.945343 kernel: pci 0000:00:01.3: PIIX4 devres J PIO at fff0-ffff
Apr 24 23:58:27.945487 kernel: pci 0000:00:03.0: [1d0f:1111] type 00 class 0x030000
Apr 24 23:58:27.945623 kernel: pci 0000:00:03.0: reg 0x10: [mem 0x80000000-0x803fffff pref]
Apr 24 23:58:27.945756 kernel: pci 0000:00:03.0: reg 0x30: [mem 0xffff0000-0xffffffff pref]
Apr 24 23:58:27.945905 kernel: pci 0000:00:03.0: BAR 0: assigned to efifb
Apr 24 23:58:27.946047 kernel: pci 0000:00:03.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Apr 24 23:58:27.946195 kernel: pci 0000:00:04.0: [1d0f:8061] type 00 class 0x010802
Apr 24 23:58:27.946340 kernel: pci 0000:00:04.0: reg 0x10: [mem 0x80404000-0x80407fff]
Apr 24 23:58:27.946486 kernel: pci 0000:00:05.0: [1d0f:ec20] type 00 class 0x020000
Apr 24 23:58:27.946622 kernel: pci 0000:00:05.0: reg 0x10: [mem 0x80400000-0x80403fff]
Apr 24 23:58:27.946643 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Apr 24 23:58:27.946661 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Apr 24 23:58:27.946677 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Apr 24 23:58:27.946693 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Apr 24 23:58:27.946710 kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Apr 24 23:58:27.946731 kernel: iommu: Default domain type: Translated
Apr 24 23:58:27.946747 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Apr 24 23:58:27.946763 kernel: efivars: Registered efivars operations
Apr 24 23:58:27.946779 kernel: PCI: Using ACPI for IRQ routing
Apr 24 23:58:27.947471 kernel: PCI: pci_cache_line_size set to 64 bytes
Apr 24 23:58:27.947488 kernel: e820: reserve RAM buffer [mem 0x786ce000-0x7bffffff]
Apr 24 23:58:27.947503 kernel: e820: reserve RAM buffer [mem 0x7c97c000-0x7fffffff]
Apr 24 23:58:27.947686 kernel: pci 0000:00:03.0: vgaarb: setting as boot VGA device
Apr 24 23:58:27.947842 kernel: pci 0000:00:03.0: vgaarb: bridge control possible
Apr 24 23:58:27.947983 kernel: pci 0000:00:03.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Apr 24 23:58:27.948002 kernel: vgaarb: loaded
Apr 24 23:58:27.948017 kernel: hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0, 0, 0, 0, 0, 0
Apr 24 23:58:27.948032 kernel: hpet0: 8 comparators, 32-bit 62.500000 MHz counter
Apr 24 23:58:27.948046 kernel: clocksource: Switched to clocksource kvm-clock
Apr 24 23:58:27.948061 kernel: VFS: Disk quotas dquot_6.6.0
Apr 24 23:58:27.948074 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Apr 24 23:58:27.948089 kernel: pnp: PnP ACPI init
Apr 24 23:58:27.948103 kernel: pnp: PnP ACPI: found 5 devices
Apr 24 23:58:27.948122 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Apr 24 23:58:27.948137 kernel: NET: Registered PF_INET protocol family
Apr 24 23:58:27.948151 kernel: IP idents hash table entries: 32768 (order: 6, 262144 bytes, linear)
Apr 24 23:58:27.948165 kernel: tcp_listen_portaddr_hash hash table entries: 1024 (order: 2, 16384 bytes, linear)
Apr 24 23:58:27.948180 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Apr 24 23:58:27.948194 kernel: TCP established hash table entries: 16384 (order: 5, 131072 bytes, linear)
Apr 24 23:58:27.948209 kernel: TCP bind hash table entries: 16384 (order: 7, 524288 bytes, linear)
Apr 24 23:58:27.948224 kernel: TCP: Hash tables configured (established 16384 bind 16384)
Apr 24 23:58:27.948238 kernel: UDP hash table entries: 1024 (order: 3, 32768 bytes, linear)
Apr 24 23:58:27.948256 kernel: UDP-Lite hash table entries: 1024 (order: 3, 32768 bytes, linear)
Apr 24 23:58:27.948271 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Apr 24 23:58:27.948285 kernel: NET: Registered PF_XDP protocol family
Apr 24 23:58:27.948407 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window]
Apr 24 23:58:27.948528 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window]
Apr 24 23:58:27.948644 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Apr 24 23:58:27.948763 kernel: pci_bus 0000:00: resource 7 [mem 0x80000000-0xfebfffff window]
Apr 24 23:58:27.948893 kernel: pci_bus 0000:00: resource 8 [mem 0x100000000-0x2000ffffffff window]
Apr 24 23:58:27.949034 kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Apr 24 23:58:27.949053 kernel: PCI: CLS 0 bytes, default 64
Apr 24 23:58:27.949067 kernel: RAPL PMU: API unit is 2^-32 Joules, 0 fixed counters, 10737418240 ms ovfl timer
Apr 24 23:58:27.949082 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x24093623c91, max_idle_ns: 440795291220 ns
Apr 24 23:58:27.949097 kernel: clocksource: Switched to clocksource tsc
Apr 24 23:58:27.949111 kernel: Initialise system trusted keyrings
Apr 24 23:58:27.949126 kernel: workingset: timestamp_bits=39 max_order=19 bucket_order=0
Apr 24 23:58:27.949141 kernel: Key type asymmetric registered
Apr 24 23:58:27.949158 kernel: Asymmetric key parser 'x509' registered
Apr 24 23:58:27.949172 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 251)
Apr 24 23:58:27.949187 kernel: io scheduler mq-deadline registered
Apr 24 23:58:27.949201 kernel: io scheduler kyber registered
Apr 24 23:58:27.949215 kernel: io scheduler bfq registered
Apr 24 23:58:27.949231 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00
Apr 24 23:58:27.949246 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Apr 24 23:58:27.949260 kernel: 00:04: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Apr 24 23:58:27.949274 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Apr 24 23:58:27.949290 kernel: i8042: Warning: Keylock active
Apr 24 23:58:27.949306 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Apr 24 23:58:27.949320 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Apr 24 23:58:27.949458 kernel: rtc_cmos 00:00: RTC can wake from S4
Apr 24 23:58:27.949581 kernel: rtc_cmos 00:00: registered as rtc0
Apr 24 23:58:27.949727 kernel: rtc_cmos 00:00: setting system clock to 2026-04-24T23:58:27 UTC (1777075107)
Apr 24 23:58:27.949899 kernel: rtc_cmos 00:00: alarms up to one day, 114 bytes nvram
Apr 24 23:58:27.949919 kernel: intel_pstate: CPU model not supported
Apr 24 23:58:27.949940 kernel: efifb: probing for efifb
Apr 24 23:58:27.949957 kernel: efifb: framebuffer at 0x80000000, using 1920k, total 1920k
Apr 24 23:58:27.949972 kernel: efifb: mode is 800x600x32, linelength=3200, pages=1
Apr 24 23:58:27.949987 kernel: efifb: scrolling: redraw
Apr 24 23:58:27.950006 kernel: efifb: Truecolor: size=8:8:8:8, shift=24:16:8:0
Apr 24 23:58:27.950037 kernel: Console: switching to colour frame buffer device 100x37
Apr 24 23:58:27.950069 kernel: fb0: EFI VGA frame buffer device
Apr 24 23:58:27.950088 kernel: pstore: Using crash dump compression: deflate
Apr 24 23:58:27.950103 kernel: pstore: Registered efi_pstore as persistent store backend
Apr 24 23:58:27.950121 kernel: NET: Registered PF_INET6 protocol family
Apr 24 23:58:27.950137 kernel: Segment Routing with IPv6
Apr 24 23:58:27.950153 kernel: In-situ OAM (IOAM) with IPv6
Apr 24 23:58:27.950166 kernel: NET: Registered PF_PACKET protocol family
Apr 24 23:58:27.950180 kernel: Key type dns_resolver registered
Apr 24 23:58:27.950196 kernel: IPI shorthand broadcast: enabled
Apr 24 23:58:27.950239 kernel: sched_clock: Marking stable (481002666, 130276010)->(680835047, -69556371)
Apr 24 23:58:27.950258 kernel: registered taskstats version 1
Apr 24 23:58:27.950274 kernel: Loading compiled-in X.509 certificates
Apr 24 23:58:27.950288 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.6.127-flatcar: 507f116e6718ec7535b55c873de10edf9b6fe124'
Apr 24 23:58:27.950317 kernel: Key type .fscrypt registered
Apr 24 23:58:27.950330 kernel: Key type fscrypt-provisioning registered
Apr 24 23:58:27.950343 kernel: ima: No TPM chip found, activating TPM-bypass!
Apr 24 23:58:27.950357 kernel: ima: Allocated hash algorithm: sha1
Apr 24 23:58:27.950370 kernel: ima: No architecture policies found
Apr 24 23:58:27.950386 kernel: clk: Disabling unused clocks
Apr 24 23:58:27.950402 kernel: Freeing unused kernel image (initmem) memory: 42896K
Apr 24 23:58:27.950416 kernel: Write protecting the kernel read-only data: 36864k
Apr 24 23:58:27.950430 kernel: Freeing unused kernel image (rodata/data gap) memory: 1824K
Apr 24 23:58:27.950449 kernel: Run /init as init process
Apr 24 23:58:27.950464 kernel: with arguments:
Apr 24 23:58:27.950481 kernel: /init
Apr 24 23:58:27.950497 kernel: with environment:
Apr 24 23:58:27.950510 kernel: HOME=/
Apr 24 23:58:27.950525 kernel: TERM=linux
Apr 24 23:58:27.950546 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified)
Apr 24 23:58:27.950568 systemd[1]: Detected virtualization amazon.
Apr 24 23:58:27.950590 systemd[1]: Detected architecture x86-64. Apr 24 23:58:27.950607 systemd[1]: Running in initrd. Apr 24 23:58:27.950626 systemd[1]: No hostname configured, using default hostname. Apr 24 23:58:27.950643 systemd[1]: Hostname set to . Apr 24 23:58:27.950662 systemd[1]: Initializing machine ID from VM UUID. Apr 24 23:58:27.950681 systemd[1]: Queued start job for default target initrd.target. Apr 24 23:58:27.950698 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Apr 24 23:58:27.950717 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Apr 24 23:58:27.950740 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Apr 24 23:58:27.950758 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Apr 24 23:58:27.950777 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Apr 24 23:58:27.950818 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Apr 24 23:58:27.950839 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Apr 24 23:58:27.950855 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Apr 24 23:58:27.950874 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Apr 24 23:58:27.950892 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Apr 24 23:58:27.950911 systemd[1]: Reached target paths.target - Path Units. Apr 24 23:58:27.950930 systemd[1]: Reached target slices.target - Slice Units. Apr 24 23:58:27.950948 systemd[1]: Reached target swap.target - Swaps. Apr 24 23:58:27.950967 systemd[1]: Reached target timers.target - Timer Units. 
Apr 24 23:58:27.950988 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Apr 24 23:58:27.951005 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Apr 24 23:58:27.951021 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Apr 24 23:58:27.951038 systemd[1]: Listening on systemd-journald.socket - Journal Socket. Apr 24 23:58:27.951069 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Apr 24 23:58:27.951087 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Apr 24 23:58:27.951104 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Apr 24 23:58:27.951121 systemd[1]: Reached target sockets.target - Socket Units. Apr 24 23:58:27.951142 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Apr 24 23:58:27.951160 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Apr 24 23:58:27.951177 systemd[1]: Finished network-cleanup.service - Network Cleanup. Apr 24 23:58:27.951195 systemd[1]: Starting systemd-fsck-usr.service... Apr 24 23:58:27.951212 systemd[1]: Starting systemd-journald.service - Journal Service... Apr 24 23:58:27.951228 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Apr 24 23:58:27.951246 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Apr 24 23:58:27.951296 systemd-journald[179]: Collecting audit messages is disabled. Apr 24 23:58:27.951339 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Apr 24 23:58:27.951356 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Apr 24 23:58:27.951375 systemd-journald[179]: Journal started Apr 24 23:58:27.951413 systemd-journald[179]: Runtime Journal (/run/log/journal/ec2179847255351668174b73631b2a2c) is 4.7M, max 38.2M, 33.4M free. 
Apr 24 23:58:27.950350 systemd-modules-load[180]: Inserted module 'overlay' Apr 24 23:58:27.956816 systemd[1]: Started systemd-journald.service - Journal Service. Apr 24 23:58:27.961033 systemd[1]: Finished systemd-fsck-usr.service. Apr 24 23:58:27.974047 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Apr 24 23:58:27.978965 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Apr 24 23:58:27.982280 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Apr 24 23:58:27.993255 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Apr 24 23:58:27.997189 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Apr 24 23:58:28.004837 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0 Apr 24 23:58:28.009932 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Apr 24 23:58:28.013399 systemd-modules-load[180]: Inserted module 'br_netfilter' Apr 24 23:58:28.016106 kernel: Bridge firewalling registered Apr 24 23:58:28.019015 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Apr 24 23:58:28.021489 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Apr 24 23:58:28.024472 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Apr 24 23:58:28.028289 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Apr 24 23:58:28.036064 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Apr 24 23:58:28.043503 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. 
Apr 24 23:58:28.048682 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Apr 24 23:58:28.050379 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Apr 24 23:58:28.056956 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Apr 24 23:58:28.068500 dracut-cmdline[211]: dracut-dracut-053 Apr 24 23:58:28.073025 dracut-cmdline[211]: Using kernel command line parameters: rd.driver.pre=btrfs rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=ec2 modprobe.blacklist=xen_fbfront net.ifnames=0 nvme_core.io_timeout=4294967295 verity.usrhash=c8442747465ed99a522e07b8746f6a7817fb39c2025d7438698e3b90e9c0defb Apr 24 23:58:28.107576 systemd-resolved[214]: Positive Trust Anchors: Apr 24 23:58:28.107599 systemd-resolved[214]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Apr 24 23:58:28.107658 systemd-resolved[214]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Apr 24 23:58:28.116047 systemd-resolved[214]: Defaulting to hostname 'linux'. Apr 24 23:58:28.119215 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Apr 24 23:58:28.119937 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. 
Apr 24 23:58:28.164825 kernel: SCSI subsystem initialized Apr 24 23:58:28.174817 kernel: Loading iSCSI transport class v2.0-870. Apr 24 23:58:28.185813 kernel: iscsi: registered transport (tcp) Apr 24 23:58:28.207991 kernel: iscsi: registered transport (qla4xxx) Apr 24 23:58:28.208078 kernel: QLogic iSCSI HBA Driver Apr 24 23:58:28.246637 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Apr 24 23:58:28.255000 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Apr 24 23:58:28.280516 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Apr 24 23:58:28.280598 kernel: device-mapper: uevent: version 1.0.3 Apr 24 23:58:28.280621 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@redhat.com Apr 24 23:58:28.323843 kernel: raid6: avx512x4 gen() 18013 MB/s Apr 24 23:58:28.341824 kernel: raid6: avx512x2 gen() 17935 MB/s Apr 24 23:58:28.359814 kernel: raid6: avx512x1 gen() 17869 MB/s Apr 24 23:58:28.377811 kernel: raid6: avx2x4 gen() 17948 MB/s Apr 24 23:58:28.395818 kernel: raid6: avx2x2 gen() 17737 MB/s Apr 24 23:58:28.414040 kernel: raid6: avx2x1 gen() 13721 MB/s Apr 24 23:58:28.414088 kernel: raid6: using algorithm avx512x4 gen() 18013 MB/s Apr 24 23:58:28.432989 kernel: raid6: .... xor() 7847 MB/s, rmw enabled Apr 24 23:58:28.433033 kernel: raid6: using avx512x2 recovery algorithm Apr 24 23:58:28.454828 kernel: xor: automatically using best checksumming function avx Apr 24 23:58:28.613817 kernel: Btrfs loaded, zoned=no, fsverity=no Apr 24 23:58:28.624581 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Apr 24 23:58:28.630004 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Apr 24 23:58:28.650733 systemd-udevd[397]: Using default interface naming scheme 'v255'. 
Apr 24 23:58:28.655881 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Apr 24 23:58:28.662593 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Apr 24 23:58:28.682909 dracut-pre-trigger[402]: rd.md=0: removing MD RAID activation Apr 24 23:58:28.712538 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Apr 24 23:58:28.717011 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Apr 24 23:58:28.769706 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Apr 24 23:58:28.778026 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Apr 24 23:58:28.803523 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Apr 24 23:58:28.806140 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Apr 24 23:58:28.808064 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Apr 24 23:58:28.809210 systemd[1]: Reached target remote-fs.target - Remote File Systems. Apr 24 23:58:28.816028 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Apr 24 23:58:28.850773 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Apr 24 23:58:28.879866 kernel: cryptd: max_cpu_qlen set to 1000 Apr 24 23:58:28.894942 kernel: ena 0000:00:05.0: ENA device version: 0.10 Apr 24 23:58:28.895231 kernel: ena 0000:00:05.0: ENA controller version: 0.0.1 implementation version 1 Apr 24 23:58:28.901322 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Apr 24 23:58:28.904099 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Apr 24 23:58:28.906031 systemd[1]: Stopping dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Apr 24 23:58:28.906596 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. 
Apr 24 23:58:28.907940 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Apr 24 23:58:28.908765 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Apr 24 23:58:28.928432 kernel: ena 0000:00:05.0: LLQ is not supported Fallback to host mode policy. Apr 24 23:58:28.928673 kernel: ena 0000:00:05.0: Elastic Network Adapter (ENA) found at mem 80400000, mac addr 06:d8:cf:33:cc:dd Apr 24 23:58:28.929905 kernel: AVX2 version of gcm_enc/dec engaged. Apr 24 23:58:28.929932 kernel: AES CTR mode by8 optimization enabled Apr 24 23:58:28.921165 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Apr 24 23:58:28.922428 (udev-worker)[450]: Network interface NamePolicy= disabled on kernel command line. Apr 24 23:58:28.941858 kernel: nvme nvme0: pci function 0000:00:04.0 Apr 24 23:58:28.946908 kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11 Apr 24 23:58:28.958322 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Apr 24 23:58:28.965805 kernel: nvme nvme0: 2/0/0 default/read/poll queues Apr 24 23:58:28.968427 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Apr 24 23:58:28.977006 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Apr 24 23:58:28.977096 kernel: GPT:9289727 != 33554431 Apr 24 23:58:28.977115 kernel: GPT:Alternate GPT header not at the end of the disk. Apr 24 23:58:28.978824 kernel: GPT:9289727 != 33554431 Apr 24 23:58:28.978870 kernel: GPT: Use GNU Parted to correct GPT errors. Apr 24 23:58:28.980909 kernel: nvme0n1: p1 p2 p3 p4 p6 p7 p9 Apr 24 23:58:28.999737 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. 
Apr 24 23:58:29.063822 kernel: BTRFS: device label OEM devid 1 transid 9 /dev/nvme0n1p6 scanned by (udev-worker) (441) Apr 24 23:58:29.081409 kernel: BTRFS: device fsid 077bb4ac-fe88-409a-8f61-fdf28cadf681 devid 1 transid 31 /dev/nvme0n1p3 scanned by (udev-worker) (450) Apr 24 23:58:29.097940 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - Amazon Elastic Block Store EFI-SYSTEM. Apr 24 23:58:29.153425 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - Amazon Elastic Block Store USR-A. Apr 24 23:58:29.155552 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - Amazon Elastic Block Store USR-A. Apr 24 23:58:29.166844 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - Amazon Elastic Block Store ROOT. Apr 24 23:58:29.173612 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Amazon Elastic Block Store OEM. Apr 24 23:58:29.179962 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Apr 24 23:58:29.187361 disk-uuid[626]: Primary Header is updated. Apr 24 23:58:29.187361 disk-uuid[626]: Secondary Entries is updated. Apr 24 23:58:29.187361 disk-uuid[626]: Secondary Header is updated. Apr 24 23:58:29.193817 kernel: nvme0n1: p1 p2 p3 p4 p6 p7 p9 Apr 24 23:58:29.201816 kernel: nvme0n1: p1 p2 p3 p4 p6 p7 p9 Apr 24 23:58:29.208824 kernel: nvme0n1: p1 p2 p3 p4 p6 p7 p9 Apr 24 23:58:30.209811 kernel: nvme0n1: p1 p2 p3 p4 p6 p7 p9 Apr 24 23:58:30.211026 disk-uuid[627]: The operation has completed successfully. Apr 24 23:58:30.353265 systemd[1]: disk-uuid.service: Deactivated successfully. Apr 24 23:58:30.353391 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Apr 24 23:58:30.370012 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... 
Apr 24 23:58:30.374679 sh[970]: Success Apr 24 23:58:30.394967 kernel: device-mapper: verity: sha256 using implementation "sha256-avx2" Apr 24 23:58:30.487829 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Apr 24 23:58:30.496958 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Apr 24 23:58:30.499162 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. Apr 24 23:58:30.538585 kernel: BTRFS info (device dm-0): first mount of filesystem 077bb4ac-fe88-409a-8f61-fdf28cadf681 Apr 24 23:58:30.538666 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Apr 24 23:58:30.538690 kernel: BTRFS warning (device dm-0): 'nologreplay' is deprecated, use 'rescue=nologreplay' instead Apr 24 23:58:30.540807 kernel: BTRFS info (device dm-0): disabling log replay at mount time Apr 24 23:58:30.543356 kernel: BTRFS info (device dm-0): using free space tree Apr 24 23:58:30.620826 kernel: BTRFS info (device dm-0): enabling ssd optimizations Apr 24 23:58:30.641640 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Apr 24 23:58:30.642912 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Apr 24 23:58:30.654052 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Apr 24 23:58:30.658009 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Apr 24 23:58:30.682269 kernel: BTRFS info (device nvme0n1p6): first mount of filesystem 926930fb-88b5-4cf4-bdd1-3374ab036b7b Apr 24 23:58:30.682341 kernel: BTRFS info (device nvme0n1p6): using crc32c (crc32c-intel) checksum algorithm Apr 24 23:58:30.685371 kernel: BTRFS info (device nvme0n1p6): using free space tree Apr 24 23:58:30.701872 kernel: BTRFS info (device nvme0n1p6): enabling ssd optimizations Apr 24 23:58:30.713339 systemd[1]: mnt-oem.mount: Deactivated successfully. 
Apr 24 23:58:30.716954 kernel: BTRFS info (device nvme0n1p6): last unmount of filesystem 926930fb-88b5-4cf4-bdd1-3374ab036b7b Apr 24 23:58:30.725491 systemd[1]: Finished ignition-setup.service - Ignition (setup). Apr 24 23:58:30.733077 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Apr 24 23:58:30.764328 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Apr 24 23:58:30.772032 systemd[1]: Starting systemd-networkd.service - Network Configuration... Apr 24 23:58:30.818321 systemd-networkd[1162]: lo: Link UP Apr 24 23:58:30.818333 systemd-networkd[1162]: lo: Gained carrier Apr 24 23:58:30.820073 systemd-networkd[1162]: Enumeration completed Apr 24 23:58:30.820304 systemd[1]: Started systemd-networkd.service - Network Configuration. Apr 24 23:58:30.820522 systemd-networkd[1162]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Apr 24 23:58:30.820527 systemd-networkd[1162]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Apr 24 23:58:30.821619 systemd[1]: Reached target network.target - Network. Apr 24 23:58:30.824351 systemd-networkd[1162]: eth0: Link UP Apr 24 23:58:30.824356 systemd-networkd[1162]: eth0: Gained carrier Apr 24 23:58:30.824368 systemd-networkd[1162]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. 
Apr 24 23:58:30.839894 systemd-networkd[1162]: eth0: DHCPv4 address 172.31.31.110/20, gateway 172.31.16.1 acquired from 172.31.16.1 Apr 24 23:58:30.993775 ignition[1117]: Ignition 2.19.0 Apr 24 23:58:30.993803 ignition[1117]: Stage: fetch-offline Apr 24 23:58:30.994079 ignition[1117]: no configs at "/usr/lib/ignition/base.d" Apr 24 23:58:30.994093 ignition[1117]: no config dir at "/usr/lib/ignition/base.platform.d/aws" Apr 24 23:58:30.996598 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Apr 24 23:58:30.994433 ignition[1117]: Ignition finished successfully Apr 24 23:58:31.002028 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... Apr 24 23:58:31.018112 ignition[1170]: Ignition 2.19.0 Apr 24 23:58:31.018125 ignition[1170]: Stage: fetch Apr 24 23:58:31.018637 ignition[1170]: no configs at "/usr/lib/ignition/base.d" Apr 24 23:58:31.018652 ignition[1170]: no config dir at "/usr/lib/ignition/base.platform.d/aws" Apr 24 23:58:31.018774 ignition[1170]: PUT http://169.254.169.254/latest/api/token: attempt #1 Apr 24 23:58:31.027914 ignition[1170]: PUT result: OK Apr 24 23:58:31.029680 ignition[1170]: parsed url from cmdline: "" Apr 24 23:58:31.029691 ignition[1170]: no config URL provided Apr 24 23:58:31.029703 ignition[1170]: reading system config file "/usr/lib/ignition/user.ign" Apr 24 23:58:31.029723 ignition[1170]: no config at "/usr/lib/ignition/user.ign" Apr 24 23:58:31.029745 ignition[1170]: PUT http://169.254.169.254/latest/api/token: attempt #1 Apr 24 23:58:31.030297 ignition[1170]: PUT result: OK Apr 24 23:58:31.030372 ignition[1170]: GET http://169.254.169.254/2019-10-01/user-data: attempt #1 Apr 24 23:58:31.030929 ignition[1170]: GET result: OK Apr 24 23:58:31.031022 ignition[1170]: parsing config with SHA512: a511f9405e636c13644e3c7b9f171bba47d7dae98b69947ef2fbee803d9186c5f9c1ca8f9a31ae98ca4a50a032fc102c351329de133578a23d8c672972df878f Apr 24 23:58:31.037199 unknown[1170]: fetched base config from "system" Apr 24 
23:58:31.038021 ignition[1170]: fetch: fetch complete Apr 24 23:58:31.037219 unknown[1170]: fetched base config from "system" Apr 24 23:58:31.038030 ignition[1170]: fetch: fetch passed Apr 24 23:58:31.037227 unknown[1170]: fetched user config from "aws" Apr 24 23:58:31.038099 ignition[1170]: Ignition finished successfully Apr 24 23:58:31.041470 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). Apr 24 23:58:31.047981 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Apr 24 23:58:31.063438 ignition[1176]: Ignition 2.19.0 Apr 24 23:58:31.063451 ignition[1176]: Stage: kargs Apr 24 23:58:31.063928 ignition[1176]: no configs at "/usr/lib/ignition/base.d" Apr 24 23:58:31.063944 ignition[1176]: no config dir at "/usr/lib/ignition/base.platform.d/aws" Apr 24 23:58:31.064056 ignition[1176]: PUT http://169.254.169.254/latest/api/token: attempt #1 Apr 24 23:58:31.064889 ignition[1176]: PUT result: OK Apr 24 23:58:31.068101 ignition[1176]: kargs: kargs passed Apr 24 23:58:31.068184 ignition[1176]: Ignition finished successfully Apr 24 23:58:31.069989 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Apr 24 23:58:31.073978 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Apr 24 23:58:31.090531 ignition[1182]: Ignition 2.19.0 Apr 24 23:58:31.090545 ignition[1182]: Stage: disks Apr 24 23:58:31.091022 ignition[1182]: no configs at "/usr/lib/ignition/base.d" Apr 24 23:58:31.091108 ignition[1182]: no config dir at "/usr/lib/ignition/base.platform.d/aws" Apr 24 23:58:31.091248 ignition[1182]: PUT http://169.254.169.254/latest/api/token: attempt #1 Apr 24 23:58:31.095726 ignition[1182]: PUT result: OK Apr 24 23:58:31.097994 ignition[1182]: disks: disks passed Apr 24 23:58:31.098068 ignition[1182]: Ignition finished successfully Apr 24 23:58:31.099594 systemd[1]: Finished ignition-disks.service - Ignition (disks). Apr 24 23:58:31.100605 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. 
Apr 24 23:58:31.101286 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Apr 24 23:58:31.101614 systemd[1]: Reached target local-fs.target - Local File Systems. Apr 24 23:58:31.102159 systemd[1]: Reached target sysinit.target - System Initialization. Apr 24 23:58:31.102694 systemd[1]: Reached target basic.target - Basic System. Apr 24 23:58:31.107995 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Apr 24 23:58:31.137143 systemd-fsck[1190]: ROOT: clean, 14/553520 files, 52654/553472 blocks Apr 24 23:58:31.140776 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Apr 24 23:58:31.147959 systemd[1]: Mounting sysroot.mount - /sysroot... Apr 24 23:58:31.251842 kernel: EXT4-fs (nvme0n1p9): mounted filesystem ae73d4a7-3ef8-4c50-8348-4aeb952085ba r/w with ordered data mode. Quota mode: none. Apr 24 23:58:31.252427 systemd[1]: Mounted sysroot.mount - /sysroot. Apr 24 23:58:31.253511 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Apr 24 23:58:31.271973 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Apr 24 23:58:31.274916 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Apr 24 23:58:31.277019 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. Apr 24 23:58:31.277230 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Apr 24 23:58:31.277264 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Apr 24 23:58:31.290005 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Apr 24 23:58:31.291767 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... 
Apr 24 23:58:31.297810 kernel: BTRFS: device label OEM devid 1 transid 10 /dev/nvme0n1p6 scanned by mount (1209) Apr 24 23:58:31.300990 kernel: BTRFS info (device nvme0n1p6): first mount of filesystem 926930fb-88b5-4cf4-bdd1-3374ab036b7b Apr 24 23:58:31.301049 kernel: BTRFS info (device nvme0n1p6): using crc32c (crc32c-intel) checksum algorithm Apr 24 23:58:31.303526 kernel: BTRFS info (device nvme0n1p6): using free space tree Apr 24 23:58:31.315821 kernel: BTRFS info (device nvme0n1p6): enabling ssd optimizations Apr 24 23:58:31.317507 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Apr 24 23:58:31.569341 initrd-setup-root[1233]: cut: /sysroot/etc/passwd: No such file or directory Apr 24 23:58:31.575671 initrd-setup-root[1240]: cut: /sysroot/etc/group: No such file or directory Apr 24 23:58:31.580885 initrd-setup-root[1247]: cut: /sysroot/etc/shadow: No such file or directory Apr 24 23:58:31.585478 initrd-setup-root[1254]: cut: /sysroot/etc/gshadow: No such file or directory Apr 24 23:58:31.816296 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Apr 24 23:58:31.820923 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Apr 24 23:58:31.823953 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Apr 24 23:58:31.834488 systemd[1]: sysroot-oem.mount: Deactivated successfully. 
Apr 24 23:58:31.837125 kernel: BTRFS info (device nvme0n1p6): last unmount of filesystem 926930fb-88b5-4cf4-bdd1-3374ab036b7b Apr 24 23:58:31.868821 ignition[1321]: INFO : Ignition 2.19.0 Apr 24 23:58:31.868821 ignition[1321]: INFO : Stage: mount Apr 24 23:58:31.868821 ignition[1321]: INFO : no configs at "/usr/lib/ignition/base.d" Apr 24 23:58:31.868821 ignition[1321]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/aws" Apr 24 23:58:31.872162 ignition[1321]: INFO : PUT http://169.254.169.254/latest/api/token: attempt #1 Apr 24 23:58:31.872968 ignition[1321]: INFO : PUT result: OK Apr 24 23:58:31.878010 ignition[1321]: INFO : mount: mount passed Apr 24 23:58:31.878010 ignition[1321]: INFO : Ignition finished successfully Apr 24 23:58:31.878200 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Apr 24 23:58:31.880241 systemd[1]: Finished ignition-mount.service - Ignition (mount). Apr 24 23:58:31.885931 systemd[1]: Starting ignition-files.service - Ignition (files)... Apr 24 23:58:31.894282 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Apr 24 23:58:31.904976 systemd-networkd[1162]: eth0: Gained IPv6LL Apr 24 23:58:31.917820 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/nvme0n1p6 scanned by mount (1334) Apr 24 23:58:31.921062 kernel: BTRFS info (device nvme0n1p6): first mount of filesystem 926930fb-88b5-4cf4-bdd1-3374ab036b7b Apr 24 23:58:31.921129 kernel: BTRFS info (device nvme0n1p6): using crc32c (crc32c-intel) checksum algorithm Apr 24 23:58:31.923534 kernel: BTRFS info (device nvme0n1p6): using free space tree Apr 24 23:58:31.928810 kernel: BTRFS info (device nvme0n1p6): enabling ssd optimizations Apr 24 23:58:31.931385 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
Apr 24 23:58:31.959373 ignition[1351]: INFO : Ignition 2.19.0 Apr 24 23:58:31.959373 ignition[1351]: INFO : Stage: files Apr 24 23:58:31.960762 ignition[1351]: INFO : no configs at "/usr/lib/ignition/base.d" Apr 24 23:58:31.960762 ignition[1351]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/aws" Apr 24 23:58:31.960762 ignition[1351]: INFO : PUT http://169.254.169.254/latest/api/token: attempt #1 Apr 24 23:58:31.962381 ignition[1351]: INFO : PUT result: OK Apr 24 23:58:31.965393 ignition[1351]: DEBUG : files: compiled without relabeling support, skipping Apr 24 23:58:31.975544 ignition[1351]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Apr 24 23:58:31.976374 ignition[1351]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Apr 24 23:58:32.023377 ignition[1351]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Apr 24 23:58:32.024407 ignition[1351]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Apr 24 23:58:32.024407 ignition[1351]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Apr 24 23:58:32.023929 unknown[1351]: wrote ssh authorized keys file for user: core Apr 24 23:58:32.035369 ignition[1351]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.3-linux-amd64.tar.gz" Apr 24 23:58:32.036399 ignition[1351]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.3-linux-amd64.tar.gz: attempt #1 Apr 24 23:58:32.128719 ignition[1351]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Apr 24 23:58:32.443634 ignition[1351]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.3-linux-amd64.tar.gz" Apr 24 23:58:32.445441 ignition[1351]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file 
"/sysroot/home/core/install.sh" Apr 24 23:58:32.445441 ignition[1351]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Apr 24 23:58:32.445441 ignition[1351]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Apr 24 23:58:32.445441 ignition[1351]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Apr 24 23:58:32.445441 ignition[1351]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Apr 24 23:58:32.445441 ignition[1351]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Apr 24 23:58:32.445441 ignition[1351]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Apr 24 23:58:32.445441 ignition[1351]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Apr 24 23:58:32.445441 ignition[1351]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Apr 24 23:58:32.445441 ignition[1351]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Apr 24 23:58:32.445441 ignition[1351]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.35.1-x86-64.raw" Apr 24 23:58:32.453729 ignition[1351]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.35.1-x86-64.raw" Apr 24 23:58:32.453729 ignition[1351]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file 
"/sysroot/opt/extensions/kubernetes/kubernetes-v1.35.1-x86-64.raw" Apr 24 23:58:32.453729 ignition[1351]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.35.1-x86-64.raw: attempt #1 Apr 24 23:58:33.031931 ignition[1351]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Apr 24 23:58:33.462608 ignition[1351]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.35.1-x86-64.raw" Apr 24 23:58:33.462608 ignition[1351]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Apr 24 23:58:33.473851 ignition[1351]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Apr 24 23:58:33.475499 ignition[1351]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Apr 24 23:58:33.475499 ignition[1351]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Apr 24 23:58:33.475499 ignition[1351]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service" Apr 24 23:58:33.475499 ignition[1351]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service" Apr 24 23:58:33.475499 ignition[1351]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json" Apr 24 23:58:33.475499 ignition[1351]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json" Apr 24 23:58:33.475499 ignition[1351]: INFO : files: files passed Apr 24 23:58:33.475499 ignition[1351]: INFO : Ignition finished successfully Apr 24 23:58:33.477572 systemd[1]: Finished ignition-files.service - Ignition (files). Apr 24 23:58:33.485980 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... 
Apr 24 23:58:33.489006 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Apr 24 23:58:33.496116 systemd[1]: ignition-quench.service: Deactivated successfully.
Apr 24 23:58:33.497000 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Apr 24 23:58:33.513021 initrd-setup-root-after-ignition[1380]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Apr 24 23:58:33.513021 initrd-setup-root-after-ignition[1380]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Apr 24 23:58:33.516750 initrd-setup-root-after-ignition[1384]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Apr 24 23:58:33.518812 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Apr 24 23:58:33.519645 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Apr 24 23:58:33.525064 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Apr 24 23:58:33.551843 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Apr 24 23:58:33.551976 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Apr 24 23:58:33.553167 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Apr 24 23:58:33.554322 systemd[1]: Reached target initrd.target - Initrd Default Target.
Apr 24 23:58:33.555275 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Apr 24 23:58:33.561015 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Apr 24 23:58:33.574653 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Apr 24 23:58:33.580030 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Apr 24 23:58:33.593219 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Apr 24 23:58:33.593919 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Apr 24 23:58:33.594930 systemd[1]: Stopped target timers.target - Timer Units.
Apr 24 23:58:33.595878 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Apr 24 23:58:33.596068 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Apr 24 23:58:33.597182 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Apr 24 23:58:33.598022 systemd[1]: Stopped target basic.target - Basic System.
Apr 24 23:58:33.598794 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Apr 24 23:58:33.599661 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Apr 24 23:58:33.600430 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Apr 24 23:58:33.601207 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Apr 24 23:58:33.601977 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Apr 24 23:58:33.602750 systemd[1]: Stopped target sysinit.target - System Initialization.
Apr 24 23:58:33.603983 systemd[1]: Stopped target local-fs.target - Local File Systems.
Apr 24 23:58:33.604728 systemd[1]: Stopped target swap.target - Swaps.
Apr 24 23:58:33.605461 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Apr 24 23:58:33.605638 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Apr 24 23:58:33.606731 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Apr 24 23:58:33.607647 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Apr 24 23:58:33.608333 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Apr 24 23:58:33.608478 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Apr 24 23:58:33.609126 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Apr 24 23:58:33.609312 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Apr 24 23:58:33.610705 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Apr 24 23:58:33.610905 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Apr 24 23:58:33.611708 systemd[1]: ignition-files.service: Deactivated successfully.
Apr 24 23:58:33.611888 systemd[1]: Stopped ignition-files.service - Ignition (files).
Apr 24 23:58:33.619141 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Apr 24 23:58:33.619974 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Apr 24 23:58:33.620216 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Apr 24 23:58:33.624082 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Apr 24 23:58:33.626420 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Apr 24 23:58:33.626680 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Apr 24 23:58:33.627676 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Apr 24 23:58:33.627908 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Apr 24 23:58:33.640796 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Apr 24 23:58:33.640932 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Apr 24 23:58:33.647840 ignition[1404]: INFO : Ignition 2.19.0
Apr 24 23:58:33.647840 ignition[1404]: INFO : Stage: umount
Apr 24 23:58:33.649943 ignition[1404]: INFO : no configs at "/usr/lib/ignition/base.d"
Apr 24 23:58:33.649943 ignition[1404]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/aws"
Apr 24 23:58:33.649943 ignition[1404]: INFO : PUT http://169.254.169.254/latest/api/token: attempt #1
Apr 24 23:58:33.653430 ignition[1404]: INFO : PUT result: OK
Apr 24 23:58:33.656638 ignition[1404]: INFO : umount: umount passed
Apr 24 23:58:33.657435 ignition[1404]: INFO : Ignition finished successfully
Apr 24 23:58:33.659698 systemd[1]: ignition-mount.service: Deactivated successfully.
Apr 24 23:58:33.660854 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Apr 24 23:58:33.662184 systemd[1]: ignition-disks.service: Deactivated successfully.
Apr 24 23:58:33.662243 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Apr 24 23:58:33.662806 systemd[1]: ignition-kargs.service: Deactivated successfully.
Apr 24 23:58:33.663898 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Apr 24 23:58:33.664438 systemd[1]: ignition-fetch.service: Deactivated successfully.
Apr 24 23:58:33.664494 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch).
Apr 24 23:58:33.665138 systemd[1]: Stopped target network.target - Network.
Apr 24 23:58:33.665694 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Apr 24 23:58:33.665756 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Apr 24 23:58:33.666384 systemd[1]: Stopped target paths.target - Path Units.
Apr 24 23:58:33.667768 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Apr 24 23:58:33.671855 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Apr 24 23:58:33.672333 systemd[1]: Stopped target slices.target - Slice Units.
Apr 24 23:58:33.673335 systemd[1]: Stopped target sockets.target - Socket Units.
Apr 24 23:58:33.674070 systemd[1]: iscsid.socket: Deactivated successfully.
Apr 24 23:58:33.674133 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Apr 24 23:58:33.674719 systemd[1]: iscsiuio.socket: Deactivated successfully.
Apr 24 23:58:33.674770 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Apr 24 23:58:33.675440 systemd[1]: ignition-setup.service: Deactivated successfully.
Apr 24 23:58:33.675511 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Apr 24 23:58:33.676099 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Apr 24 23:58:33.676159 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Apr 24 23:58:33.676916 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Apr 24 23:58:33.677568 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Apr 24 23:58:33.681855 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Apr 24 23:58:33.681861 systemd-networkd[1162]: eth0: DHCPv6 lease lost
Apr 24 23:58:33.682655 systemd[1]: systemd-resolved.service: Deactivated successfully.
Apr 24 23:58:33.682832 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Apr 24 23:58:33.685976 systemd[1]: systemd-networkd.service: Deactivated successfully.
Apr 24 23:58:33.686150 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Apr 24 23:58:33.687368 systemd[1]: sysroot-boot.service: Deactivated successfully.
Apr 24 23:58:33.687510 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Apr 24 23:58:33.689693 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Apr 24 23:58:33.689760 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Apr 24 23:58:33.690548 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Apr 24 23:58:33.690616 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Apr 24 23:58:33.695891 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Apr 24 23:58:33.696432 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Apr 24 23:58:33.696510 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Apr 24 23:58:33.697027 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Apr 24 23:58:33.697086 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Apr 24 23:58:33.697635 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Apr 24 23:58:33.697689 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Apr 24 23:58:33.700550 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Apr 24 23:58:33.700615 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Apr 24 23:58:33.701316 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Apr 24 23:58:33.715212 systemd[1]: network-cleanup.service: Deactivated successfully.
Apr 24 23:58:33.715356 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Apr 24 23:58:33.719601 systemd[1]: systemd-udevd.service: Deactivated successfully.
Apr 24 23:58:33.719829 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Apr 24 23:58:33.721018 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Apr 24 23:58:33.721077 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Apr 24 23:58:33.721846 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Apr 24 23:58:33.721895 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Apr 24 23:58:33.722519 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Apr 24 23:58:33.722581 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Apr 24 23:58:33.723775 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Apr 24 23:58:33.723849 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Apr 24 23:58:33.724909 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Apr 24 23:58:33.724967 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Apr 24 23:58:33.731970 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Apr 24 23:58:33.732597 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Apr 24 23:58:33.732676 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Apr 24 23:58:33.733366 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Apr 24 23:58:33.733430 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Apr 24 23:58:33.740605 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Apr 24 23:58:33.740747 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Apr 24 23:58:33.741809 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Apr 24 23:58:33.745961 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Apr 24 23:58:33.761638 systemd[1]: Switching root.
Apr 24 23:58:33.800981 systemd-journald[179]: Journal stopped
Apr 24 23:58:35.769952 systemd-journald[179]: Received SIGTERM from PID 1 (systemd).
Apr 24 23:58:35.770055 kernel: SELinux: policy capability network_peer_controls=1
Apr 24 23:58:35.770085 kernel: SELinux: policy capability open_perms=1
Apr 24 23:58:35.770106 kernel: SELinux: policy capability extended_socket_class=1
Apr 24 23:58:35.770134 kernel: SELinux: policy capability always_check_network=0
Apr 24 23:58:35.770154 kernel: SELinux: policy capability cgroup_seclabel=1
Apr 24 23:58:35.770175 kernel: SELinux: policy capability nnp_nosuid_transition=1
Apr 24 23:58:35.770196 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Apr 24 23:58:35.770216 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Apr 24 23:58:35.770237 kernel: audit: type=1403 audit(1777075114.426:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
Apr 24 23:58:35.770264 systemd[1]: Successfully loaded SELinux policy in 44.592ms.
Apr 24 23:58:35.770308 systemd[1]: Relabeled /dev, /dev/shm, /run, /sys/fs/cgroup in 10.819ms.
Apr 24 23:58:35.770332 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified)
Apr 24 23:58:35.770354 systemd[1]: Detected virtualization amazon.
Apr 24 23:58:35.770378 systemd[1]: Detected architecture x86-64.
Apr 24 23:58:35.770400 systemd[1]: Detected first boot.
Apr 24 23:58:35.770423 systemd[1]: Initializing machine ID from VM UUID.
Apr 24 23:58:35.770446 zram_generator::config[1446]: No configuration found.
Apr 24 23:58:35.770472 systemd[1]: Populated /etc with preset unit settings.
Apr 24 23:58:35.770492 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Apr 24 23:58:35.770516 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
Apr 24 23:58:35.770539 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Apr 24 23:58:35.770564 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Apr 24 23:58:35.770587 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Apr 24 23:58:35.770610 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Apr 24 23:58:35.770632 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Apr 24 23:58:35.770655 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Apr 24 23:58:35.770681 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Apr 24 23:58:35.770704 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Apr 24 23:58:35.770726 systemd[1]: Created slice user.slice - User and Session Slice.
Apr 24 23:58:35.770744 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Apr 24 23:58:35.770764 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Apr 24 23:58:35.774596 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Apr 24 23:58:35.774644 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Apr 24 23:58:35.774671 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Apr 24 23:58:35.774709 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Apr 24 23:58:35.774736 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0...
Apr 24 23:58:35.774760 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Apr 24 23:58:35.774794 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Apr 24 23:58:35.775563 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Apr 24 23:58:35.775592 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Apr 24 23:58:35.775617 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Apr 24 23:58:35.775644 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Apr 24 23:58:35.775675 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Apr 24 23:58:35.775701 systemd[1]: Reached target slices.target - Slice Units.
Apr 24 23:58:35.775725 systemd[1]: Reached target swap.target - Swaps.
Apr 24 23:58:35.775749 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Apr 24 23:58:35.775775 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Apr 24 23:58:35.776432 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Apr 24 23:58:35.776463 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Apr 24 23:58:35.776490 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Apr 24 23:58:35.776514 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Apr 24 23:58:35.776547 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Apr 24 23:58:35.776573 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Apr 24 23:58:35.776598 systemd[1]: Mounting media.mount - External Media Directory...
Apr 24 23:58:35.776623 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Apr 24 23:58:35.776648 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Apr 24 23:58:35.776673 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Apr 24 23:58:35.776700 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Apr 24 23:58:35.776728 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Apr 24 23:58:35.776751 systemd[1]: Reached target machines.target - Containers.
Apr 24 23:58:35.776781 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Apr 24 23:58:35.778895 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Apr 24 23:58:35.778917 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Apr 24 23:58:35.778944 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Apr 24 23:58:35.778969 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Apr 24 23:58:35.778995 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Apr 24 23:58:35.779028 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Apr 24 23:58:35.779053 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Apr 24 23:58:35.779083 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Apr 24 23:58:35.779111 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Apr 24 23:58:35.779136 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Apr 24 23:58:35.779161 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Apr 24 23:58:35.779186 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Apr 24 23:58:35.779210 systemd[1]: Stopped systemd-fsck-usr.service.
Apr 24 23:58:35.779235 systemd[1]: Starting systemd-journald.service - Journal Service...
Apr 24 23:58:35.779260 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Apr 24 23:58:35.779286 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Apr 24 23:58:35.779316 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Apr 24 23:58:35.779340 kernel: fuse: init (API version 7.39)
Apr 24 23:58:35.779365 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Apr 24 23:58:35.779390 systemd[1]: verity-setup.service: Deactivated successfully.
Apr 24 23:58:35.779417 systemd[1]: Stopped verity-setup.service.
Apr 24 23:58:35.779443 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Apr 24 23:58:35.779468 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Apr 24 23:58:35.779493 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Apr 24 23:58:35.779517 systemd[1]: Mounted media.mount - External Media Directory.
Apr 24 23:58:35.779547 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Apr 24 23:58:35.779572 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Apr 24 23:58:35.779597 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Apr 24 23:58:35.779621 kernel: loop: module loaded
Apr 24 23:58:35.779649 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Apr 24 23:58:35.779674 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Apr 24 23:58:35.779705 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Apr 24 23:58:35.779739 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Apr 24 23:58:35.779765 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Apr 24 23:58:35.780707 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Apr 24 23:58:35.780746 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Apr 24 23:58:35.780774 kernel: ACPI: bus type drm_connector registered
Apr 24 23:58:35.780812 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Apr 24 23:58:35.780846 systemd[1]: modprobe@drm.service: Deactivated successfully.
Apr 24 23:58:35.780866 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Apr 24 23:58:35.780919 systemd-journald[1538]: Collecting audit messages is disabled.
Apr 24 23:58:35.780968 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Apr 24 23:58:35.780998 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Apr 24 23:58:35.781025 systemd[1]: modprobe@loop.service: Deactivated successfully.
Apr 24 23:58:35.781050 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Apr 24 23:58:35.781077 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Apr 24 23:58:35.781103 systemd-journald[1538]: Journal started
Apr 24 23:58:35.781151 systemd-journald[1538]: Runtime Journal (/run/log/journal/ec2179847255351668174b73631b2a2c) is 4.7M, max 38.2M, 33.4M free.
Apr 24 23:58:35.342574 systemd[1]: Queued start job for default target multi-user.target.
Apr 24 23:58:35.384601 systemd[1]: Unnecessary job was removed for dev-nvme0n1p6.device - /dev/nvme0n1p6.
Apr 24 23:58:35.385027 systemd[1]: systemd-journald.service: Deactivated successfully.
Apr 24 23:58:35.786836 systemd[1]: Started systemd-journald.service - Journal Service.
Apr 24 23:58:35.786388 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Apr 24 23:58:35.789132 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Apr 24 23:58:35.807403 systemd[1]: Reached target network-pre.target - Preparation for Network.
Apr 24 23:58:35.815939 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Apr 24 23:58:35.828929 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Apr 24 23:58:35.831687 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Apr 24 23:58:35.831755 systemd[1]: Reached target local-fs.target - Local File Systems.
Apr 24 23:58:35.836317 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management (Varlink).
Apr 24 23:58:35.850986 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
Apr 24 23:58:35.858977 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Apr 24 23:58:35.859642 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Apr 24 23:58:35.863077 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Apr 24 23:58:35.868188 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Apr 24 23:58:35.868930 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Apr 24 23:58:35.876101 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Apr 24 23:58:35.877073 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Apr 24 23:58:35.879898 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Apr 24 23:58:35.897026 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Apr 24 23:58:35.900998 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Apr 24 23:58:35.905189 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Apr 24 23:58:35.906259 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Apr 24 23:58:35.907253 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Apr 24 23:58:35.909171 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown.
Apr 24 23:58:35.910229 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Apr 24 23:58:35.914545 systemd-journald[1538]: Time spent on flushing to /var/log/journal/ec2179847255351668174b73631b2a2c is 129.557ms for 981 entries.
Apr 24 23:58:35.914545 systemd-journald[1538]: System Journal (/var/log/journal/ec2179847255351668174b73631b2a2c) is 8.0M, max 195.6M, 187.6M free.
Apr 24 23:58:36.061089 systemd-journald[1538]: Received client request to flush runtime journal.
Apr 24 23:58:36.065196 kernel: loop0: detected capacity change from 0 to 142488
Apr 24 23:58:36.065252 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
Apr 24 23:58:35.914496 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Apr 24 23:58:35.925646 systemd[1]: Starting systemd-machine-id-commit.service - Commit a transient machine-id on disk...
Apr 24 23:58:35.931039 systemd[1]: Starting systemd-udev-settle.service - Wait for udev To Complete Device Initialization...
Apr 24 23:58:35.997675 udevadm[1583]: systemd-udev-settle.service is deprecated. Please fix lvm2-activation.service, lvm2-activation-early.service not to pull it in.
Apr 24 23:58:36.034532 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Apr 24 23:58:36.074520 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Apr 24 23:58:36.077418 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Apr 24 23:58:36.083410 systemd[1]: Finished systemd-machine-id-commit.service - Commit a transient machine-id on disk.
Apr 24 23:58:36.099329 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Apr 24 23:58:36.106039 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Apr 24 23:58:36.109810 kernel: loop1: detected capacity change from 0 to 61336
Apr 24 23:58:36.157049 systemd-tmpfiles[1597]: ACLs are not supported, ignoring.
Apr 24 23:58:36.158023 systemd-tmpfiles[1597]: ACLs are not supported, ignoring.
Apr 24 23:58:36.172306 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Apr 24 23:58:36.210965 kernel: loop2: detected capacity change from 0 to 140768
Apr 24 23:58:36.312817 kernel: loop3: detected capacity change from 0 to 217752
Apr 24 23:58:36.434827 kernel: loop4: detected capacity change from 0 to 142488
Apr 24 23:58:36.490094 kernel: loop5: detected capacity change from 0 to 61336
Apr 24 23:58:36.511828 kernel: loop6: detected capacity change from 0 to 140768
Apr 24 23:58:36.547817 kernel: loop7: detected capacity change from 0 to 217752
Apr 24 23:58:36.574071 (sd-merge)[1603]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-ami'.
Apr 24 23:58:36.574781 (sd-merge)[1603]: Merged extensions into '/usr'.
Apr 24 23:58:36.585900 systemd[1]: Reloading requested from client PID 1575 ('systemd-sysext') (unit systemd-sysext.service)...
Apr 24 23:58:36.586109 systemd[1]: Reloading...
Apr 24 23:58:36.708818 zram_generator::config[1629]: No configuration found.
Apr 24 23:58:36.909864 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Apr 24 23:58:36.984974 systemd[1]: Reloading finished in 398 ms.
Apr 24 23:58:37.014276 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Apr 24 23:58:37.015334 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Apr 24 23:58:37.029019 systemd[1]: Starting ensure-sysext.service...
Apr 24 23:58:37.035877 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Apr 24 23:58:37.039246 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Apr 24 23:58:37.055975 systemd[1]: Reloading requested from client PID 1681 ('systemctl') (unit ensure-sysext.service)...
Apr 24 23:58:37.055994 systemd[1]: Reloading...
Apr 24 23:58:37.068404 systemd-tmpfiles[1682]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Apr 24 23:58:37.069417 systemd-tmpfiles[1682]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Apr 24 23:58:37.071000 systemd-tmpfiles[1682]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Apr 24 23:58:37.071581 systemd-tmpfiles[1682]: ACLs are not supported, ignoring.
Apr 24 23:58:37.071833 systemd-tmpfiles[1682]: ACLs are not supported, ignoring.
Apr 24 23:58:37.081653 systemd-tmpfiles[1682]: Detected autofs mount point /boot during canonicalization of boot.
Apr 24 23:58:37.081673 systemd-tmpfiles[1682]: Skipping /boot
Apr 24 23:58:37.111119 systemd-tmpfiles[1682]: Detected autofs mount point /boot during canonicalization of boot.
Apr 24 23:58:37.111139 systemd-tmpfiles[1682]: Skipping /boot
Apr 24 23:58:37.118095 systemd-udevd[1683]: Using default interface naming scheme 'v255'.
Apr 24 23:58:37.194918 zram_generator::config[1713]: No configuration found.
Apr 24 23:58:37.302606 (udev-worker)[1724]: Network interface NamePolicy= disabled on kernel command line.
Apr 24 23:58:37.459882 kernel: BTRFS warning: duplicate device /dev/nvme0n1p3 devid 1 generation 31 scanned by (udev-worker) (1715)
Apr 24 23:58:37.478809 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input2
Apr 24 23:58:37.488812 kernel: ACPI: button: Power Button [PWRF]
Apr 24 23:58:37.492806 kernel: input: Sleep Button as /devices/LNXSYSTM:00/LNXSLPBN:00/input/input3
Apr 24 23:58:37.498872 kernel: ACPI: button: Sleep Button [SLPF]
Apr 24 23:58:37.509019 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Apr 24 23:58:37.534819 ldconfig[1570]: /sbin/ldconfig: /lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
Apr 24 23:58:37.557809 kernel: piix4_smbus 0000:00:01.3: SMBus base address uninitialized - upgrade BIOS or use force_addr=0xaddr
Apr 24 23:58:37.601838 kernel: input: ImPS/2 Generic Wheel Mouse as /devices/platform/i8042/serio1/input/input4
Apr 24 23:58:37.670312 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped.
Apr 24 23:58:37.671036 systemd[1]: Reloading finished in 614 ms.
Apr 24 23:58:37.690731 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Apr 24 23:58:37.693224 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
Apr 24 23:58:37.696171 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Apr 24 23:58:37.768809 kernel: mousedev: PS/2 mouse device common for all mice
Apr 24 23:58:37.800065 systemd[1]: Finished systemd-udev-settle.service - Wait for udev To Complete Device Initialization.
Apr 24 23:58:37.807435 systemd[1]: Finished ensure-sysext.service.
Apr 24 23:58:37.814804 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Amazon Elastic Block Store OEM.
Apr 24 23:58:37.815506 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Apr 24 23:58:37.824121 systemd[1]: Starting audit-rules.service - Load Security Auditing Rules...
Apr 24 23:58:37.832192 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Apr 24 23:58:37.833113 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Apr 24 23:58:37.835025 systemd[1]: Starting lvm2-activation-early.service - Activation of LVM2 logical volumes...
Apr 24 23:58:37.838383 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Apr 24 23:58:37.843104 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Apr 24 23:58:37.858989 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Apr 24 23:58:37.865127 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Apr 24 23:58:37.866094 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Apr 24 23:58:37.868024 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM...
Apr 24 23:58:37.875286 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Apr 24 23:58:37.881580 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Apr 24 23:58:37.893180 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Apr 24 23:58:37.893917 systemd[1]: Reached target time-set.target - System Time Set.
Apr 24 23:58:37.901193 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Apr 24 23:58:37.905144 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Apr 24 23:58:37.908813 lvm[1878]: WARNING: Failed to connect to lvmetad. Falling back to device scanning.
Apr 24 23:58:37.906889 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Apr 24 23:58:37.908936 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Apr 24 23:58:37.910885 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Apr 24 23:58:37.911962 systemd[1]: modprobe@drm.service: Deactivated successfully.
Apr 24 23:58:37.912175 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Apr 24 23:58:37.913589 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Apr 24 23:58:37.914839 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Apr 24 23:58:37.924265 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Apr 24 23:58:37.944031 systemd[1]: modprobe@loop.service: Deactivated successfully.
Apr 24 23:58:37.944278 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Apr 24 23:58:37.945419 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Apr 24 23:58:37.951952 systemd[1]: Finished lvm2-activation-early.service - Activation of LVM2 logical volumes.
Apr 24 23:58:37.953095 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Apr 24 23:58:37.958943 systemd[1]: Starting lvm2-activation.service - Activation of LVM2 logical volumes...
Apr 24 23:58:37.963018 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Apr 24 23:58:37.990039 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
Apr 24 23:58:38.000695 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM.
Apr 24 23:58:38.014813 lvm[1905]: WARNING: Failed to connect to lvmetad. Falling back to device scanning.
Apr 24 23:58:38.054564 augenrules[1915]: No rules
Apr 24 23:58:38.058111 systemd[1]: Started systemd-userdbd.service - User Database Manager.
Apr 24 23:58:38.061840 systemd[1]: Finished audit-rules.service - Load Security Auditing Rules.
Apr 24 23:58:38.062963 systemd[1]: Finished lvm2-activation.service - Activation of LVM2 logical volumes.
Apr 24 23:58:38.066538 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
Apr 24 23:58:38.075079 systemd[1]: Starting systemd-update-done.service - Update is Completed...
Apr 24 23:58:38.094275 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
Apr 24 23:58:38.095802 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Apr 24 23:58:38.109891 systemd[1]: Finished systemd-update-done.service - Update is Completed.
Apr 24 23:58:38.141851 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Apr 24 23:58:38.181636 systemd-networkd[1892]: lo: Link UP
Apr 24 23:58:38.181653 systemd-networkd[1892]: lo: Gained carrier
Apr 24 23:58:38.183661 systemd-networkd[1892]: Enumeration completed
Apr 24 23:58:38.183841 systemd[1]: Started systemd-networkd.service - Network Configuration.
Apr 24 23:58:38.185823 systemd-networkd[1892]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Apr 24 23:58:38.186210 systemd-resolved[1893]: Positive Trust Anchors:
Apr 24 23:58:38.186522 systemd-resolved[1893]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Apr 24 23:58:38.186645 systemd-resolved[1893]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Apr 24 23:58:38.186771 systemd-networkd[1892]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Apr 24 23:58:38.189600 systemd-networkd[1892]: eth0: Link UP
Apr 24 23:58:38.189876 systemd-networkd[1892]: eth0: Gained carrier
Apr 24 23:58:38.189909 systemd-networkd[1892]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Apr 24 23:58:38.192302 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
Apr 24 23:58:38.197404 systemd-resolved[1893]: Defaulting to hostname 'linux'.
Apr 24 23:58:38.199773 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Apr 24 23:58:38.199853 systemd-networkd[1892]: eth0: DHCPv4 address 172.31.31.110/20, gateway 172.31.16.1 acquired from 172.31.16.1
Apr 24 23:58:38.200500 systemd[1]: Reached target network.target - Network.
Apr 24 23:58:38.201094 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Apr 24 23:58:38.201646 systemd[1]: Reached target sysinit.target - System Initialization.
Apr 24 23:58:38.203318 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
Apr 24 23:58:38.203916 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
Apr 24 23:58:38.204622 systemd[1]: Started logrotate.timer - Daily rotation of log files.
Apr 24 23:58:38.205291 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
Apr 24 23:58:38.205855 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
Apr 24 23:58:38.206293 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
Apr 24 23:58:38.206340 systemd[1]: Reached target paths.target - Path Units.
Apr 24 23:58:38.206706 systemd[1]: Reached target timers.target - Timer Units.
Apr 24 23:58:38.208137 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket.
Apr 24 23:58:38.210005 systemd[1]: Starting docker.socket - Docker Socket for the API...
Apr 24 23:58:38.222007 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket.
Apr 24 23:58:38.223258 systemd[1]: Listening on docker.socket - Docker Socket for the API.
Apr 24 23:58:38.223837 systemd[1]: Reached target sockets.target - Socket Units.
Apr 24 23:58:38.224244 systemd[1]: Reached target basic.target - Basic System.
Apr 24 23:58:38.224678 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met.
Apr 24 23:58:38.224721 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met.
Apr 24 23:58:38.225876 systemd[1]: Starting containerd.service - containerd container runtime...
Apr 24 23:58:38.229975 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent...
Apr 24 23:58:38.237008 systemd[1]: Starting dbus.service - D-Bus System Message Bus...
Apr 24 23:58:38.238769 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit...
Apr 24 23:58:38.253221 systemd[1]: Starting extend-filesystems.service - Extend Filesystems...
Apr 24 23:58:38.254110 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment).
Apr 24 23:58:38.265547 jq[1942]: false
Apr 24 23:58:38.267063 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd...
Apr 24 23:58:38.280494 systemd[1]: Started ntpd.service - Network Time Service.
Apr 24 23:58:38.295420 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin...
Apr 24 23:58:38.298628 systemd[1]: Starting setup-oem.service - Setup OEM...
Apr 24 23:58:38.307343 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline...
Apr 24 23:58:38.319993 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys...
Apr 24 23:58:38.321858 extend-filesystems[1943]: Found loop4
Apr 24 23:58:38.324295 extend-filesystems[1943]: Found loop5
Apr 24 23:58:38.325062 extend-filesystems[1943]: Found loop6
Apr 24 23:58:38.325663 extend-filesystems[1943]: Found loop7
Apr 24 23:58:38.326231 extend-filesystems[1943]: Found nvme0n1
Apr 24 23:58:38.327338 extend-filesystems[1943]: Found nvme0n1p1
Apr 24 23:58:38.327338 extend-filesystems[1943]: Found nvme0n1p2
Apr 24 23:58:38.327338 extend-filesystems[1943]: Found nvme0n1p3
Apr 24 23:58:38.327338 extend-filesystems[1943]: Found usr
Apr 24 23:58:38.327338 extend-filesystems[1943]: Found nvme0n1p4
Apr 24 23:58:38.327338 extend-filesystems[1943]: Found nvme0n1p6
Apr 24 23:58:38.327338 extend-filesystems[1943]: Found nvme0n1p7
Apr 24 23:58:38.327338 extend-filesystems[1943]: Found nvme0n1p9
Apr 24 23:58:38.327338 extend-filesystems[1943]: Checking size of /dev/nvme0n1p9
Apr 24 23:58:38.332011 systemd[1]: Starting systemd-logind.service - User Login Management...
Apr 24 23:58:38.339327 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0).
Apr 24 23:58:38.341114 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details.
Apr 24 23:58:38.350311 systemd[1]: Starting update-engine.service - Update Engine...
Apr 24 23:58:38.353316 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition...
Apr 24 23:58:38.364150 ntpd[1945]: ntpd 4.2.8p17@1.4004-o Fri Apr 24 21:46:02 UTC 2026 (1): Starting
Apr 24 23:58:38.364559 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'.
Apr 24 23:58:38.365955 ntpd[1945]: 24 Apr 23:58:38 ntpd[1945]: ntpd 4.2.8p17@1.4004-o Fri Apr 24 21:46:02 UTC 2026 (1): Starting
Apr 24 23:58:38.365955 ntpd[1945]: 24 Apr 23:58:38 ntpd[1945]: Command line: /usr/sbin/ntpd -g -n -u ntp:ntp
Apr 24 23:58:38.365955 ntpd[1945]: 24 Apr 23:58:38 ntpd[1945]: ----------------------------------------------------
Apr 24 23:58:38.365955 ntpd[1945]: 24 Apr 23:58:38 ntpd[1945]: ntp-4 is maintained by Network Time Foundation,
Apr 24 23:58:38.365955 ntpd[1945]: 24 Apr 23:58:38 ntpd[1945]: Inc. (NTF), a non-profit 501(c)(3) public-benefit
Apr 24 23:58:38.365955 ntpd[1945]: 24 Apr 23:58:38 ntpd[1945]: corporation. Support and training for ntp-4 are
Apr 24 23:58:38.365955 ntpd[1945]: 24 Apr 23:58:38 ntpd[1945]: available at https://www.nwtime.org/support
Apr 24 23:58:38.365955 ntpd[1945]: 24 Apr 23:58:38 ntpd[1945]: ----------------------------------------------------
Apr 24 23:58:38.364180 ntpd[1945]: Command line: /usr/sbin/ntpd -g -n -u ntp:ntp
Apr 24 23:58:38.364817 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped.
Apr 24 23:58:38.364191 ntpd[1945]: ----------------------------------------------------
Apr 24 23:58:38.364201 ntpd[1945]: ntp-4 is maintained by Network Time Foundation,
Apr 24 23:58:38.364212 ntpd[1945]: Inc. (NTF), a non-profit 501(c)(3) public-benefit
Apr 24 23:58:38.364222 ntpd[1945]: corporation. Support and training for ntp-4 are
Apr 24 23:58:38.364232 ntpd[1945]: available at https://www.nwtime.org/support
Apr 24 23:58:38.364243 ntpd[1945]: ----------------------------------------------------
Apr 24 23:58:38.371047 ntpd[1945]: proto: precision = 0.068 usec (-24)
Apr 24 23:58:38.373940 ntpd[1945]: 24 Apr 23:58:38 ntpd[1945]: proto: precision = 0.068 usec (-24)
Apr 24 23:58:38.373940 ntpd[1945]: 24 Apr 23:58:38 ntpd[1945]: basedate set to 2026-04-12
Apr 24 23:58:38.373940 ntpd[1945]: 24 Apr 23:58:38 ntpd[1945]: gps base set to 2026-04-12 (week 2414)
Apr 24 23:58:38.372078 ntpd[1945]: basedate set to 2026-04-12
Apr 24 23:58:38.372099 ntpd[1945]: gps base set to 2026-04-12 (week 2414)
Apr 24 23:58:38.378843 ntpd[1945]: Listen and drop on 0 v6wildcard [::]:123
Apr 24 23:58:38.378906 ntpd[1945]: Listen and drop on 1 v4wildcard 0.0.0.0:123
Apr 24 23:58:38.379049 ntpd[1945]: 24 Apr 23:58:38 ntpd[1945]: Listen and drop on 0 v6wildcard [::]:123
Apr 24 23:58:38.379049 ntpd[1945]: 24 Apr 23:58:38 ntpd[1945]: Listen and drop on 1 v4wildcard 0.0.0.0:123
Apr 24 23:58:38.381309 ntpd[1945]: Listen normally on 2 lo 127.0.0.1:123
Apr 24 23:58:38.381366 ntpd[1945]: Listen normally on 3 eth0 172.31.31.110:123
Apr 24 23:58:38.381475 ntpd[1945]: 24 Apr 23:58:38 ntpd[1945]: Listen normally on 2 lo 127.0.0.1:123
Apr 24 23:58:38.381475 ntpd[1945]: 24 Apr 23:58:38 ntpd[1945]: Listen normally on 3 eth0 172.31.31.110:123
Apr 24 23:58:38.381475 ntpd[1945]: 24 Apr 23:58:38 ntpd[1945]: Listen normally on 4 lo [::1]:123
Apr 24 23:58:38.381475 ntpd[1945]: 24 Apr 23:58:38 ntpd[1945]: bind(21) AF_INET6 fe80::4d8:cfff:fe33:ccdd%2#123 flags 0x11 failed: Cannot assign requested address
Apr 24 23:58:38.381409 ntpd[1945]: Listen normally on 4 lo [::1]:123
Apr 24 23:58:38.381699 ntpd[1945]: 24 Apr 23:58:38 ntpd[1945]: unable to create socket on eth0 (5) for fe80::4d8:cfff:fe33:ccdd%2#123
Apr 24 23:58:38.381699 ntpd[1945]: 24 Apr 23:58:38 ntpd[1945]: failed to init interface for address fe80::4d8:cfff:fe33:ccdd%2
Apr 24 23:58:38.381699 ntpd[1945]: 24 Apr 23:58:38 ntpd[1945]: Listening on routing socket on fd #21 for interface updates
Apr 24 23:58:38.381463 ntpd[1945]: bind(21) AF_INET6 fe80::4d8:cfff:fe33:ccdd%2#123 flags 0x11 failed: Cannot assign requested address
Apr 24 23:58:38.381485 ntpd[1945]: unable to create socket on eth0 (5) for fe80::4d8:cfff:fe33:ccdd%2#123
Apr 24 23:58:38.381501 ntpd[1945]: failed to init interface for address fe80::4d8:cfff:fe33:ccdd%2
Apr 24 23:58:38.381536 ntpd[1945]: Listening on routing socket on fd #21 for interface updates
Apr 24 23:58:38.399451 ntpd[1945]: kernel reports TIME_ERROR: 0x41: Clock Unsynchronized
Apr 24 23:58:38.399948 ntpd[1945]: 24 Apr 23:58:38 ntpd[1945]: kernel reports TIME_ERROR: 0x41: Clock Unsynchronized
Apr 24 23:58:38.399948 ntpd[1945]: 24 Apr 23:58:38 ntpd[1945]: kernel reports TIME_ERROR: 0x41: Clock Unsynchronized
Apr 24 23:58:38.399492 ntpd[1945]: kernel reports TIME_ERROR: 0x41: Clock Unsynchronized
Apr 24 23:58:38.430853 update_engine[1956]: I20260424 23:58:38.425104 1956 main.cc:92] Flatcar Update Engine starting
Apr 24 23:58:38.426255 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully.
Apr 24 23:58:38.435928 dbus-daemon[1941]: [system] SELinux support is enabled
Apr 24 23:58:38.426974 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline.
Apr 24 23:58:38.436871 systemd[1]: Started dbus.service - D-Bus System Message Bus.
Apr 24 23:58:38.437241 (ntainerd)[1974]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR
Apr 24 23:58:38.456665 jq[1958]: true
Apr 24 23:58:38.440688 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml).
Apr 24 23:58:38.440730 systemd[1]: Reached target system-config.target - Load system-provided cloud configs.
Apr 24 23:58:38.442457 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url).
Apr 24 23:58:38.442486 systemd[1]: Reached target user-config.target - Load user-provided cloud configs.
Apr 24 23:58:38.463159 dbus-daemon[1941]: [system] Activating via systemd: service name='org.freedesktop.hostname1' unit='dbus-org.freedesktop.hostname1.service' requested by ':1.2' (uid=244 pid=1892 comm="/usr/lib/systemd/systemd-networkd" label="system_u:system_r:kernel_t:s0")
Apr 24 23:58:38.470082 update_engine[1956]: I20260424 23:58:38.469868 1956 update_check_scheduler.cc:74] Next update check in 10m20s
Apr 24 23:58:38.477973 systemd[1]: Starting systemd-hostnamed.service - Hostname Service...
Apr 24 23:58:38.482811 extend-filesystems[1943]: Resized partition /dev/nvme0n1p9
Apr 24 23:58:38.484014 systemd[1]: Started update-engine.service - Update Engine.
Apr 24 23:58:38.493050 systemd[1]: Started locksmithd.service - Cluster reboot manager.
Apr 24 23:58:38.500470 extend-filesystems[1985]: resize2fs 1.47.1 (20-May-2024)
Apr 24 23:58:38.510807 kernel: EXT4-fs (nvme0n1p9): resizing filesystem from 553472 to 3587067 blocks
Apr 24 23:58:38.512943 tar[1960]: linux-amd64/LICENSE
Apr 24 23:58:38.514053 systemd[1]: motdgen.service: Deactivated successfully.
Apr 24 23:58:38.516518 tar[1960]: linux-amd64/helm
Apr 24 23:58:38.514317 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd.
Apr 24 23:58:38.530115 jq[1980]: true
Apr 24 23:58:38.580147 systemd[1]: Finished setup-oem.service - Setup OEM.
Apr 24 23:58:38.600835 systemd-logind[1954]: Watching system buttons on /dev/input/event1 (Power Button)
Apr 24 23:58:38.600868 systemd-logind[1954]: Watching system buttons on /dev/input/event2 (Sleep Button)
Apr 24 23:58:38.600893 systemd-logind[1954]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard)
Apr 24 23:58:38.601619 systemd-logind[1954]: New seat seat0.
Apr 24 23:58:38.604901 systemd[1]: Started systemd-logind.service - User Login Management.
Apr 24 23:58:38.691806 coreos-metadata[1940]: Apr 24 23:58:38.688 INFO Putting http://169.254.169.254/latest/api/token: Attempt #1
Apr 24 23:58:38.693656 coreos-metadata[1940]: Apr 24 23:58:38.692 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/instance-id: Attempt #1
Apr 24 23:58:38.695259 coreos-metadata[1940]: Apr 24 23:58:38.694 INFO Fetch successful
Apr 24 23:58:38.695259 coreos-metadata[1940]: Apr 24 23:58:38.695 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/instance-type: Attempt #1
Apr 24 23:58:38.697046 coreos-metadata[1940]: Apr 24 23:58:38.697 INFO Fetch successful
Apr 24 23:58:38.697136 coreos-metadata[1940]: Apr 24 23:58:38.697 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/local-ipv4: Attempt #1
Apr 24 23:58:38.698753 coreos-metadata[1940]: Apr 24 23:58:38.698 INFO Fetch successful
Apr 24 23:58:38.698753 coreos-metadata[1940]: Apr 24 23:58:38.698 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/public-ipv4: Attempt #1
Apr 24 23:58:38.698911 coreos-metadata[1940]: Apr 24 23:58:38.698 INFO Fetch successful
Apr 24 23:58:38.698911 coreos-metadata[1940]: Apr 24 23:58:38.698 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/ipv6: Attempt #1
Apr 24 23:58:38.699382 coreos-metadata[1940]: Apr 24 23:58:38.699 INFO Fetch failed with 404: resource not found
Apr 24 23:58:38.699382 coreos-metadata[1940]: Apr 24 23:58:38.699 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/placement/availability-zone: Attempt #1
Apr 24 23:58:38.701025 coreos-metadata[1940]: Apr 24 23:58:38.700 INFO Fetch successful
Apr 24 23:58:38.701025 coreos-metadata[1940]: Apr 24 23:58:38.700 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/placement/availability-zone-id: Attempt #1
Apr 24 23:58:38.705819 kernel: EXT4-fs (nvme0n1p9): resized filesystem to 3587067
Apr 24 23:58:38.711824 coreos-metadata[1940]: Apr 24 23:58:38.705 INFO Fetch successful
Apr 24 23:58:38.711824 coreos-metadata[1940]: Apr 24 23:58:38.705 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/hostname: Attempt #1
Apr 24 23:58:38.711824 coreos-metadata[1940]: Apr 24 23:58:38.706 INFO Fetch successful
Apr 24 23:58:38.711824 coreos-metadata[1940]: Apr 24 23:58:38.706 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/public-hostname: Attempt #1
Apr 24 23:58:38.711824 coreos-metadata[1940]: Apr 24 23:58:38.707 INFO Fetch successful
Apr 24 23:58:38.711824 coreos-metadata[1940]: Apr 24 23:58:38.707 INFO Fetching http://169.254.169.254/2021-01-03/dynamic/instance-identity/document: Attempt #1
Apr 24 23:58:38.711824 coreos-metadata[1940]: Apr 24 23:58:38.707 INFO Fetch successful
Apr 24 23:58:38.723732 dbus-daemon[1941]: [system] Successfully activated service 'org.freedesktop.hostname1'
Apr 24 23:58:38.724406 dbus-daemon[1941]: [system] Activating via systemd: service name='org.freedesktop.PolicyKit1' unit='polkit.service' requested by ':1.5' (uid=0 pid=1981 comm="/usr/lib/systemd/systemd-hostnamed" label="system_u:system_r:kernel_t:s0")
Apr 24 23:58:38.724869 extend-filesystems[1985]: Filesystem at /dev/nvme0n1p9 is mounted on /; on-line resizing required
Apr 24 23:58:38.724869 extend-filesystems[1985]: old_desc_blocks = 1, new_desc_blocks = 2
Apr 24 23:58:38.724869 extend-filesystems[1985]: The filesystem on /dev/nvme0n1p9 is now 3587067 (4k) blocks long.
Apr 24 23:58:38.731093 extend-filesystems[1943]: Resized filesystem in /dev/nvme0n1p9
Apr 24 23:58:38.725069 systemd[1]: Started systemd-hostnamed.service - Hostname Service.
Apr 24 23:58:38.735655 systemd[1]: extend-filesystems.service: Deactivated successfully.
Apr 24 23:58:38.738170 systemd[1]: Finished extend-filesystems.service - Extend Filesystems.
Apr 24 23:58:38.754842 systemd[1]: Starting polkit.service - Authorization Manager...
Apr 24 23:58:38.776516 kernel: BTRFS warning: duplicate device /dev/nvme0n1p3 devid 1 generation 31 scanned by (udev-worker) (1722)
Apr 24 23:58:38.812381 bash[2021]: Updated "/home/core/.ssh/authorized_keys"
Apr 24 23:58:38.821885 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition.
Apr 24 23:58:38.836918 systemd[1]: Starting sshkeys.service...
Apr 24 23:58:38.869943 polkitd[2020]: Started polkitd version 121
Apr 24 23:58:38.874136 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent.
Apr 24 23:58:38.875337 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met.
Apr 24 23:58:38.881661 sshd_keygen[1991]: ssh-keygen: generating new host keys: RSA ECDSA ED25519
Apr 24 23:58:38.909907 polkitd[2020]: Loading rules from directory /etc/polkit-1/rules.d
Apr 24 23:58:38.910005 polkitd[2020]: Loading rules from directory /usr/share/polkit-1/rules.d
Apr 24 23:58:38.912510 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys.
Apr 24 23:58:38.919362 polkitd[2020]: Finished loading, compiling and executing 2 rules
Apr 24 23:58:38.920167 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)...
Apr 24 23:58:38.925133 dbus-daemon[1941]: [system] Successfully activated service 'org.freedesktop.PolicyKit1'
Apr 24 23:58:38.925294 systemd[1]: Started polkit.service - Authorization Manager.
Apr 24 23:58:38.926760 polkitd[2020]: Acquired the name org.freedesktop.PolicyKit1 on the system bus
Apr 24 23:58:38.972222 systemd-hostnamed[1981]: Hostname set to (transient)
Apr 24 23:58:38.972366 systemd-resolved[1893]: System hostname changed to 'ip-172-31-31-110'.
Apr 24 23:58:39.016353 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys.
Apr 24 23:58:39.037123 systemd[1]: Starting issuegen.service - Generate /run/issue...
Apr 24 23:58:39.042539 locksmithd[1986]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot"
Apr 24 23:58:39.062734 systemd[1]: issuegen.service: Deactivated successfully.
Apr 24 23:58:39.063019 systemd[1]: Finished issuegen.service - Generate /run/issue.
Apr 24 23:58:39.074750 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions...
Apr 24 23:58:39.141455 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions.
Apr 24 23:58:39.152448 systemd[1]: Started getty@tty1.service - Getty on tty1.
Apr 24 23:58:39.163352 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0.
Apr 24 23:58:39.164796 systemd[1]: Reached target getty.target - Login Prompts.
Apr 24 23:58:39.209035 coreos-metadata[2068]: Apr 24 23:58:39.208 INFO Putting http://169.254.169.254/latest/api/token: Attempt #1
Apr 24 23:58:39.216808 coreos-metadata[2068]: Apr 24 23:58:39.216 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/public-keys: Attempt #1
Apr 24 23:58:39.220540 coreos-metadata[2068]: Apr 24 23:58:39.220 INFO Fetch successful
Apr 24 23:58:39.220664 coreos-metadata[2068]: Apr 24 23:58:39.220 INFO Fetching http://169.254.169.254/2021-01-03/meta-data/public-keys/0/openssh-key: Attempt #1
Apr 24 23:58:39.226167 coreos-metadata[2068]: Apr 24 23:58:39.226 INFO Fetch successful
Apr 24 23:58:39.228887 unknown[2068]: wrote ssh authorized keys file for user: core
Apr 24 23:58:39.304480 update-ssh-keys[2148]: Updated "/home/core/.ssh/authorized_keys"
Apr 24 23:58:39.306722 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys).
Apr 24 23:58:39.310931 systemd[1]: Finished sshkeys.service.
Apr 24 23:58:39.350635 containerd[1974]: time="2026-04-24T23:58:39.350533557Z" level=info msg="starting containerd" revision=174e0d1785eeda18dc2beba45e1d5a188771636b version=v1.7.21
Apr 24 23:58:39.365062 ntpd[1945]: bind(24) AF_INET6 fe80::4d8:cfff:fe33:ccdd%2#123 flags 0x11 failed: Cannot assign requested address
Apr 24 23:58:39.365113 ntpd[1945]: unable to create socket on eth0 (6) for fe80::4d8:cfff:fe33:ccdd%2#123
Apr 24 23:58:39.365491 ntpd[1945]: 24 Apr 23:58:39 ntpd[1945]: bind(24) AF_INET6 fe80::4d8:cfff:fe33:ccdd%2#123 flags 0x11 failed: Cannot assign requested address
Apr 24 23:58:39.365491 ntpd[1945]: 24 Apr 23:58:39 ntpd[1945]: unable to create socket on eth0 (6) for fe80::4d8:cfff:fe33:ccdd%2#123
Apr 24 23:58:39.365491 ntpd[1945]: 24 Apr 23:58:39 ntpd[1945]: failed to init interface for address fe80::4d8:cfff:fe33:ccdd%2
Apr 24 23:58:39.365127 ntpd[1945]: failed to init interface for address fe80::4d8:cfff:fe33:ccdd%2
Apr 24 23:58:39.398894 containerd[1974]: time="2026-04-24T23:58:39.398754931Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.aufs\"..." type=io.containerd.snapshotter.v1
Apr 24 23:58:39.400888 containerd[1974]: time="2026-04-24T23:58:39.400841240Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.aufs\"..." error="aufs is not supported (modprobe aufs failed: exit status 1 \"modprobe: FATAL: Module aufs not found in directory /lib/modules/6.6.127-flatcar\\n\"): skip plugin" type=io.containerd.snapshotter.v1
Apr 24 23:58:39.400888 containerd[1974]: time="2026-04-24T23:58:39.400886842Z" level=info msg="loading plugin \"io.containerd.event.v1.exchange\"..." type=io.containerd.event.v1
Apr 24 23:58:39.401010 containerd[1974]: time="2026-04-24T23:58:39.400908598Z" level=info msg="loading plugin \"io.containerd.internal.v1.opt\"..." type=io.containerd.internal.v1
Apr 24 23:58:39.401119 containerd[1974]: time="2026-04-24T23:58:39.401094409Z" level=info msg="loading plugin \"io.containerd.warning.v1.deprecations\"..." type=io.containerd.warning.v1
Apr 24 23:58:39.401177 containerd[1974]: time="2026-04-24T23:58:39.401127587Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." type=io.containerd.snapshotter.v1
Apr 24 23:58:39.401234 containerd[1974]: time="2026-04-24T23:58:39.401211243Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." error="no scratch file generator: skip plugin" type=io.containerd.snapshotter.v1
Apr 24 23:58:39.401274 containerd[1974]: time="2026-04-24T23:58:39.401238847Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." type=io.containerd.snapshotter.v1
Apr 24 23:58:39.401513 containerd[1974]: time="2026-04-24T23:58:39.401483547Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1
Apr 24 23:58:39.401572 containerd[1974]: time="2026-04-24T23:58:39.401513823Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." type=io.containerd.snapshotter.v1
Apr 24 23:58:39.401572 containerd[1974]: time="2026-04-24T23:58:39.401535281Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." error="devmapper not configured: skip plugin" type=io.containerd.snapshotter.v1
Apr 24 23:58:39.401572 containerd[1974]: time="2026-04-24T23:58:39.401550665Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.native\"..." type=io.containerd.snapshotter.v1
Apr 24 23:58:39.401686 containerd[1974]: time="2026-04-24T23:58:39.401661340Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.overlayfs\"..." type=io.containerd.snapshotter.v1
Apr 24 23:58:39.402507 containerd[1974]: time="2026-04-24T23:58:39.401963732Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.zfs\"..." type=io.containerd.snapshotter.v1
Apr 24 23:58:39.402507 containerd[1974]: time="2026-04-24T23:58:39.402135232Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.zfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.zfs must be a zfs filesystem to be used with the zfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1
Apr 24 23:58:39.402507 containerd[1974]: time="2026-04-24T23:58:39.402159025Z" level=info msg="loading plugin \"io.containerd.content.v1.content\"..." type=io.containerd.content.v1
Apr 24 23:58:39.402507 containerd[1974]: time="2026-04-24T23:58:39.402257186Z" level=info msg="loading plugin \"io.containerd.metadata.v1.bolt\"..." type=io.containerd.metadata.v1
Apr 24 23:58:39.402507 containerd[1974]: time="2026-04-24T23:58:39.402312225Z" level=info msg="metadata content store policy set" policy=shared
Apr 24 23:58:39.407595 containerd[1974]: time="2026-04-24T23:58:39.407559249Z" level=info msg="loading plugin \"io.containerd.gc.v1.scheduler\"..." type=io.containerd.gc.v1
Apr 24 23:58:39.407747 containerd[1974]: time="2026-04-24T23:58:39.407728818Z" level=info msg="loading plugin \"io.containerd.differ.v1.walking\"..." type=io.containerd.differ.v1
Apr 24 23:58:39.409804 containerd[1974]: time="2026-04-24T23:58:39.407829316Z" level=info msg="loading plugin \"io.containerd.lease.v1.manager\"..." type=io.containerd.lease.v1
Apr 24 23:58:39.409804 containerd[1974]: time="2026-04-24T23:58:39.407856653Z" level=info msg="loading plugin \"io.containerd.streaming.v1.manager\"..." type=io.containerd.streaming.v1
Apr 24 23:58:39.409804 containerd[1974]: time="2026-04-24T23:58:39.407881119Z" level=info msg="loading plugin \"io.containerd.runtime.v1.linux\"..." type=io.containerd.runtime.v1
Apr 24 23:58:39.409804 containerd[1974]: time="2026-04-24T23:58:39.408044250Z" level=info msg="loading plugin \"io.containerd.monitor.v1.cgroups\"..." type=io.containerd.monitor.v1
Apr 24 23:58:39.409804 containerd[1974]: time="2026-04-24T23:58:39.408393048Z" level=info msg="loading plugin \"io.containerd.runtime.v2.task\"..." type=io.containerd.runtime.v2
Apr 24 23:58:39.409804 containerd[1974]: time="2026-04-24T23:58:39.408522688Z" level=info msg="loading plugin \"io.containerd.runtime.v2.shim\"..." type=io.containerd.runtime.v2
Apr 24 23:58:39.409804 containerd[1974]: time="2026-04-24T23:58:39.408548230Z" level=info msg="loading plugin \"io.containerd.sandbox.store.v1.local\"..." type=io.containerd.sandbox.store.v1
Apr 24 23:58:39.409804 containerd[1974]: time="2026-04-24T23:58:39.408567292Z" level=info msg="loading plugin \"io.containerd.sandbox.controller.v1.local\"..." type=io.containerd.sandbox.controller.v1
Apr 24 23:58:39.409804 containerd[1974]: time="2026-04-24T23:58:39.408591122Z" level=info msg="loading plugin \"io.containerd.service.v1.containers-service\"..." type=io.containerd.service.v1
Apr 24 23:58:39.409804 containerd[1974]: time="2026-04-24T23:58:39.408611800Z" level=info msg="loading plugin \"io.containerd.service.v1.content-service\"..." type=io.containerd.service.v1
Apr 24 23:58:39.409804 containerd[1974]: time="2026-04-24T23:58:39.408630849Z" level=info msg="loading plugin \"io.containerd.service.v1.diff-service\"..." type=io.containerd.service.v1
Apr 24 23:58:39.409804 containerd[1974]: time="2026-04-24T23:58:39.408651793Z" level=info msg="loading plugin \"io.containerd.service.v1.images-service\"..." type=io.containerd.service.v1
Apr 24 23:58:39.409804 containerd[1974]: time="2026-04-24T23:58:39.408675037Z" level=info msg="loading plugin \"io.containerd.service.v1.introspection-service\"..." type=io.containerd.service.v1
Apr 24 23:58:39.409804 containerd[1974]: time="2026-04-24T23:58:39.408695401Z" level=info msg="loading plugin \"io.containerd.service.v1.namespaces-service\"..." type=io.containerd.service.v1
Apr 24 23:58:39.410339 containerd[1974]: time="2026-04-24T23:58:39.408713835Z" level=info msg="loading plugin \"io.containerd.service.v1.snapshots-service\"..." type=io.containerd.service.v1
Apr 24 23:58:39.410339 containerd[1974]: time="2026-04-24T23:58:39.408731770Z" level=info msg="loading plugin \"io.containerd.service.v1.tasks-service\"..." type=io.containerd.service.v1
Apr 24 23:58:39.410339 containerd[1974]: time="2026-04-24T23:58:39.408764110Z" level=info msg="loading plugin \"io.containerd.grpc.v1.containers\"..." type=io.containerd.grpc.v1
Apr 24 23:58:39.410339 containerd[1974]: time="2026-04-24T23:58:39.408804311Z" level=info msg="loading plugin \"io.containerd.grpc.v1.content\"..." type=io.containerd.grpc.v1
Apr 24 23:58:39.410339 containerd[1974]: time="2026-04-24T23:58:39.408825791Z" level=info msg="loading plugin \"io.containerd.grpc.v1.diff\"..." type=io.containerd.grpc.v1
Apr 24 23:58:39.410339 containerd[1974]: time="2026-04-24T23:58:39.408854785Z" level=info msg="loading plugin \"io.containerd.grpc.v1.events\"..." type=io.containerd.grpc.v1
Apr 24 23:58:39.410339 containerd[1974]: time="2026-04-24T23:58:39.408875767Z" level=info msg="loading plugin \"io.containerd.grpc.v1.images\"..." type=io.containerd.grpc.v1
Apr 24 23:58:39.410339 containerd[1974]: time="2026-04-24T23:58:39.408896895Z" level=info msg="loading plugin \"io.containerd.grpc.v1.introspection\"..." type=io.containerd.grpc.v1
Apr 24 23:58:39.410339 containerd[1974]: time="2026-04-24T23:58:39.408914688Z" level=info msg="loading plugin \"io.containerd.grpc.v1.leases\"..." type=io.containerd.grpc.v1
Apr 24 23:58:39.410339 containerd[1974]: time="2026-04-24T23:58:39.408934765Z" level=info msg="loading plugin \"io.containerd.grpc.v1.namespaces\"..." type=io.containerd.grpc.v1
Apr 24 23:58:39.410339 containerd[1974]: time="2026-04-24T23:58:39.408954033Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandbox-controllers\"..." type=io.containerd.grpc.v1
Apr 24 23:58:39.410339 containerd[1974]: time="2026-04-24T23:58:39.408975633Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandboxes\"..." type=io.containerd.grpc.v1
Apr 24 23:58:39.410339 containerd[1974]: time="2026-04-24T23:58:39.408994611Z" level=info msg="loading plugin \"io.containerd.grpc.v1.snapshots\"..." type=io.containerd.grpc.v1
Apr 24 23:58:39.410339 containerd[1974]: time="2026-04-24T23:58:39.409013609Z" level=info msg="loading plugin \"io.containerd.grpc.v1.streaming\"..." type=io.containerd.grpc.v1
Apr 24 23:58:39.410863 containerd[1974]: time="2026-04-24T23:58:39.409032368Z" level=info msg="loading plugin \"io.containerd.grpc.v1.tasks\"..."
type=io.containerd.grpc.v1 Apr 24 23:58:39.410863 containerd[1974]: time="2026-04-24T23:58:39.409059886Z" level=info msg="loading plugin \"io.containerd.transfer.v1.local\"..." type=io.containerd.transfer.v1 Apr 24 23:58:39.410863 containerd[1974]: time="2026-04-24T23:58:39.409088761Z" level=info msg="loading plugin \"io.containerd.grpc.v1.transfer\"..." type=io.containerd.grpc.v1 Apr 24 23:58:39.410863 containerd[1974]: time="2026-04-24T23:58:39.409109569Z" level=info msg="loading plugin \"io.containerd.grpc.v1.version\"..." type=io.containerd.grpc.v1 Apr 24 23:58:39.410863 containerd[1974]: time="2026-04-24T23:58:39.409127760Z" level=info msg="loading plugin \"io.containerd.internal.v1.restart\"..." type=io.containerd.internal.v1 Apr 24 23:58:39.410863 containerd[1974]: time="2026-04-24T23:58:39.409188773Z" level=info msg="loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." type=io.containerd.tracing.processor.v1 Apr 24 23:58:39.410863 containerd[1974]: time="2026-04-24T23:58:39.409219594Z" level=info msg="skip loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.tracing.processor.v1 Apr 24 23:58:39.410863 containerd[1974]: time="2026-04-24T23:58:39.409237886Z" level=info msg="loading plugin \"io.containerd.internal.v1.tracing\"..." type=io.containerd.internal.v1 Apr 24 23:58:39.410863 containerd[1974]: time="2026-04-24T23:58:39.409255756Z" level=info msg="skip loading plugin \"io.containerd.internal.v1.tracing\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.internal.v1 Apr 24 23:58:39.410863 containerd[1974]: time="2026-04-24T23:58:39.409269929Z" level=info msg="loading plugin \"io.containerd.grpc.v1.healthcheck\"..." type=io.containerd.grpc.v1 Apr 24 23:58:39.410863 containerd[1974]: time="2026-04-24T23:58:39.409297146Z" level=info msg="loading plugin \"io.containerd.nri.v1.nri\"..." 
type=io.containerd.nri.v1 Apr 24 23:58:39.410863 containerd[1974]: time="2026-04-24T23:58:39.409310972Z" level=info msg="NRI interface is disabled by configuration." Apr 24 23:58:39.410863 containerd[1974]: time="2026-04-24T23:58:39.409327443Z" level=info msg="loading plugin \"io.containerd.grpc.v1.cri\"..." type=io.containerd.grpc.v1 Apr 24 23:58:39.411390 containerd[1974]: time="2026-04-24T23:58:39.409729031Z" level=info msg="Start cri plugin with config {PluginConfig:{ContainerdConfig:{Snapshotter:overlayfs DefaultRuntimeName:runc DefaultRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} UntrustedWorkloadRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} Runtimes:map[runc:{Type:io.containerd.runc.v2 Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[SystemdCgroup:true] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:podsandbox}] NoPivot:false DisableSnapshotAnnotations:true DiscardUnpackedLayers:false IgnoreBlockIONotEnabledErrors:false IgnoreRdtNotEnabledErrors:false} CniConfig:{NetworkPluginBinDir:/opt/cni/bin NetworkPluginConfDir:/etc/cni/net.d NetworkPluginMaxConfNum:1 NetworkPluginSetupSerially:false NetworkPluginConfTemplate: IPPreference:} Registry:{ConfigPath: Mirrors:map[] Configs:map[] Auths:map[] Headers:map[]} ImageDecryption:{KeyModel:node} DisableTCPService:true StreamServerAddress:127.0.0.1 StreamServerPort:0 StreamIdleTimeout:4h0m0s EnableSelinux:true 
SelinuxCategoryRange:1024 SandboxImage:registry.k8s.io/pause:3.8 StatsCollectPeriod:10 SystemdCgroup:false EnableTLSStreaming:false X509KeyPairStreaming:{TLSCertFile: TLSKeyFile:} MaxContainerLogLineSize:16384 DisableCgroup:false DisableApparmor:false RestrictOOMScoreAdj:false MaxConcurrentDownloads:3 DisableProcMount:false UnsetSeccompProfile: TolerateMissingHugetlbController:true DisableHugetlbController:true DeviceOwnershipFromSecurityContext:false IgnoreImageDefinedVolumes:false NetNSMountsUnderStateDir:false EnableUnprivilegedPorts:false EnableUnprivilegedICMP:false EnableCDI:false CDISpecDirs:[/etc/cdi /var/run/cdi] ImagePullProgressTimeout:5m0s DrainExecSyncIOTimeout:0s ImagePullWithSyncFs:false IgnoreDeprecationWarnings:[]} ContainerdRootDir:/var/lib/containerd ContainerdEndpoint:/run/containerd/containerd.sock RootDir:/var/lib/containerd/io.containerd.grpc.v1.cri StateDir:/run/containerd/io.containerd.grpc.v1.cri}" Apr 24 23:58:39.411651 containerd[1974]: time="2026-04-24T23:58:39.411631276Z" level=info msg="Connect containerd service" Apr 24 23:58:39.411841 containerd[1974]: time="2026-04-24T23:58:39.411822893Z" level=info msg="using legacy CRI server" Apr 24 23:58:39.411916 containerd[1974]: time="2026-04-24T23:58:39.411902886Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Apr 24 23:58:39.412127 containerd[1974]: time="2026-04-24T23:58:39.412109165Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\"" Apr 24 23:58:39.413217 containerd[1974]: time="2026-04-24T23:58:39.413174262Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Apr 24 23:58:39.413716 containerd[1974]: time="2026-04-24T23:58:39.413640135Z" level=info msg="Start subscribing containerd event" Apr 24 
23:58:39.413778 containerd[1974]: time="2026-04-24T23:58:39.413722927Z" level=info msg="Start recovering state" Apr 24 23:58:39.413844 containerd[1974]: time="2026-04-24T23:58:39.413821559Z" level=info msg="Start event monitor" Apr 24 23:58:39.413884 containerd[1974]: time="2026-04-24T23:58:39.413844882Z" level=info msg="Start snapshots syncer" Apr 24 23:58:39.413884 containerd[1974]: time="2026-04-24T23:58:39.413858623Z" level=info msg="Start cni network conf syncer for default" Apr 24 23:58:39.413884 containerd[1974]: time="2026-04-24T23:58:39.413869679Z" level=info msg="Start streaming server" Apr 24 23:58:39.414170 containerd[1974]: time="2026-04-24T23:58:39.414149250Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Apr 24 23:58:39.414310 containerd[1974]: time="2026-04-24T23:58:39.414294998Z" level=info msg=serving... address=/run/containerd/containerd.sock Apr 24 23:58:39.414542 systemd[1]: Started containerd.service - containerd container runtime. Apr 24 23:58:39.418438 containerd[1974]: time="2026-04-24T23:58:39.418396913Z" level=info msg="containerd successfully booted in 0.068946s" Apr 24 23:58:39.643947 tar[1960]: linux-amd64/README.md Apr 24 23:58:39.656000 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Apr 24 23:58:40.097005 systemd-networkd[1892]: eth0: Gained IPv6LL Apr 24 23:58:40.100522 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Apr 24 23:58:40.101974 systemd[1]: Reached target network-online.target - Network is Online. Apr 24 23:58:40.109680 systemd[1]: Started amazon-ssm-agent.service - amazon-ssm-agent. Apr 24 23:58:40.114890 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Apr 24 23:58:40.118100 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Apr 24 23:58:40.163004 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. 
Apr 24 23:58:40.182845 amazon-ssm-agent[2165]: Initializing new seelog logger Apr 24 23:58:40.183346 amazon-ssm-agent[2165]: New Seelog Logger Creation Complete Apr 24 23:58:40.183346 amazon-ssm-agent[2165]: 2026/04/24 23:58:40 Found config file at /etc/amazon/ssm/amazon-ssm-agent.json. Apr 24 23:58:40.183346 amazon-ssm-agent[2165]: Applying config override from /etc/amazon/ssm/amazon-ssm-agent.json. Apr 24 23:58:40.185528 amazon-ssm-agent[2165]: 2026/04/24 23:58:40 processing appconfig overrides Apr 24 23:58:40.185528 amazon-ssm-agent[2165]: 2026/04/24 23:58:40 Found config file at /etc/amazon/ssm/amazon-ssm-agent.json. Apr 24 23:58:40.185528 amazon-ssm-agent[2165]: Applying config override from /etc/amazon/ssm/amazon-ssm-agent.json. Apr 24 23:58:40.185528 amazon-ssm-agent[2165]: 2026/04/24 23:58:40 processing appconfig overrides Apr 24 23:58:40.185528 amazon-ssm-agent[2165]: 2026/04/24 23:58:40 Found config file at /etc/amazon/ssm/amazon-ssm-agent.json. Apr 24 23:58:40.185528 amazon-ssm-agent[2165]: Applying config override from /etc/amazon/ssm/amazon-ssm-agent.json. Apr 24 23:58:40.185528 amazon-ssm-agent[2165]: 2026/04/24 23:58:40 processing appconfig overrides Apr 24 23:58:40.185528 amazon-ssm-agent[2165]: 2026-04-24 23:58:40 INFO Proxy environment variables: Apr 24 23:58:40.188808 amazon-ssm-agent[2165]: 2026/04/24 23:58:40 Found config file at /etc/amazon/ssm/amazon-ssm-agent.json. Apr 24 23:58:40.188808 amazon-ssm-agent[2165]: Applying config override from /etc/amazon/ssm/amazon-ssm-agent.json. 
Apr 24 23:58:40.188808 amazon-ssm-agent[2165]: 2026/04/24 23:58:40 processing appconfig overrides Apr 24 23:58:40.284144 amazon-ssm-agent[2165]: 2026-04-24 23:58:40 INFO https_proxy: Apr 24 23:58:40.381944 amazon-ssm-agent[2165]: 2026-04-24 23:58:40 INFO http_proxy: Apr 24 23:58:40.480041 amazon-ssm-agent[2165]: 2026-04-24 23:58:40 INFO no_proxy: Apr 24 23:58:40.578322 amazon-ssm-agent[2165]: 2026-04-24 23:58:40 INFO Checking if agent identity type OnPrem can be assumed Apr 24 23:58:40.676976 amazon-ssm-agent[2165]: 2026-04-24 23:58:40 INFO Checking if agent identity type EC2 can be assumed Apr 24 23:58:40.708160 amazon-ssm-agent[2165]: 2026-04-24 23:58:40 INFO Agent will take identity from EC2 Apr 24 23:58:40.708160 amazon-ssm-agent[2165]: 2026-04-24 23:58:40 INFO [amazon-ssm-agent] using named pipe channel for IPC Apr 24 23:58:40.708160 amazon-ssm-agent[2165]: 2026-04-24 23:58:40 INFO [amazon-ssm-agent] using named pipe channel for IPC Apr 24 23:58:40.708160 amazon-ssm-agent[2165]: 2026-04-24 23:58:40 INFO [amazon-ssm-agent] using named pipe channel for IPC Apr 24 23:58:40.708160 amazon-ssm-agent[2165]: 2026-04-24 23:58:40 INFO [amazon-ssm-agent] amazon-ssm-agent - v3.2.0.0 Apr 24 23:58:40.708421 amazon-ssm-agent[2165]: 2026-04-24 23:58:40 INFO [amazon-ssm-agent] OS: linux, Arch: amd64 Apr 24 23:58:40.708421 amazon-ssm-agent[2165]: 2026-04-24 23:58:40 INFO [amazon-ssm-agent] Starting Core Agent Apr 24 23:58:40.708421 amazon-ssm-agent[2165]: 2026-04-24 23:58:40 INFO [amazon-ssm-agent] registrar detected. Attempting registration Apr 24 23:58:40.708421 amazon-ssm-agent[2165]: 2026-04-24 23:58:40 INFO [Registrar] Starting registrar module Apr 24 23:58:40.708421 amazon-ssm-agent[2165]: 2026-04-24 23:58:40 INFO [EC2Identity] no registration info found for ec2 instance, attempting registration Apr 24 23:58:40.708421 amazon-ssm-agent[2165]: 2026-04-24 23:58:40 INFO [EC2Identity] EC2 registration was successful. 
Apr 24 23:58:40.708421 amazon-ssm-agent[2165]: 2026-04-24 23:58:40 INFO [CredentialRefresher] credentialRefresher has started Apr 24 23:58:40.708421 amazon-ssm-agent[2165]: 2026-04-24 23:58:40 INFO [CredentialRefresher] Starting credentials refresher loop Apr 24 23:58:40.708421 amazon-ssm-agent[2165]: 2026-04-24 23:58:40 INFO EC2RoleProvider Successfully connected with instance profile role credentials Apr 24 23:58:40.775975 amazon-ssm-agent[2165]: 2026-04-24 23:58:40 INFO [CredentialRefresher] Next credential rotation will be in 31.9083271085 minutes Apr 24 23:58:41.729166 amazon-ssm-agent[2165]: 2026-04-24 23:58:41 INFO [amazon-ssm-agent] [LongRunningWorkerContainer] [WorkerProvider] Worker ssm-agent-worker is not running, starting worker process Apr 24 23:58:41.828879 amazon-ssm-agent[2165]: 2026-04-24 23:58:41 INFO [amazon-ssm-agent] [LongRunningWorkerContainer] [WorkerProvider] Worker ssm-agent-worker (pid:2185) started Apr 24 23:58:41.929759 amazon-ssm-agent[2165]: 2026-04-24 23:58:41 INFO [amazon-ssm-agent] [LongRunningWorkerContainer] Monitor long running worker health every 60 seconds Apr 24 23:58:42.093072 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Apr 24 23:58:42.094498 systemd[1]: Reached target multi-user.target - Multi-User System. Apr 24 23:58:42.097846 systemd[1]: Startup finished in 610ms (kernel) + 6.726s (initrd) + 7.713s (userspace) = 15.051s. Apr 24 23:58:42.101251 (kubelet)[2201]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Apr 24 23:58:42.364655 ntpd[1945]: Listen normally on 7 eth0 [fe80::4d8:cfff:fe33:ccdd%2]:123 Apr 24 23:58:42.366272 ntpd[1945]: 24 Apr 23:58:42 ntpd[1945]: Listen normally on 7 eth0 [fe80::4d8:cfff:fe33:ccdd%2]:123 Apr 24 23:58:42.368443 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. 
Apr 24 23:58:42.374351 systemd[1]: Started sshd@0-172.31.31.110:22-4.175.71.9:40722.service - OpenSSH per-connection server daemon (4.175.71.9:40722). Apr 24 23:58:43.034828 kubelet[2201]: E0424 23:58:43.034765 2201 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Apr 24 23:58:43.037663 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Apr 24 23:58:43.037884 systemd[1]: kubelet.service: Failed with result 'exit-code'. Apr 24 23:58:43.038693 systemd[1]: kubelet.service: Consumed 1.020s CPU time. Apr 24 23:58:43.413042 sshd[2211]: Accepted publickey for core from 4.175.71.9 port 40722 ssh2: RSA SHA256:5HhJ2X4iOQfF5HWKIEVpWTPXYo3rjlnxoO1NrD+aEDg Apr 24 23:58:43.415394 sshd[2211]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 24 23:58:43.425018 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Apr 24 23:58:43.431582 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Apr 24 23:58:43.434648 systemd-logind[1954]: New session 1 of user core. Apr 24 23:58:43.448570 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Apr 24 23:58:43.455598 systemd[1]: Starting user@500.service - User Manager for UID 500... Apr 24 23:58:43.463595 (systemd)[2217]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Apr 24 23:58:43.582887 systemd[2217]: Queued start job for default target default.target. Apr 24 23:58:43.593071 systemd[2217]: Created slice app.slice - User Application Slice. Apr 24 23:58:43.593115 systemd[2217]: Reached target paths.target - Paths. Apr 24 23:58:43.593136 systemd[2217]: Reached target timers.target - Timers. 
Apr 24 23:58:43.594548 systemd[2217]: Starting dbus.socket - D-Bus User Message Bus Socket... Apr 24 23:58:43.607089 systemd[2217]: Listening on dbus.socket - D-Bus User Message Bus Socket. Apr 24 23:58:43.607219 systemd[2217]: Reached target sockets.target - Sockets. Apr 24 23:58:43.607234 systemd[2217]: Reached target basic.target - Basic System. Apr 24 23:58:43.607273 systemd[2217]: Reached target default.target - Main User Target. Apr 24 23:58:43.607303 systemd[2217]: Startup finished in 136ms. Apr 24 23:58:43.607392 systemd[1]: Started user@500.service - User Manager for UID 500. Apr 24 23:58:43.619167 systemd[1]: Started session-1.scope - Session 1 of User core. Apr 24 23:58:44.325192 systemd[1]: Started sshd@1-172.31.31.110:22-4.175.71.9:40736.service - OpenSSH per-connection server daemon (4.175.71.9:40736). Apr 24 23:58:45.297454 sshd[2228]: Accepted publickey for core from 4.175.71.9 port 40736 ssh2: RSA SHA256:5HhJ2X4iOQfF5HWKIEVpWTPXYo3rjlnxoO1NrD+aEDg Apr 24 23:58:45.299268 sshd[2228]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 24 23:58:45.303855 systemd-logind[1954]: New session 2 of user core. Apr 24 23:58:45.315191 systemd[1]: Started session-2.scope - Session 2 of User core. Apr 24 23:58:45.976398 sshd[2228]: pam_unix(sshd:session): session closed for user core Apr 24 23:58:45.980522 systemd[1]: sshd@1-172.31.31.110:22-4.175.71.9:40736.service: Deactivated successfully. Apr 24 23:58:45.982584 systemd[1]: session-2.scope: Deactivated successfully. Apr 24 23:58:45.984313 systemd-logind[1954]: Session 2 logged out. Waiting for processes to exit. Apr 24 23:58:45.985531 systemd-logind[1954]: Removed session 2. Apr 24 23:58:46.147274 systemd[1]: Started sshd@2-172.31.31.110:22-4.175.71.9:41566.service - OpenSSH per-connection server daemon (4.175.71.9:41566). 
Apr 24 23:58:47.091908 sshd[2235]: Accepted publickey for core from 4.175.71.9 port 41566 ssh2: RSA SHA256:5HhJ2X4iOQfF5HWKIEVpWTPXYo3rjlnxoO1NrD+aEDg Apr 24 23:58:47.093478 sshd[2235]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 24 23:58:47.099301 systemd-logind[1954]: New session 3 of user core. Apr 24 23:58:47.105054 systemd[1]: Started session-3.scope - Session 3 of User core. Apr 24 23:58:47.748564 sshd[2235]: pam_unix(sshd:session): session closed for user core Apr 24 23:58:47.753204 systemd-logind[1954]: Session 3 logged out. Waiting for processes to exit. Apr 24 23:58:47.754397 systemd[1]: sshd@2-172.31.31.110:22-4.175.71.9:41566.service: Deactivated successfully. Apr 24 23:58:47.756364 systemd[1]: session-3.scope: Deactivated successfully. Apr 24 23:58:47.757635 systemd-logind[1954]: Removed session 3. Apr 24 23:58:47.927063 systemd[1]: Started sshd@3-172.31.31.110:22-4.175.71.9:41576.service - OpenSSH per-connection server daemon (4.175.71.9:41576). Apr 24 23:58:48.934635 sshd[2242]: Accepted publickey for core from 4.175.71.9 port 41576 ssh2: RSA SHA256:5HhJ2X4iOQfF5HWKIEVpWTPXYo3rjlnxoO1NrD+aEDg Apr 24 23:58:48.936223 sshd[2242]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 24 23:58:48.941444 systemd-logind[1954]: New session 4 of user core. Apr 24 23:58:48.947999 systemd[1]: Started session-4.scope - Session 4 of User core. Apr 24 23:58:49.635568 sshd[2242]: pam_unix(sshd:session): session closed for user core Apr 24 23:58:49.639961 systemd[1]: sshd@3-172.31.31.110:22-4.175.71.9:41576.service: Deactivated successfully. Apr 24 23:58:49.641918 systemd[1]: session-4.scope: Deactivated successfully. Apr 24 23:58:49.642608 systemd-logind[1954]: Session 4 logged out. Waiting for processes to exit. Apr 24 23:58:49.643647 systemd-logind[1954]: Removed session 4. 
Apr 24 23:58:49.813243 systemd[1]: Started sshd@4-172.31.31.110:22-4.175.71.9:41586.service - OpenSSH per-connection server daemon (4.175.71.9:41586). Apr 24 23:58:50.816708 sshd[2249]: Accepted publickey for core from 4.175.71.9 port 41586 ssh2: RSA SHA256:5HhJ2X4iOQfF5HWKIEVpWTPXYo3rjlnxoO1NrD+aEDg Apr 24 23:58:50.818382 sshd[2249]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 24 23:58:50.823539 systemd-logind[1954]: New session 5 of user core. Apr 24 23:58:50.833059 systemd[1]: Started session-5.scope - Session 5 of User core. Apr 24 23:58:51.367278 sudo[2252]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Apr 24 23:58:51.367676 sudo[2252]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Apr 24 23:58:51.383686 sudo[2252]: pam_unix(sudo:session): session closed for user root Apr 24 23:58:51.548419 sshd[2249]: pam_unix(sshd:session): session closed for user core Apr 24 23:58:51.552376 systemd[1]: sshd@4-172.31.31.110:22-4.175.71.9:41586.service: Deactivated successfully. Apr 24 23:58:51.554411 systemd[1]: session-5.scope: Deactivated successfully. Apr 24 23:58:51.555820 systemd-logind[1954]: Session 5 logged out. Waiting for processes to exit. Apr 24 23:58:51.557075 systemd-logind[1954]: Removed session 5. Apr 24 23:58:51.726227 systemd[1]: Started sshd@5-172.31.31.110:22-4.175.71.9:41588.service - OpenSSH per-connection server daemon (4.175.71.9:41588). Apr 24 23:58:52.742960 sshd[2257]: Accepted publickey for core from 4.175.71.9 port 41588 ssh2: RSA SHA256:5HhJ2X4iOQfF5HWKIEVpWTPXYo3rjlnxoO1NrD+aEDg Apr 24 23:58:52.743708 sshd[2257]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 24 23:58:52.748278 systemd-logind[1954]: New session 6 of user core. Apr 24 23:58:52.755097 systemd[1]: Started session-6.scope - Session 6 of User core. Apr 24 23:58:53.139774 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. 
Apr 24 23:58:53.145534 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Apr 24 23:58:53.281920 sudo[2264]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/bin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Apr 24 23:58:53.282322 sudo[2264]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Apr 24 23:58:53.288006 sudo[2264]: pam_unix(sudo:session): session closed for user root Apr 24 23:58:53.295542 sudo[2263]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/bin/systemctl restart audit-rules Apr 24 23:58:53.296057 sudo[2263]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Apr 24 23:58:53.318293 systemd[1]: Stopping audit-rules.service - Load Security Auditing Rules... Apr 24 23:58:53.320371 auditctl[2267]: No rules Apr 24 23:58:53.321415 systemd[1]: audit-rules.service: Deactivated successfully. Apr 24 23:58:53.322026 systemd[1]: Stopped audit-rules.service - Load Security Auditing Rules. Apr 24 23:58:53.332640 systemd[1]: Starting audit-rules.service - Load Security Auditing Rules... Apr 24 23:58:53.361003 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Apr 24 23:58:53.372644 augenrules[2291]: No rules Apr 24 23:58:53.374395 (kubelet)[2288]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Apr 24 23:58:53.374910 systemd[1]: Finished audit-rules.service - Load Security Auditing Rules. 
Apr 24 23:58:53.376113 sudo[2263]: pam_unix(sudo:session): session closed for user root Apr 24 23:58:53.418212 kubelet[2288]: E0424 23:58:53.417416 2288 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Apr 24 23:58:53.422120 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Apr 24 23:58:53.422311 systemd[1]: kubelet.service: Failed with result 'exit-code'. Apr 24 23:58:53.543696 sshd[2257]: pam_unix(sshd:session): session closed for user core Apr 24 23:58:53.547354 systemd[1]: sshd@5-172.31.31.110:22-4.175.71.9:41588.service: Deactivated successfully. Apr 24 23:58:53.549346 systemd[1]: session-6.scope: Deactivated successfully. Apr 24 23:58:53.550886 systemd-logind[1954]: Session 6 logged out. Waiting for processes to exit. Apr 24 23:58:53.552302 systemd-logind[1954]: Removed session 6. Apr 24 23:58:53.714472 systemd[1]: Started sshd@6-172.31.31.110:22-4.175.71.9:41602.service - OpenSSH per-connection server daemon (4.175.71.9:41602). Apr 24 23:58:54.686832 sshd[2306]: Accepted publickey for core from 4.175.71.9 port 41602 ssh2: RSA SHA256:5HhJ2X4iOQfF5HWKIEVpWTPXYo3rjlnxoO1NrD+aEDg Apr 24 23:58:54.688490 sshd[2306]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 24 23:58:54.694086 systemd-logind[1954]: New session 7 of user core. Apr 24 23:58:54.699027 systemd[1]: Started session-7.scope - Session 7 of User core. Apr 24 23:58:55.207145 sudo[2309]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Apr 24 23:58:55.207576 sudo[2309]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Apr 24 23:58:55.712272 systemd[1]: Starting docker.service - Docker Application Container Engine... 
Apr 24 23:58:55.712425 (dockerd)[2325]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Apr 24 23:58:56.252850 dockerd[2325]: time="2026-04-24T23:58:56.252766076Z" level=info msg="Starting up" Apr 24 23:58:56.387393 systemd[1]: var-lib-docker-check\x2doverlayfs\x2dsupport1794172247-merged.mount: Deactivated successfully. Apr 24 23:58:56.427054 dockerd[2325]: time="2026-04-24T23:58:56.427004926Z" level=info msg="Loading containers: start." Apr 24 23:58:56.598820 kernel: Initializing XFRM netlink socket Apr 24 23:58:56.667734 (udev-worker)[2347]: Network interface NamePolicy= disabled on kernel command line. Apr 24 23:58:56.739656 systemd-networkd[1892]: docker0: Link UP Apr 24 23:58:56.757528 dockerd[2325]: time="2026-04-24T23:58:56.757482810Z" level=info msg="Loading containers: done." Apr 24 23:58:56.786814 dockerd[2325]: time="2026-04-24T23:58:56.786566358Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Apr 24 23:58:56.786814 dockerd[2325]: time="2026-04-24T23:58:56.786679439Z" level=info msg="Docker daemon" commit=061aa95809be396a6b5542618d8a34b02a21ff77 containerd-snapshotter=false storage-driver=overlay2 version=26.1.0 Apr 24 23:58:56.787057 dockerd[2325]: time="2026-04-24T23:58:56.786843384Z" level=info msg="Daemon has completed initialization" Apr 24 23:58:56.822424 dockerd[2325]: time="2026-04-24T23:58:56.821459561Z" level=info msg="API listen on /run/docker.sock" Apr 24 23:58:56.821693 systemd[1]: Started docker.service - Docker Application Container Engine. 
Apr 24 23:58:57.701588 containerd[1974]: time="2026-04-24T23:58:57.701538264Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.35.4\"" Apr 24 23:58:58.242636 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4175040836.mount: Deactivated successfully. Apr 24 23:58:59.903402 containerd[1974]: time="2026-04-24T23:58:59.903343409Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.35.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 24 23:58:59.905147 containerd[1974]: time="2026-04-24T23:58:59.905093703Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.35.4: active requests=0, bytes read=27579423" Apr 24 23:58:59.909808 containerd[1974]: time="2026-04-24T23:58:59.908178370Z" level=info msg="ImageCreate event name:\"sha256:840f22aa169cc9a11114a874832f60c2d4a4f7767d107303cd1ca6d9c228ee8b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 24 23:58:59.912756 containerd[1974]: time="2026-04-24T23:58:59.912712068Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:06b4bb208634a107ab9e6c50cdb9df178d05166a700c0cc448d59522091074b5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 24 23:58:59.913919 containerd[1974]: time="2026-04-24T23:58:59.913872310Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.35.4\" with image id \"sha256:840f22aa169cc9a11114a874832f60c2d4a4f7767d107303cd1ca6d9c228ee8b\", repo tag \"registry.k8s.io/kube-apiserver:v1.35.4\", repo digest \"registry.k8s.io/kube-apiserver@sha256:06b4bb208634a107ab9e6c50cdb9df178d05166a700c0cc448d59522091074b5\", size \"27576022\" in 2.212290018s" Apr 24 23:58:59.914068 containerd[1974]: time="2026-04-24T23:58:59.914046652Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.35.4\" returns image reference \"sha256:840f22aa169cc9a11114a874832f60c2d4a4f7767d107303cd1ca6d9c228ee8b\"" Apr 24 23:58:59.914955 containerd[1974]: 
time="2026-04-24T23:58:59.914928036Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.35.4\"" Apr 24 23:59:01.874064 containerd[1974]: time="2026-04-24T23:59:01.874000559Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.35.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 24 23:59:01.875741 containerd[1974]: time="2026-04-24T23:59:01.875686034Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.35.4: active requests=0, bytes read=21451659" Apr 24 23:59:01.877035 containerd[1974]: time="2026-04-24T23:59:01.876968232Z" level=info msg="ImageCreate event name:\"sha256:96ce7469899d4d3ccad56b1a80b91609cb2203287112d73818296004948bb667\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 24 23:59:01.882100 containerd[1974]: time="2026-04-24T23:59:01.881594508Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:7b036c805d57f203e9efaf43672cff6019b9083a9c0eb107ea8500eace29d8fd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 24 23:59:01.883397 containerd[1974]: time="2026-04-24T23:59:01.883339750Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.35.4\" with image id \"sha256:96ce7469899d4d3ccad56b1a80b91609cb2203287112d73818296004948bb667\", repo tag \"registry.k8s.io/kube-controller-manager:v1.35.4\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:7b036c805d57f203e9efaf43672cff6019b9083a9c0eb107ea8500eace29d8fd\", size \"23018006\" in 1.968374535s" Apr 24 23:59:01.883758 containerd[1974]: time="2026-04-24T23:59:01.883565048Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.35.4\" returns image reference \"sha256:96ce7469899d4d3ccad56b1a80b91609cb2203287112d73818296004948bb667\"" Apr 24 23:59:01.884608 containerd[1974]: time="2026-04-24T23:59:01.884558691Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.35.4\"" Apr 24 
23:59:03.089886 containerd[1974]: time="2026-04-24T23:59:03.089831931Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.35.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 24 23:59:03.091178 containerd[1974]: time="2026-04-24T23:59:03.091124765Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.35.4: active requests=0, bytes read=15555290" Apr 24 23:59:03.092822 containerd[1974]: time="2026-04-24T23:59:03.092577484Z" level=info msg="ImageCreate event name:\"sha256:a0eecd9b69a38f829c29b535f73c1a3de3c7cc9f1294a44dc42c808faf0a23ff\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 24 23:59:03.101722 containerd[1974]: time="2026-04-24T23:59:03.100905747Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.35.4\" with image id \"sha256:a0eecd9b69a38f829c29b535f73c1a3de3c7cc9f1294a44dc42c808faf0a23ff\", repo tag \"registry.k8s.io/kube-scheduler:v1.35.4\", repo digest \"registry.k8s.io/kube-scheduler@sha256:9054fecb4fa04cc63aec47b0913c8deb3487d414190cd15211f864cfe0d0b4d6\", size \"17121655\" in 1.216310977s" Apr 24 23:59:03.101722 containerd[1974]: time="2026-04-24T23:59:03.100985342Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.35.4\" returns image reference \"sha256:a0eecd9b69a38f829c29b535f73c1a3de3c7cc9f1294a44dc42c808faf0a23ff\"" Apr 24 23:59:03.101722 containerd[1974]: time="2026-04-24T23:59:03.101291511Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:9054fecb4fa04cc63aec47b0913c8deb3487d414190cd15211f864cfe0d0b4d6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 24 23:59:03.102465 containerd[1974]: time="2026-04-24T23:59:03.102333951Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.35.4\"" Apr 24 23:59:03.641218 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Apr 24 23:59:03.652156 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... 
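The containerd "Pulled image" entries above report each image's compressed size in bytes and the wall-clock pull duration. A small sketch (not part of any log tooling here; names are my own) that extracts those two figures from such a message and derives throughput:

```python
import re

# Matches the unescaped containerd "Pulled image" message, e.g.:
#   Pulled image "registry.k8s.io/kube-apiserver:v1.35.4" ... size "27576022" in 2.212290018s
PULLED = re.compile(
    r'Pulled image "(?P<image>[^"]+)".*?size "(?P<size>\d+)" in (?P<secs>[\d.]+)s'
)

def pull_stats(msg: str) -> dict:
    """Extract image name, byte size and duration; derive MiB/s throughput."""
    m = PULLED.search(msg)
    if m is None:
        raise ValueError("not a 'Pulled image' message")
    size = int(m.group("size"))
    secs = float(m.group("secs"))
    return {
        "image": m.group("image"),
        "bytes": size,
        "seconds": secs,
        "mib_per_s": size / secs / (1 << 20),
    }

# The kube-apiserver pull from the log above (message shortened):
msg = ('Pulled image "registry.k8s.io/kube-apiserver:v1.35.4" with image id '
       '"sha256:840f22aa169cc9a11114a874832f60c2d4a4f7767d107303cd1ca6d9c228ee8b", '
       'size "27576022" in 2.212290018s')
stats = pull_stats(msg)   # roughly 12 MiB/s for this pull
```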
Apr 24 23:59:04.056980 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Apr 24 23:59:04.063693 (kubelet)[2541]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Apr 24 23:59:04.134831 kubelet[2541]: E0424 23:59:04.132740 2541 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Apr 24 23:59:04.135638 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Apr 24 23:59:04.135858 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Apr 24 23:59:04.276367 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3885892825.mount: Deactivated successfully.
Apr 24 23:59:04.698252 containerd[1974]: time="2026-04-24T23:59:04.698197086Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.35.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 24 23:59:04.706666 containerd[1974]: time="2026-04-24T23:59:04.706595370Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.35.4: active requests=0, bytes read=25699925"
Apr 24 23:59:04.715197 containerd[1974]: time="2026-04-24T23:59:04.715052518Z" level=info msg="ImageCreate event name:\"sha256:f21f27cddb23d0d7131dc7c59666b3b0e0b5ca4c3f003225f90307ab6211b6e1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 24 23:59:04.724985 containerd[1974]: time="2026-04-24T23:59:04.724760813Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:c5daa23c72474e5e4062c320177d3b485fd42e7010f052bc80d657c4c00a0672\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 24 23:59:04.725860 containerd[1974]: time="2026-04-24T23:59:04.725639393Z" level=info
msg="Pulled image \"registry.k8s.io/kube-proxy:v1.35.4\" with image id \"sha256:f21f27cddb23d0d7131dc7c59666b3b0e0b5ca4c3f003225f90307ab6211b6e1\", repo tag \"registry.k8s.io/kube-proxy:v1.35.4\", repo digest \"registry.k8s.io/kube-proxy@sha256:c5daa23c72474e5e4062c320177d3b485fd42e7010f052bc80d657c4c00a0672\", size \"25698944\" in 1.62326376s" Apr 24 23:59:04.725860 containerd[1974]: time="2026-04-24T23:59:04.725684164Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.35.4\" returns image reference \"sha256:f21f27cddb23d0d7131dc7c59666b3b0e0b5ca4c3f003225f90307ab6211b6e1\"" Apr 24 23:59:04.726221 containerd[1974]: time="2026-04-24T23:59:04.726190013Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.13.1\"" Apr 24 23:59:05.256874 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1008230547.mount: Deactivated successfully. Apr 24 23:59:06.611696 containerd[1974]: time="2026-04-24T23:59:06.611636761Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.13.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 24 23:59:06.614812 containerd[1974]: time="2026-04-24T23:59:06.614642546Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.13.1: active requests=0, bytes read=23556542" Apr 24 23:59:06.617486 containerd[1974]: time="2026-04-24T23:59:06.617403775Z" level=info msg="ImageCreate event name:\"sha256:aa5e3ebc0dfed0566805186b9e47110d8f9122291d8bad1497e78873ad291139\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 24 23:59:06.623479 containerd[1974]: time="2026-04-24T23:59:06.623420077Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9b9128672209474da07c91439bf15ed704ae05ad918dd6454e5b6ae14e35fee6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 24 23:59:06.624811 containerd[1974]: time="2026-04-24T23:59:06.624610762Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.13.1\" with image id 
\"sha256:aa5e3ebc0dfed0566805186b9e47110d8f9122291d8bad1497e78873ad291139\", repo tag \"registry.k8s.io/coredns/coredns:v1.13.1\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9b9128672209474da07c91439bf15ed704ae05ad918dd6454e5b6ae14e35fee6\", size \"23553139\" in 1.898374938s" Apr 24 23:59:06.624811 containerd[1974]: time="2026-04-24T23:59:06.624656039Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.13.1\" returns image reference \"sha256:aa5e3ebc0dfed0566805186b9e47110d8f9122291d8bad1497e78873ad291139\"" Apr 24 23:59:06.625357 containerd[1974]: time="2026-04-24T23:59:06.625211897Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10.1\"" Apr 24 23:59:07.080130 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3021342676.mount: Deactivated successfully. Apr 24 23:59:07.087331 containerd[1974]: time="2026-04-24T23:59:07.087268331Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 24 23:59:07.088455 containerd[1974]: time="2026-04-24T23:59:07.088394651Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10.1: active requests=0, bytes read=321218" Apr 24 23:59:07.089847 containerd[1974]: time="2026-04-24T23:59:07.089745480Z" level=info msg="ImageCreate event name:\"sha256:cd073f4c5f6a8e9dc6f3125ba00cf60819cae95c1ec84a1f146ee4a9cf9e803f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 24 23:59:07.093002 containerd[1974]: time="2026-04-24T23:59:07.092939861Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 24 23:59:07.094076 containerd[1974]: time="2026-04-24T23:59:07.093899889Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10.1\" with image id \"sha256:cd073f4c5f6a8e9dc6f3125ba00cf60819cae95c1ec84a1f146ee4a9cf9e803f\", repo tag 
\"registry.k8s.io/pause:3.10.1\", repo digest \"registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c\", size \"320448\" in 468.302194ms" Apr 24 23:59:07.094076 containerd[1974]: time="2026-04-24T23:59:07.093938985Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10.1\" returns image reference \"sha256:cd073f4c5f6a8e9dc6f3125ba00cf60819cae95c1ec84a1f146ee4a9cf9e803f\"" Apr 24 23:59:07.094774 containerd[1974]: time="2026-04-24T23:59:07.094591037Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.6.6-0\"" Apr 24 23:59:07.581191 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3497417525.mount: Deactivated successfully. Apr 24 23:59:08.615033 containerd[1974]: time="2026-04-24T23:59:08.614961869Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.6.6-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 24 23:59:08.618176 containerd[1974]: time="2026-04-24T23:59:08.618107819Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.6.6-0: active requests=0, bytes read=23644465" Apr 24 23:59:08.625334 containerd[1974]: time="2026-04-24T23:59:08.625286159Z" level=info msg="ImageCreate event name:\"sha256:0a108f7189562e99793bdecab61fdf1a7c9d913af3385de9da17fb9d6ff430e2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 24 23:59:08.634604 containerd[1974]: time="2026-04-24T23:59:08.634497207Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:60a30b5d81b2217555e2cfb9537f655b7ba97220b99c39ee2e162a7127225890\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 24 23:59:08.636556 containerd[1974]: time="2026-04-24T23:59:08.636006097Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.6.6-0\" with image id \"sha256:0a108f7189562e99793bdecab61fdf1a7c9d913af3385de9da17fb9d6ff430e2\", repo tag \"registry.k8s.io/etcd:3.6.6-0\", repo digest \"registry.k8s.io/etcd@sha256:60a30b5d81b2217555e2cfb9537f655b7ba97220b99c39ee2e162a7127225890\", size 
\"23641797\" in 1.54138022s"
Apr 24 23:59:08.636556 containerd[1974]: time="2026-04-24T23:59:08.636052562Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.6.6-0\" returns image reference \"sha256:0a108f7189562e99793bdecab61fdf1a7c9d913af3385de9da17fb9d6ff430e2\""
Apr 24 23:59:09.009248 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Apr 24 23:59:10.106175 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Apr 24 23:59:10.112139 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Apr 24 23:59:10.151859 systemd[1]: Reloading requested from client PID 2703 ('systemctl') (unit session-7.scope)...
Apr 24 23:59:10.151879 systemd[1]: Reloading...
Apr 24 23:59:10.255821 zram_generator::config[2739]: No configuration found.
Apr 24 23:59:10.418310 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Apr 24 23:59:10.520668 systemd[1]: Reloading finished in 368 ms.
Apr 24 23:59:10.584517 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM
Apr 24 23:59:10.584623 systemd[1]: kubelet.service: Failed with result 'signal'.
Apr 24 23:59:10.585041 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Apr 24 23:59:10.587722 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Apr 24 23:59:11.131169 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Apr 24 23:59:11.141330 (kubelet)[2807]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Apr 24 23:59:11.198509 kubelet[2807]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
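The earlier kubelet crash ("open /var/lib/kubelet/config.yaml: no such file or directory") and the deprecation warning for --volume-plugin-dir both point at the kubelet config file, which on this node is normally written by kubeadm during join/init. For illustration only, a minimal sketch of what such a file can look like; every value here is an assumption drawn from settings visible elsewhere in this log (systemd cgroup driver, /etc/kubernetes/manifests static pod path, the Flexvolume plugin directory), not from the actual file:

```yaml
# /var/lib/kubelet/config.yaml — illustrative sketch, normally generated by kubeadm
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
cgroupDriver: systemd                      # matches CgroupDriver "systemd" in the NodeConfig logged below
staticPodPath: /etc/kubernetes/manifests   # the "Adding static pod path" entry below
volumePluginDir: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/   # the Flexvolume dir the kubelet recreates
```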
Apr 24 23:59:11.348929 kubelet[2807]: I0424 23:59:11.348386 2807 server.go:525] "Kubelet version" kubeletVersion="v1.35.1" Apr 24 23:59:11.348929 kubelet[2807]: I0424 23:59:11.348692 2807 server.go:527] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 24 23:59:11.350256 kubelet[2807]: I0424 23:59:11.350229 2807 watchdog_linux.go:95] "Systemd watchdog is not enabled" Apr 24 23:59:11.350256 kubelet[2807]: I0424 23:59:11.350252 2807 watchdog_linux.go:138] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 24 23:59:11.350577 kubelet[2807]: I0424 23:59:11.350558 2807 server.go:951] "Client rotation is on, will bootstrap in background" Apr 24 23:59:11.357709 kubelet[2807]: I0424 23:59:11.357564 2807 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Apr 24 23:59:11.361819 kubelet[2807]: E0424 23:59:11.360976 2807 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://172.31.31.110:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 172.31.31.110:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError" Apr 24 23:59:11.366565 kubelet[2807]: E0424 23:59:11.366496 2807 log.go:32] "RuntimeConfig from runtime service failed" err="rpc error: code = Unimplemented desc = unknown method RuntimeConfig for service runtime.v1.RuntimeService" Apr 24 23:59:11.366565 kubelet[2807]: I0424 23:59:11.366569 2807 server.go:1395] "CRI implementation should be updated to support RuntimeConfig. Falling back to using cgroupDriver from kubelet config." Apr 24 23:59:11.371662 kubelet[2807]: I0424 23:59:11.371539 2807 server.go:775] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
Defaulting to /" Apr 24 23:59:11.374508 kubelet[2807]: I0424 23:59:11.374434 2807 container_manager_linux.go:272] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 24 23:59:11.374730 kubelet[2807]: I0424 23:59:11.374498 2807 container_manager_linux.go:277] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-172-31-31-110","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 24 23:59:11.374885 kubelet[2807]: I0424 23:59:11.374736 2807 topology_manager.go:143] "Creating topology manager with none policy" Apr 24 
23:59:11.374885 kubelet[2807]: I0424 23:59:11.374752 2807 container_manager_linux.go:308] "Creating device plugin manager" Apr 24 23:59:11.374978 kubelet[2807]: I0424 23:59:11.374898 2807 container_manager_linux.go:317] "Creating Dynamic Resource Allocation (DRA) manager" Apr 24 23:59:11.378490 kubelet[2807]: I0424 23:59:11.378438 2807 state_mem.go:41] "Initialized" logger="CPUManager state memory" Apr 24 23:59:11.378744 kubelet[2807]: I0424 23:59:11.378711 2807 kubelet.go:482] "Attempting to sync node with API server" Apr 24 23:59:11.378835 kubelet[2807]: I0424 23:59:11.378750 2807 kubelet.go:383] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 24 23:59:11.378835 kubelet[2807]: I0424 23:59:11.378797 2807 kubelet.go:394] "Adding apiserver pod source" Apr 24 23:59:11.378835 kubelet[2807]: I0424 23:59:11.378812 2807 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 24 23:59:11.383418 kubelet[2807]: I0424 23:59:11.382745 2807 kuberuntime_manager.go:294] "Container runtime initialized" containerRuntime="containerd" version="v1.7.21" apiVersion="v1" Apr 24 23:59:11.386984 kubelet[2807]: I0424 23:59:11.386580 2807 kubelet.go:943] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 24 23:59:11.386984 kubelet[2807]: I0424 23:59:11.386640 2807 kubelet.go:970] "Not starting PodCertificateRequest manager because we are in static kubelet mode or the PodCertificateProjection feature gate is disabled" Apr 24 23:59:11.386984 kubelet[2807]: W0424 23:59:11.386716 2807 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. 
Apr 24 23:59:11.394944 kubelet[2807]: I0424 23:59:11.394633 2807 server.go:1257] "Started kubelet" Apr 24 23:59:11.407310 kubelet[2807]: I0424 23:59:11.407128 2807 fs_resource_analyzer.go:69] "Starting FS ResourceAnalyzer" Apr 24 23:59:11.407456 kubelet[2807]: E0424 23:59:11.405378 2807 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://172.31.31.110:6443/api/v1/namespaces/default/events\": dial tcp 172.31.31.110:6443: connect: connection refused" event="&Event{ObjectMeta:{ip-172-31-31-110.18a9706f520bdc30 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-172-31-31-110,UID:,APIVersion:v1,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-172-31-31-110,},FirstTimestamp:2026-04-24 23:59:11.394561072 +0000 UTC m=+0.248398413,LastTimestamp:2026-04-24 23:59:11.394561072 +0000 UTC m=+0.248398413,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-172-31-31-110,}" Apr 24 23:59:11.410818 kubelet[2807]: I0424 23:59:11.410520 2807 server.go:182] "Starting to listen" address="0.0.0.0" port=10250 Apr 24 23:59:11.411773 kubelet[2807]: I0424 23:59:11.411747 2807 server.go:317] "Adding debug handlers to kubelet server" Apr 24 23:59:11.419497 kubelet[2807]: I0424 23:59:11.418763 2807 ratelimit.go:56] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 24 23:59:11.419497 kubelet[2807]: I0424 23:59:11.418999 2807 server_v1.go:49] "podresources" method="list" useActivePods=true Apr 24 23:59:11.419497 kubelet[2807]: I0424 23:59:11.419208 2807 server.go:254] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 24 23:59:11.420857 kubelet[2807]: I0424 23:59:11.419871 2807 volume_manager.go:311] "Starting Kubelet Volume Manager" Apr 24 
23:59:11.420857 kubelet[2807]: E0424 23:59:11.420205 2807 kubelet_node_status.go:392] "Error getting the current node from lister" err="node \"ip-172-31-31-110\" not found" Apr 24 23:59:11.420857 kubelet[2807]: I0424 23:59:11.420593 2807 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Apr 24 23:59:11.420857 kubelet[2807]: I0424 23:59:11.420649 2807 reconciler.go:29] "Reconciler: start to sync state" Apr 24 23:59:11.421672 kubelet[2807]: I0424 23:59:11.421639 2807 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Apr 24 23:59:11.426095 kubelet[2807]: E0424 23:59:11.426061 2807 controller.go:201] "Failed to ensure lease exists, will retry" err="Get \"https://172.31.31.110:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ip-172-31-31-110?timeout=10s\": dial tcp 172.31.31.110:6443: connect: connection refused" interval="200ms" Apr 24 23:59:11.426389 kubelet[2807]: I0424 23:59:11.426362 2807 factory.go:223] Registration of the systemd container factory successfully Apr 24 23:59:11.426497 kubelet[2807]: I0424 23:59:11.426467 2807 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Apr 24 23:59:11.429830 kubelet[2807]: I0424 23:59:11.429019 2807 factory.go:223] Registration of the containerd container factory successfully Apr 24 23:59:11.438162 kubelet[2807]: I0424 23:59:11.438108 2807 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv4" Apr 24 23:59:11.439877 kubelet[2807]: I0424 23:59:11.439861 2807 kubelet_network_linux.go:54] "Initialized iptables rules." 
protocol="IPv6" Apr 24 23:59:11.441820 kubelet[2807]: I0424 23:59:11.441429 2807 status_manager.go:249] "Starting to sync pod status with apiserver" Apr 24 23:59:11.441820 kubelet[2807]: I0424 23:59:11.441472 2807 kubelet.go:2501] "Starting kubelet main sync loop" Apr 24 23:59:11.441820 kubelet[2807]: E0424 23:59:11.441534 2807 kubelet.go:2525] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Apr 24 23:59:11.450425 kubelet[2807]: E0424 23:59:11.450391 2807 kubelet.go:1656] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Apr 24 23:59:11.466324 kubelet[2807]: I0424 23:59:11.466302 2807 cpu_manager.go:225] "Starting" policy="none" Apr 24 23:59:11.466825 kubelet[2807]: I0424 23:59:11.466468 2807 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Apr 24 23:59:11.466825 kubelet[2807]: I0424 23:59:11.466495 2807 state_mem.go:41] "Initialized" logger="CPUManager state checkpoint.CPUManager state memory" Apr 24 23:59:11.470882 kubelet[2807]: I0424 23:59:11.470850 2807 policy_none.go:50] "Start" Apr 24 23:59:11.470882 kubelet[2807]: I0424 23:59:11.470872 2807 memory_manager.go:187] "Starting memorymanager" policy="None" Apr 24 23:59:11.470882 kubelet[2807]: I0424 23:59:11.470886 2807 state_mem.go:36] "Initializing new in-memory state store" logger="Memory Manager state checkpoint" Apr 24 23:59:11.473902 kubelet[2807]: I0424 23:59:11.473873 2807 policy_none.go:44] "Start" Apr 24 23:59:11.478612 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Apr 24 23:59:11.487087 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Apr 24 23:59:11.491466 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. 
Apr 24 23:59:11.501923 kubelet[2807]: E0424 23:59:11.501891 2807 manager.go:525] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 24 23:59:11.502160 kubelet[2807]: I0424 23:59:11.502142 2807 eviction_manager.go:194] "Eviction manager: starting control loop"
Apr 24 23:59:11.502230 kubelet[2807]: I0424 23:59:11.502161 2807 container_log_manager.go:146] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 24 23:59:11.502775 kubelet[2807]: I0424 23:59:11.502701 2807 plugin_manager.go:121] "Starting Kubelet Plugin Manager"
Apr 24 23:59:11.504655 kubelet[2807]: E0424 23:59:11.504450 2807 eviction_manager.go:272] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime"
Apr 24 23:59:11.504655 kubelet[2807]: E0424 23:59:11.504494 2807 eviction_manager.go:297] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-172-31-31-110\" not found"
Apr 24 23:59:11.555831 systemd[1]: Created slice kubepods-burstable-podd8fac2cc404e1e6c526ab49980b493d0.slice - libcontainer container kubepods-burstable-podd8fac2cc404e1e6c526ab49980b493d0.slice.
Apr 24 23:59:11.570555 kubelet[2807]: E0424 23:59:11.570505 2807 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-172-31-31-110\" not found" node="ip-172-31-31-110"
Apr 24 23:59:11.574530 systemd[1]: Created slice kubepods-burstable-pod5a1c8eb3a314b4cc99733858627d4848.slice - libcontainer container kubepods-burstable-pod5a1c8eb3a314b4cc99733858627d4848.slice.
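With the systemd cgroup driver, each pod's slice name is derived from its QoS class and pod UID, as in the "Created slice kubepods-burstable-pod….slice" entries above. A sketch of that naming (my own helper, not kubelet code; note that dashes in a UID are escaped to underscores, though the static pod UIDs in this log happen to contain none):

```python
def pod_slice(qos: str, uid: str) -> str:
    """Build the systemd slice name the kubelet uses for a pod.

    Dashes in the UID are escaped to underscores because systemd treats
    '-' as the slice hierarchy separator; guaranteed pods sit directly
    under kubepods.slice, burstable/besteffort pods under a QoS sub-slice.
    """
    esc = uid.replace("-", "_")
    prefix = "kubepods" if qos == "guaranteed" else f"kubepods-{qos}"
    return f"{prefix}-pod{esc}.slice"

# The kube-apiserver static pod from the entries above:
name = pod_slice("burstable", "d8fac2cc404e1e6c526ab49980b493d0")
# → kubepods-burstable-podd8fac2cc404e1e6c526ab49980b493d0.slice
```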
Apr 24 23:59:11.579957 kubelet[2807]: E0424 23:59:11.579918 2807 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-172-31-31-110\" not found" node="ip-172-31-31-110"
Apr 24 23:59:11.581982 systemd[1]: Created slice kubepods-burstable-podf42206aab93a5b9440ca1d36d5fc33be.slice - libcontainer container kubepods-burstable-podf42206aab93a5b9440ca1d36d5fc33be.slice.
Apr 24 23:59:11.586365 kubelet[2807]: E0424 23:59:11.586330 2807 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-172-31-31-110\" not found" node="ip-172-31-31-110"
Apr 24 23:59:11.604084 kubelet[2807]: I0424 23:59:11.604023 2807 kubelet_node_status.go:74] "Attempting to register node" node="ip-172-31-31-110"
Apr 24 23:59:11.604454 kubelet[2807]: E0424 23:59:11.604422 2807 kubelet_node_status.go:106] "Unable to register node with API server" err="Post \"https://172.31.31.110:6443/api/v1/nodes\": dial tcp 172.31.31.110:6443: connect: connection refused" node="ip-172-31-31-110"
Apr 24 23:59:11.627394 kubelet[2807]: E0424 23:59:11.627341 2807 controller.go:201] "Failed to ensure lease exists, will retry" err="Get \"https://172.31.31.110:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ip-172-31-31-110?timeout=10s\": dial tcp 172.31.31.110:6443: connect: connection refused" interval="400ms"
Apr 24 23:59:11.722549 kubelet[2807]: I0424 23:59:11.722502 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/d8fac2cc404e1e6c526ab49980b493d0-ca-certs\") pod \"kube-apiserver-ip-172-31-31-110\" (UID: \"d8fac2cc404e1e6c526ab49980b493d0\") " pod="kube-system/kube-apiserver-ip-172-31-31-110"
Apr 24 23:59:11.722827 kubelet[2807]: I0424 23:59:11.722557 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\"
(UniqueName: \"kubernetes.io/host-path/d8fac2cc404e1e6c526ab49980b493d0-usr-share-ca-certificates\") pod \"kube-apiserver-ip-172-31-31-110\" (UID: \"d8fac2cc404e1e6c526ab49980b493d0\") " pod="kube-system/kube-apiserver-ip-172-31-31-110" Apr 24 23:59:11.722827 kubelet[2807]: I0424 23:59:11.722598 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/5a1c8eb3a314b4cc99733858627d4848-ca-certs\") pod \"kube-controller-manager-ip-172-31-31-110\" (UID: \"5a1c8eb3a314b4cc99733858627d4848\") " pod="kube-system/kube-controller-manager-ip-172-31-31-110" Apr 24 23:59:11.722827 kubelet[2807]: I0424 23:59:11.722623 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/5a1c8eb3a314b4cc99733858627d4848-flexvolume-dir\") pod \"kube-controller-manager-ip-172-31-31-110\" (UID: \"5a1c8eb3a314b4cc99733858627d4848\") " pod="kube-system/kube-controller-manager-ip-172-31-31-110" Apr 24 23:59:11.722827 kubelet[2807]: I0424 23:59:11.722649 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/5a1c8eb3a314b4cc99733858627d4848-usr-share-ca-certificates\") pod \"kube-controller-manager-ip-172-31-31-110\" (UID: \"5a1c8eb3a314b4cc99733858627d4848\") " pod="kube-system/kube-controller-manager-ip-172-31-31-110" Apr 24 23:59:11.722827 kubelet[2807]: I0424 23:59:11.722674 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/f42206aab93a5b9440ca1d36d5fc33be-kubeconfig\") pod \"kube-scheduler-ip-172-31-31-110\" (UID: \"f42206aab93a5b9440ca1d36d5fc33be\") " pod="kube-system/kube-scheduler-ip-172-31-31-110" Apr 24 23:59:11.722983 kubelet[2807]: I0424 23:59:11.722702 2807 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/d8fac2cc404e1e6c526ab49980b493d0-k8s-certs\") pod \"kube-apiserver-ip-172-31-31-110\" (UID: \"d8fac2cc404e1e6c526ab49980b493d0\") " pod="kube-system/kube-apiserver-ip-172-31-31-110" Apr 24 23:59:11.722983 kubelet[2807]: I0424 23:59:11.722726 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/5a1c8eb3a314b4cc99733858627d4848-k8s-certs\") pod \"kube-controller-manager-ip-172-31-31-110\" (UID: \"5a1c8eb3a314b4cc99733858627d4848\") " pod="kube-system/kube-controller-manager-ip-172-31-31-110" Apr 24 23:59:11.722983 kubelet[2807]: I0424 23:59:11.722747 2807 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/5a1c8eb3a314b4cc99733858627d4848-kubeconfig\") pod \"kube-controller-manager-ip-172-31-31-110\" (UID: \"5a1c8eb3a314b4cc99733858627d4848\") " pod="kube-system/kube-controller-manager-ip-172-31-31-110" Apr 24 23:59:11.806122 kubelet[2807]: I0424 23:59:11.806085 2807 kubelet_node_status.go:74] "Attempting to register node" node="ip-172-31-31-110" Apr 24 23:59:11.806482 kubelet[2807]: E0424 23:59:11.806439 2807 kubelet_node_status.go:106] "Unable to register node with API server" err="Post \"https://172.31.31.110:6443/api/v1/nodes\": dial tcp 172.31.31.110:6443: connect: connection refused" node="ip-172-31-31-110" Apr 24 23:59:11.874641 containerd[1974]: time="2026-04-24T23:59:11.874584030Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ip-172-31-31-110,Uid:d8fac2cc404e1e6c526ab49980b493d0,Namespace:kube-system,Attempt:0,}" Apr 24 23:59:11.882996 containerd[1974]: time="2026-04-24T23:59:11.882943425Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:kube-controller-manager-ip-172-31-31-110,Uid:5a1c8eb3a314b4cc99733858627d4848,Namespace:kube-system,Attempt:0,}"
Apr 24 23:59:11.889435 containerd[1974]: time="2026-04-24T23:59:11.889393593Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ip-172-31-31-110,Uid:f42206aab93a5b9440ca1d36d5fc33be,Namespace:kube-system,Attempt:0,}"
Apr 24 23:59:12.028515 kubelet[2807]: E0424 23:59:12.028379 2807 controller.go:201] "Failed to ensure lease exists, will retry" err="Get \"https://172.31.31.110:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ip-172-31-31-110?timeout=10s\": dial tcp 172.31.31.110:6443: connect: connection refused" interval="800ms"
Apr 24 23:59:12.208692 kubelet[2807]: I0424 23:59:12.208660 2807 kubelet_node_status.go:74] "Attempting to register node" node="ip-172-31-31-110"
Apr 24 23:59:12.209140 kubelet[2807]: E0424 23:59:12.209043 2807 kubelet_node_status.go:106] "Unable to register node with API server" err="Post \"https://172.31.31.110:6443/api/v1/nodes\": dial tcp 172.31.31.110:6443: connect: connection refused" node="ip-172-31-31-110"
Apr 24 23:59:12.321733 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount166997629.mount: Deactivated successfully.
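The "Failed to ensure lease exists, will retry" entries in this log back off by doubling: interval="200ms", then "400ms", then "800ms". A minimal sketch of that doubling schedule (the cap value is an illustrative assumption, not taken from the log):

```python
from itertools import islice

def backoff_intervals(base_ms: int = 200, factor: int = 2, cap_ms: int = 7000):
    """Yield retry intervals that double each attempt, capped at cap_ms,
    matching the 200ms → 400ms → 800ms progression seen in the log."""
    interval = base_ms
    while True:
        yield min(interval, cap_ms)
        interval *= factor

first = list(islice(backoff_intervals(), 6))
# → [200, 400, 800, 1600, 3200, 6400]
```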
Apr 24 23:59:12.332439 containerd[1974]: time="2026-04-24T23:59:12.332380624Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Apr 24 23:59:12.333514 containerd[1974]: time="2026-04-24T23:59:12.333463431Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=312056" Apr 24 23:59:12.334920 containerd[1974]: time="2026-04-24T23:59:12.334881919Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Apr 24 23:59:12.336447 containerd[1974]: time="2026-04-24T23:59:12.336415210Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Apr 24 23:59:12.337839 containerd[1974]: time="2026-04-24T23:59:12.337775796Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0" Apr 24 23:59:12.339325 containerd[1974]: time="2026-04-24T23:59:12.339286657Z" level=info msg="ImageCreate event name:\"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Apr 24 23:59:12.340851 containerd[1974]: time="2026-04-24T23:59:12.340710179Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0" Apr 24 23:59:12.344013 containerd[1974]: time="2026-04-24T23:59:12.343976829Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Apr 24 23:59:12.346879 
containerd[1974]: time="2026-04-24T23:59:12.344892643Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 461.865263ms" Apr 24 23:59:12.347211 containerd[1974]: time="2026-04-24T23:59:12.347176778Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 457.700561ms" Apr 24 23:59:12.350591 containerd[1974]: time="2026-04-24T23:59:12.350555304Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 475.871188ms" Apr 24 23:59:12.600604 containerd[1974]: time="2026-04-24T23:59:12.600417714Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Apr 24 23:59:12.601056 containerd[1974]: time="2026-04-24T23:59:12.600812742Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Apr 24 23:59:12.601296 containerd[1974]: time="2026-04-24T23:59:12.600887493Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 24 23:59:12.601296 containerd[1974]: time="2026-04-24T23:59:12.601003333Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 24 23:59:12.607502 containerd[1974]: time="2026-04-24T23:59:12.607200135Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Apr 24 23:59:12.607502 containerd[1974]: time="2026-04-24T23:59:12.607263789Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Apr 24 23:59:12.607502 containerd[1974]: time="2026-04-24T23:59:12.607287369Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 24 23:59:12.607502 containerd[1974]: time="2026-04-24T23:59:12.607360255Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 24 23:59:12.623727 containerd[1974]: time="2026-04-24T23:59:12.623367754Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Apr 24 23:59:12.624028 containerd[1974]: time="2026-04-24T23:59:12.623909236Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Apr 24 23:59:12.624243 containerd[1974]: time="2026-04-24T23:59:12.624009094Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 24 23:59:12.624465 containerd[1974]: time="2026-04-24T23:59:12.624431888Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 24 23:59:12.649404 systemd[1]: Started cri-containerd-5dd646876710cede4cb79e81db6e93617b9c6753564ad2d2f6ff070bc89e8252.scope - libcontainer container 5dd646876710cede4cb79e81db6e93617b9c6753564ad2d2f6ff070bc89e8252. 
Apr 24 23:59:12.652240 systemd[1]: Started cri-containerd-c387f145c3c4fed47b925953b0430136e34b31b3dfe77831c7b274e29c0748ad.scope - libcontainer container c387f145c3c4fed47b925953b0430136e34b31b3dfe77831c7b274e29c0748ad. Apr 24 23:59:12.670064 systemd[1]: Started cri-containerd-649415c480f9cbf9c235f11a830acda218ee254dbb0765dd5d461880729993aa.scope - libcontainer container 649415c480f9cbf9c235f11a830acda218ee254dbb0765dd5d461880729993aa. Apr 24 23:59:12.746726 containerd[1974]: time="2026-04-24T23:59:12.746474537Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ip-172-31-31-110,Uid:d8fac2cc404e1e6c526ab49980b493d0,Namespace:kube-system,Attempt:0,} returns sandbox id \"c387f145c3c4fed47b925953b0430136e34b31b3dfe77831c7b274e29c0748ad\"" Apr 24 23:59:12.762179 containerd[1974]: time="2026-04-24T23:59:12.762138237Z" level=info msg="CreateContainer within sandbox \"c387f145c3c4fed47b925953b0430136e34b31b3dfe77831c7b274e29c0748ad\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Apr 24 23:59:12.779833 containerd[1974]: time="2026-04-24T23:59:12.778078574Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ip-172-31-31-110,Uid:5a1c8eb3a314b4cc99733858627d4848,Namespace:kube-system,Attempt:0,} returns sandbox id \"649415c480f9cbf9c235f11a830acda218ee254dbb0765dd5d461880729993aa\"" Apr 24 23:59:12.784889 containerd[1974]: time="2026-04-24T23:59:12.784756869Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ip-172-31-31-110,Uid:f42206aab93a5b9440ca1d36d5fc33be,Namespace:kube-system,Attempt:0,} returns sandbox id \"5dd646876710cede4cb79e81db6e93617b9c6753564ad2d2f6ff070bc89e8252\"" Apr 24 23:59:12.789056 containerd[1974]: time="2026-04-24T23:59:12.789010315Z" level=info msg="CreateContainer within sandbox \"649415c480f9cbf9c235f11a830acda218ee254dbb0765dd5d461880729993aa\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Apr 24 23:59:12.793626 
containerd[1974]: time="2026-04-24T23:59:12.793528496Z" level=info msg="CreateContainer within sandbox \"5dd646876710cede4cb79e81db6e93617b9c6753564ad2d2f6ff070bc89e8252\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Apr 24 23:59:12.827326 containerd[1974]: time="2026-04-24T23:59:12.827280434Z" level=info msg="CreateContainer within sandbox \"5dd646876710cede4cb79e81db6e93617b9c6753564ad2d2f6ff070bc89e8252\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"6b89e68433ed79e525e9aa4d163517e371c703cc868aa6e33df1b161bcf93be9\"" Apr 24 23:59:12.829347 kubelet[2807]: E0424 23:59:12.829186 2807 controller.go:201] "Failed to ensure lease exists, will retry" err="Get \"https://172.31.31.110:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ip-172-31-31-110?timeout=10s\": dial tcp 172.31.31.110:6443: connect: connection refused" interval="1.6s" Apr 24 23:59:12.829446 containerd[1974]: time="2026-04-24T23:59:12.829249661Z" level=info msg="CreateContainer within sandbox \"c387f145c3c4fed47b925953b0430136e34b31b3dfe77831c7b274e29c0748ad\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"04ff7b3af2a5e9e7f8ec7277e094f981a40ac69eb27638cdfab90f70f31d4db8\"" Apr 24 23:59:12.830873 containerd[1974]: time="2026-04-24T23:59:12.829873011Z" level=info msg="StartContainer for \"04ff7b3af2a5e9e7f8ec7277e094f981a40ac69eb27638cdfab90f70f31d4db8\"" Apr 24 23:59:12.833820 containerd[1974]: time="2026-04-24T23:59:12.832944249Z" level=info msg="CreateContainer within sandbox \"649415c480f9cbf9c235f11a830acda218ee254dbb0765dd5d461880729993aa\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"c418045851606f1ef65eca56b3572f07751409a908e3af94b9f106a0411550d9\"" Apr 24 23:59:12.833820 containerd[1974]: time="2026-04-24T23:59:12.833108530Z" level=info msg="StartContainer for \"6b89e68433ed79e525e9aa4d163517e371c703cc868aa6e33df1b161bcf93be9\"" Apr 24 23:59:12.847133 
containerd[1974]: time="2026-04-24T23:59:12.847083751Z" level=info msg="StartContainer for \"c418045851606f1ef65eca56b3572f07751409a908e3af94b9f106a0411550d9\"" Apr 24 23:59:12.871548 systemd[1]: Started cri-containerd-04ff7b3af2a5e9e7f8ec7277e094f981a40ac69eb27638cdfab90f70f31d4db8.scope - libcontainer container 04ff7b3af2a5e9e7f8ec7277e094f981a40ac69eb27638cdfab90f70f31d4db8. Apr 24 23:59:12.883083 systemd[1]: Started cri-containerd-6b89e68433ed79e525e9aa4d163517e371c703cc868aa6e33df1b161bcf93be9.scope - libcontainer container 6b89e68433ed79e525e9aa4d163517e371c703cc868aa6e33df1b161bcf93be9. Apr 24 23:59:12.924006 systemd[1]: Started cri-containerd-c418045851606f1ef65eca56b3572f07751409a908e3af94b9f106a0411550d9.scope - libcontainer container c418045851606f1ef65eca56b3572f07751409a908e3af94b9f106a0411550d9. Apr 24 23:59:12.960966 containerd[1974]: time="2026-04-24T23:59:12.960916415Z" level=info msg="StartContainer for \"04ff7b3af2a5e9e7f8ec7277e094f981a40ac69eb27638cdfab90f70f31d4db8\" returns successfully" Apr 24 23:59:13.013823 kubelet[2807]: I0424 23:59:13.013655 2807 kubelet_node_status.go:74] "Attempting to register node" node="ip-172-31-31-110" Apr 24 23:59:13.015190 kubelet[2807]: E0424 23:59:13.015082 2807 kubelet_node_status.go:106] "Unable to register node with API server" err="Post \"https://172.31.31.110:6443/api/v1/nodes\": dial tcp 172.31.31.110:6443: connect: connection refused" node="ip-172-31-31-110" Apr 24 23:59:13.025402 containerd[1974]: time="2026-04-24T23:59:13.025356435Z" level=info msg="StartContainer for \"6b89e68433ed79e525e9aa4d163517e371c703cc868aa6e33df1b161bcf93be9\" returns successfully" Apr 24 23:59:13.036005 containerd[1974]: time="2026-04-24T23:59:13.035958988Z" level=info msg="StartContainer for \"c418045851606f1ef65eca56b3572f07751409a908e3af94b9f106a0411550d9\" returns successfully" Apr 24 23:59:13.474424 kubelet[2807]: E0424 23:59:13.474390 2807 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info 
from the cluster" err="node \"ip-172-31-31-110\" not found" node="ip-172-31-31-110" Apr 24 23:59:13.478809 kubelet[2807]: E0424 23:59:13.478757 2807 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-172-31-31-110\" not found" node="ip-172-31-31-110" Apr 24 23:59:13.482155 kubelet[2807]: E0424 23:59:13.482124 2807 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-172-31-31-110\" not found" node="ip-172-31-31-110" Apr 24 23:59:14.458037 kubelet[2807]: E0424 23:59:14.457985 2807 nodelease.go:50] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-172-31-31-110\" not found" node="ip-172-31-31-110" Apr 24 23:59:14.483204 kubelet[2807]: E0424 23:59:14.483175 2807 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-172-31-31-110\" not found" node="ip-172-31-31-110" Apr 24 23:59:14.483629 kubelet[2807]: E0424 23:59:14.483493 2807 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-172-31-31-110\" not found" node="ip-172-31-31-110" Apr 24 23:59:14.618332 kubelet[2807]: I0424 23:59:14.618295 2807 kubelet_node_status.go:74] "Attempting to register node" node="ip-172-31-31-110" Apr 24 23:59:14.648465 kubelet[2807]: I0424 23:59:14.647428 2807 kubelet_node_status.go:77] "Successfully registered node" node="ip-172-31-31-110" Apr 24 23:59:14.648465 kubelet[2807]: E0424 23:59:14.647474 2807 kubelet_node_status.go:474] "Error updating node status, will retry" err="error getting node \"ip-172-31-31-110\": node \"ip-172-31-31-110\" not found" Apr 24 23:59:14.662288 kubelet[2807]: E0424 23:59:14.662233 2807 kubelet_node_status.go:392] "Error getting the current node from lister" err="node \"ip-172-31-31-110\" not found" Apr 24 23:59:14.762848 kubelet[2807]: E0424 23:59:14.762706 2807 kubelet_node_status.go:392] 
"Error getting the current node from lister" err="node \"ip-172-31-31-110\" not found" Apr 24 23:59:14.863854 kubelet[2807]: E0424 23:59:14.863809 2807 kubelet_node_status.go:392] "Error getting the current node from lister" err="node \"ip-172-31-31-110\" not found" Apr 24 23:59:14.964008 kubelet[2807]: E0424 23:59:14.963962 2807 kubelet_node_status.go:392] "Error getting the current node from lister" err="node \"ip-172-31-31-110\" not found" Apr 24 23:59:15.064873 kubelet[2807]: E0424 23:59:15.064700 2807 kubelet_node_status.go:392] "Error getting the current node from lister" err="node \"ip-172-31-31-110\" not found" Apr 24 23:59:15.165813 kubelet[2807]: E0424 23:59:15.165744 2807 kubelet_node_status.go:392] "Error getting the current node from lister" err="node \"ip-172-31-31-110\" not found" Apr 24 23:59:15.266831 kubelet[2807]: E0424 23:59:15.266771 2807 kubelet_node_status.go:392] "Error getting the current node from lister" err="node \"ip-172-31-31-110\" not found" Apr 24 23:59:15.367857 kubelet[2807]: E0424 23:59:15.367738 2807 kubelet_node_status.go:392] "Error getting the current node from lister" err="node \"ip-172-31-31-110\" not found" Apr 24 23:59:15.467933 kubelet[2807]: E0424 23:59:15.467883 2807 kubelet_node_status.go:392] "Error getting the current node from lister" err="node \"ip-172-31-31-110\" not found" Apr 24 23:59:15.486214 kubelet[2807]: E0424 23:59:15.486179 2807 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-172-31-31-110\" not found" node="ip-172-31-31-110" Apr 24 23:59:15.568452 kubelet[2807]: E0424 23:59:15.568406 2807 kubelet_node_status.go:392] "Error getting the current node from lister" err="node \"ip-172-31-31-110\" not found" Apr 24 23:59:15.622021 kubelet[2807]: I0424 23:59:15.621907 2807 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ip-172-31-31-110" Apr 24 23:59:15.640287 kubelet[2807]: I0424 23:59:15.640246 2807 
kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ip-172-31-31-110" Apr 24 23:59:15.654568 kubelet[2807]: I0424 23:59:15.652193 2807 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ip-172-31-31-110" Apr 24 23:59:16.383660 kubelet[2807]: I0424 23:59:16.383615 2807 apiserver.go:52] "Watching apiserver" Apr 24 23:59:16.421484 kubelet[2807]: I0424 23:59:16.421448 2807 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Apr 24 23:59:16.630565 systemd[1]: Reloading requested from client PID 3088 ('systemctl') (unit session-7.scope)... Apr 24 23:59:16.630584 systemd[1]: Reloading... Apr 24 23:59:16.734904 zram_generator::config[3128]: No configuration found. Apr 24 23:59:16.872878 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Apr 24 23:59:16.975886 systemd[1]: Reloading finished in 344 ms. Apr 24 23:59:17.022288 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Apr 24 23:59:17.036900 systemd[1]: kubelet.service: Deactivated successfully. Apr 24 23:59:17.037145 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Apr 24 23:59:17.050384 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Apr 24 23:59:17.313299 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Apr 24 23:59:17.325391 (kubelet)[3188]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Apr 24 23:59:17.396689 kubelet[3188]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Apr 24 23:59:17.411644 kubelet[3188]: I0424 23:59:17.411588 3188 server.go:525] "Kubelet version" kubeletVersion="v1.35.1" Apr 24 23:59:17.411644 kubelet[3188]: I0424 23:59:17.411636 3188 server.go:527] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 24 23:59:17.413410 kubelet[3188]: I0424 23:59:17.413381 3188 watchdog_linux.go:95] "Systemd watchdog is not enabled" Apr 24 23:59:17.413410 kubelet[3188]: I0424 23:59:17.413408 3188 watchdog_linux.go:138] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 24 23:59:17.413854 kubelet[3188]: I0424 23:59:17.413831 3188 server.go:951] "Client rotation is on, will bootstrap in background" Apr 24 23:59:17.415403 kubelet[3188]: I0424 23:59:17.415370 3188 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-client-current.pem" Apr 24 23:59:17.430481 kubelet[3188]: I0424 23:59:17.430428 3188 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Apr 24 23:59:17.438915 kubelet[3188]: E0424 23:59:17.438861 3188 log.go:32] "RuntimeConfig from runtime service failed" err="rpc error: code = Unimplemented desc = unknown method RuntimeConfig for service runtime.v1.RuntimeService" Apr 24 23:59:17.439062 kubelet[3188]: I0424 23:59:17.438942 3188 server.go:1395] "CRI implementation should be updated to support RuntimeConfig. Falling back to using cgroupDriver from kubelet config." Apr 24 23:59:17.443436 kubelet[3188]: I0424 23:59:17.443408 3188 server.go:775] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
Defaulting to /" Apr 24 23:59:17.443709 kubelet[3188]: I0424 23:59:17.443659 3188 container_manager_linux.go:272] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 24 23:59:17.447414 kubelet[3188]: I0424 23:59:17.443702 3188 container_manager_linux.go:277] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-172-31-31-110","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 24 23:59:17.447414 kubelet[3188]: I0424 23:59:17.446862 3188 topology_manager.go:143] "Creating topology manager with none policy" Apr 24 
23:59:17.447414 kubelet[3188]: I0424 23:59:17.446877 3188 container_manager_linux.go:308] "Creating device plugin manager" Apr 24 23:59:17.447414 kubelet[3188]: I0424 23:59:17.446911 3188 container_manager_linux.go:317] "Creating Dynamic Resource Allocation (DRA) manager" Apr 24 23:59:17.447414 kubelet[3188]: I0424 23:59:17.447184 3188 state_mem.go:41] "Initialized" logger="CPUManager state memory" Apr 24 23:59:17.449629 kubelet[3188]: I0424 23:59:17.449596 3188 kubelet.go:482] "Attempting to sync node with API server" Apr 24 23:59:17.449629 kubelet[3188]: I0424 23:59:17.449628 3188 kubelet.go:383] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 24 23:59:17.449760 kubelet[3188]: I0424 23:59:17.449654 3188 kubelet.go:394] "Adding apiserver pod source" Apr 24 23:59:17.449760 kubelet[3188]: I0424 23:59:17.449668 3188 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 24 23:59:17.461989 kubelet[3188]: I0424 23:59:17.461954 3188 kuberuntime_manager.go:294] "Container runtime initialized" containerRuntime="containerd" version="v1.7.21" apiVersion="v1" Apr 24 23:59:17.468482 kubelet[3188]: I0424 23:59:17.468216 3188 kubelet.go:943] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 24 23:59:17.468482 kubelet[3188]: I0424 23:59:17.468295 3188 kubelet.go:970] "Not starting PodCertificateRequest manager because we are in static kubelet mode or the PodCertificateProjection feature gate is disabled" Apr 24 23:59:17.473775 kubelet[3188]: I0424 23:59:17.473752 3188 server.go:1257] "Started kubelet" Apr 24 23:59:17.476219 kubelet[3188]: I0424 23:59:17.476060 3188 fs_resource_analyzer.go:69] "Starting FS ResourceAnalyzer" Apr 24 23:59:17.477387 kubelet[3188]: I0424 23:59:17.477330 3188 server.go:182] "Starting to listen" address="0.0.0.0" port=10250 Apr 24 23:59:17.479477 kubelet[3188]: I0424 23:59:17.478528 3188 server.go:317] "Adding debug handlers 
to kubelet server" Apr 24 23:59:17.486921 kubelet[3188]: I0424 23:59:17.486852 3188 ratelimit.go:56] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 24 23:59:17.487054 kubelet[3188]: I0424 23:59:17.486943 3188 server_v1.go:49] "podresources" method="list" useActivePods=true Apr 24 23:59:17.487144 kubelet[3188]: I0424 23:59:17.487124 3188 server.go:254] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 24 23:59:17.487695 kubelet[3188]: I0424 23:59:17.487661 3188 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Apr 24 23:59:17.492628 kubelet[3188]: I0424 23:59:17.491725 3188 volume_manager.go:311] "Starting Kubelet Volume Manager" Apr 24 23:59:17.494858 kubelet[3188]: I0424 23:59:17.494712 3188 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Apr 24 23:59:17.495299 kubelet[3188]: I0424 23:59:17.495286 3188 reconciler.go:29] "Reconciler: start to sync state" Apr 24 23:59:17.503301 kubelet[3188]: I0424 23:59:17.503267 3188 factory.go:223] Registration of the containerd container factory successfully Apr 24 23:59:17.503755 kubelet[3188]: I0424 23:59:17.503475 3188 factory.go:223] Registration of the systemd container factory successfully Apr 24 23:59:17.503755 kubelet[3188]: I0424 23:59:17.503595 3188 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Apr 24 23:59:17.527870 kubelet[3188]: I0424 23:59:17.527827 3188 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv4" Apr 24 23:59:17.530293 kubelet[3188]: I0424 23:59:17.529911 3188 kubelet_network_linux.go:54] "Initialized iptables rules." 
protocol="IPv6" Apr 24 23:59:17.530293 kubelet[3188]: I0424 23:59:17.529938 3188 status_manager.go:249] "Starting to sync pod status with apiserver" Apr 24 23:59:17.530293 kubelet[3188]: I0424 23:59:17.529966 3188 kubelet.go:2501] "Starting kubelet main sync loop" Apr 24 23:59:17.530293 kubelet[3188]: E0424 23:59:17.530029 3188 kubelet.go:2525] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Apr 24 23:59:17.559651 kubelet[3188]: I0424 23:59:17.559607 3188 cpu_manager.go:225] "Starting" policy="none" Apr 24 23:59:17.559651 kubelet[3188]: I0424 23:59:17.559624 3188 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Apr 24 23:59:17.559651 kubelet[3188]: I0424 23:59:17.559648 3188 state_mem.go:41] "Initialized" logger="CPUManager state checkpoint.CPUManager state memory" Apr 24 23:59:17.561816 kubelet[3188]: I0424 23:59:17.560197 3188 state_mem.go:94] "Updated default CPUSet" logger="CPUManager state checkpoint.CPUManager state memory" cpuSet="" Apr 24 23:59:17.561816 kubelet[3188]: I0424 23:59:17.560219 3188 state_mem.go:102] "Updated CPUSet assignments" logger="CPUManager state checkpoint.CPUManager state memory" assignments={} Apr 24 23:59:17.561816 kubelet[3188]: I0424 23:59:17.560241 3188 policy_none.go:50] "Start" Apr 24 23:59:17.561816 kubelet[3188]: I0424 23:59:17.560252 3188 memory_manager.go:187] "Starting memorymanager" policy="None" Apr 24 23:59:17.561816 kubelet[3188]: I0424 23:59:17.560266 3188 state_mem.go:36] "Initializing new in-memory state store" logger="Memory Manager state checkpoint" Apr 24 23:59:17.562573 kubelet[3188]: I0424 23:59:17.562549 3188 state_mem.go:77] "Updated machine memory state" logger="Memory Manager state checkpoint" Apr 24 23:59:17.562634 kubelet[3188]: I0424 23:59:17.562581 3188 policy_none.go:44] "Start" Apr 24 23:59:17.571845 kubelet[3188]: E0424 23:59:17.569679 3188 manager.go:525] "Failed to read data from checkpoint" 
err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 24 23:59:17.571845 kubelet[3188]: I0424 23:59:17.569926 3188 eviction_manager.go:194] "Eviction manager: starting control loop" Apr 24 23:59:17.571845 kubelet[3188]: I0424 23:59:17.569945 3188 container_log_manager.go:146] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 24 23:59:17.571845 kubelet[3188]: I0424 23:59:17.570404 3188 plugin_manager.go:121] "Starting Kubelet Plugin Manager" Apr 24 23:59:17.576086 kubelet[3188]: E0424 23:59:17.575643 3188 eviction_manager.go:272] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Apr 24 23:59:17.631945 kubelet[3188]: I0424 23:59:17.631101 3188 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ip-172-31-31-110" Apr 24 23:59:17.631945 kubelet[3188]: I0424 23:59:17.631863 3188 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ip-172-31-31-110" Apr 24 23:59:17.633675 kubelet[3188]: I0424 23:59:17.633027 3188 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ip-172-31-31-110" Apr 24 23:59:17.643802 kubelet[3188]: E0424 23:59:17.643752 3188 kubelet.go:3342] "Failed creating a mirror pod" err="pods \"kube-apiserver-ip-172-31-31-110\" already exists" pod="kube-system/kube-apiserver-ip-172-31-31-110" Apr 24 23:59:17.644013 kubelet[3188]: E0424 23:59:17.643986 3188 kubelet.go:3342] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ip-172-31-31-110\" already exists" pod="kube-system/kube-controller-manager-ip-172-31-31-110" Apr 24 23:59:17.644636 kubelet[3188]: E0424 23:59:17.644501 3188 kubelet.go:3342] "Failed creating a mirror pod" err="pods \"kube-scheduler-ip-172-31-31-110\" already exists" pod="kube-system/kube-scheduler-ip-172-31-31-110" Apr 24 23:59:17.682462 kubelet[3188]: I0424 23:59:17.681704 3188 
kubelet_node_status.go:74] "Attempting to register node" node="ip-172-31-31-110"
Apr 24 23:59:17.690452 kubelet[3188]: I0424 23:59:17.690412 3188 kubelet_node_status.go:123] "Node was previously registered" node="ip-172-31-31-110"
Apr 24 23:59:17.690574 kubelet[3188]: I0424 23:59:17.690560 3188 kubelet_node_status.go:77] "Successfully registered node" node="ip-172-31-31-110"
Apr 24 23:59:17.697995 kubelet[3188]: I0424 23:59:17.697949 3188 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/5a1c8eb3a314b4cc99733858627d4848-k8s-certs\") pod \"kube-controller-manager-ip-172-31-31-110\" (UID: \"5a1c8eb3a314b4cc99733858627d4848\") " pod="kube-system/kube-controller-manager-ip-172-31-31-110"
Apr 24 23:59:17.697995 kubelet[3188]: I0424 23:59:17.697992 3188 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/f42206aab93a5b9440ca1d36d5fc33be-kubeconfig\") pod \"kube-scheduler-ip-172-31-31-110\" (UID: \"f42206aab93a5b9440ca1d36d5fc33be\") " pod="kube-system/kube-scheduler-ip-172-31-31-110"
Apr 24 23:59:17.698169 kubelet[3188]: I0424 23:59:17.698036 3188 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/d8fac2cc404e1e6c526ab49980b493d0-ca-certs\") pod \"kube-apiserver-ip-172-31-31-110\" (UID: \"d8fac2cc404e1e6c526ab49980b493d0\") " pod="kube-system/kube-apiserver-ip-172-31-31-110"
Apr 24 23:59:17.698169 kubelet[3188]: I0424 23:59:17.698063 3188 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/5a1c8eb3a314b4cc99733858627d4848-ca-certs\") pod \"kube-controller-manager-ip-172-31-31-110\" (UID: \"5a1c8eb3a314b4cc99733858627d4848\") " pod="kube-system/kube-controller-manager-ip-172-31-31-110"
Apr 24 23:59:17.698169 kubelet[3188]: I0424 23:59:17.698098 3188 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/5a1c8eb3a314b4cc99733858627d4848-flexvolume-dir\") pod \"kube-controller-manager-ip-172-31-31-110\" (UID: \"5a1c8eb3a314b4cc99733858627d4848\") " pod="kube-system/kube-controller-manager-ip-172-31-31-110"
Apr 24 23:59:17.698169 kubelet[3188]: I0424 23:59:17.698135 3188 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/5a1c8eb3a314b4cc99733858627d4848-kubeconfig\") pod \"kube-controller-manager-ip-172-31-31-110\" (UID: \"5a1c8eb3a314b4cc99733858627d4848\") " pod="kube-system/kube-controller-manager-ip-172-31-31-110"
Apr 24 23:59:17.698373 kubelet[3188]: I0424 23:59:17.698179 3188 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/5a1c8eb3a314b4cc99733858627d4848-usr-share-ca-certificates\") pod \"kube-controller-manager-ip-172-31-31-110\" (UID: \"5a1c8eb3a314b4cc99733858627d4848\") " pod="kube-system/kube-controller-manager-ip-172-31-31-110"
Apr 24 23:59:17.698373 kubelet[3188]: I0424 23:59:17.698203 3188 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/d8fac2cc404e1e6c526ab49980b493d0-k8s-certs\") pod \"kube-apiserver-ip-172-31-31-110\" (UID: \"d8fac2cc404e1e6c526ab49980b493d0\") " pod="kube-system/kube-apiserver-ip-172-31-31-110"
Apr 24 23:59:17.698373 kubelet[3188]: I0424 23:59:17.698227 3188 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/d8fac2cc404e1e6c526ab49980b493d0-usr-share-ca-certificates\") pod \"kube-apiserver-ip-172-31-31-110\" (UID: \"d8fac2cc404e1e6c526ab49980b493d0\") " pod="kube-system/kube-apiserver-ip-172-31-31-110"
Apr 24 23:59:18.455247 kubelet[3188]: I0424 23:59:18.455189 3188 apiserver.go:52] "Watching apiserver"
Apr 24 23:59:18.495395 kubelet[3188]: I0424 23:59:18.495311 3188 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world"
Apr 24 23:59:18.554205 kubelet[3188]: I0424 23:59:18.553452 3188 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ip-172-31-31-110"
Apr 24 23:59:18.556524 kubelet[3188]: I0424 23:59:18.556075 3188 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="kube-system/kube-scheduler-ip-172-31-31-110" podStartSLOduration=3.5560590530000002 podStartE2EDuration="3.556059053s" podCreationTimestamp="2026-04-24 23:59:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 23:59:18.552412196 +0000 UTC m=+1.221540938" watchObservedRunningTime="2026-04-24 23:59:18.556059053 +0000 UTC m=+1.225187794"
Apr 24 23:59:18.568893 kubelet[3188]: E0424 23:59:18.568518 3188 kubelet.go:3342] "Failed creating a mirror pod" err="pods \"kube-apiserver-ip-172-31-31-110\" already exists" pod="kube-system/kube-apiserver-ip-172-31-31-110"
Apr 24 23:59:18.594264 kubelet[3188]: I0424 23:59:18.593289 3188 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="kube-system/kube-apiserver-ip-172-31-31-110" podStartSLOduration=3.593271552 podStartE2EDuration="3.593271552s" podCreationTimestamp="2026-04-24 23:59:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 23:59:18.577714634 +0000 UTC m=+1.246843376" watchObservedRunningTime="2026-04-24 23:59:18.593271552 +0000 UTC m=+1.262400288"
Apr 24 23:59:18.594264 kubelet[3188]: I0424 23:59:18.593416 3188 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ip-172-31-31-110" podStartSLOduration=3.593409736 podStartE2EDuration="3.593409736s" podCreationTimestamp="2026-04-24 23:59:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 23:59:18.593021923 +0000 UTC m=+1.262150664" watchObservedRunningTime="2026-04-24 23:59:18.593409736 +0000 UTC m=+1.262538477"
Apr 24 23:59:22.051329 kubelet[3188]: I0424 23:59:22.051296 3188 kuberuntime_manager.go:2062] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24"
Apr 24 23:59:22.051925 containerd[1974]: time="2026-04-24T23:59:22.051759852Z" level=info msg="No cni config template is specified, wait for other system components to drop the config."
Apr 24 23:59:22.052366 kubelet[3188]: I0424 23:59:22.052052 3188 kubelet_network.go:47] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24"
Apr 24 23:59:22.378441 systemd[1]: Created slice kubepods-besteffort-pod97ae202f_c560_400f_8a31_bc36888607c7.slice - libcontainer container kubepods-besteffort-pod97ae202f_c560_400f_8a31_bc36888607c7.slice.
Apr 24 23:59:22.432665 kubelet[3188]: I0424 23:59:22.432153 3188 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/97ae202f-c560-400f-8a31-bc36888607c7-kube-proxy\") pod \"kube-proxy-4knbv\" (UID: \"97ae202f-c560-400f-8a31-bc36888607c7\") " pod="kube-system/kube-proxy-4knbv"
Apr 24 23:59:22.432855 kubelet[3188]: I0424 23:59:22.432700 3188 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/97ae202f-c560-400f-8a31-bc36888607c7-xtables-lock\") pod \"kube-proxy-4knbv\" (UID: \"97ae202f-c560-400f-8a31-bc36888607c7\") " pod="kube-system/kube-proxy-4knbv"
Apr 24 23:59:22.432855 kubelet[3188]: I0424 23:59:22.432738 3188 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/97ae202f-c560-400f-8a31-bc36888607c7-lib-modules\") pod \"kube-proxy-4knbv\" (UID: \"97ae202f-c560-400f-8a31-bc36888607c7\") " pod="kube-system/kube-proxy-4knbv"
Apr 24 23:59:22.432855 kubelet[3188]: I0424 23:59:22.432760 3188 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwz52\" (UniqueName: \"kubernetes.io/projected/97ae202f-c560-400f-8a31-bc36888607c7-kube-api-access-rwz52\") pod \"kube-proxy-4knbv\" (UID: \"97ae202f-c560-400f-8a31-bc36888607c7\") " pod="kube-system/kube-proxy-4knbv"
Apr 24 23:59:22.546015 kubelet[3188]: E0424 23:59:22.545962 3188 projected.go:291] Couldn't get configMap kube-system/kube-root-ca.crt: configmap "kube-root-ca.crt" not found
Apr 24 23:59:22.546015 kubelet[3188]: E0424 23:59:22.546019 3188 projected.go:196] Error preparing data for projected volume kube-api-access-rwz52 for pod kube-system/kube-proxy-4knbv: configmap "kube-root-ca.crt" not found
Apr 24 23:59:22.546292 kubelet[3188]: E0424 23:59:22.546116 3188 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/97ae202f-c560-400f-8a31-bc36888607c7-kube-api-access-rwz52 podName:97ae202f-c560-400f-8a31-bc36888607c7 nodeName:}" failed. No retries permitted until 2026-04-24 23:59:23.046086968 +0000 UTC m=+5.715215705 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-rwz52" (UniqueName: "kubernetes.io/projected/97ae202f-c560-400f-8a31-bc36888607c7-kube-api-access-rwz52") pod "kube-proxy-4knbv" (UID: "97ae202f-c560-400f-8a31-bc36888607c7") : configmap "kube-root-ca.crt" not found
Apr 24 23:59:23.254178 systemd[1]: Created slice kubepods-besteffort-pod14c44779_9834_47c6_838b_cb506af23877.slice - libcontainer container kubepods-besteffort-pod14c44779_9834_47c6_838b_cb506af23877.slice.
Apr 24 23:59:23.291241 containerd[1974]: time="2026-04-24T23:59:23.291195829Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-4knbv,Uid:97ae202f-c560-400f-8a31-bc36888607c7,Namespace:kube-system,Attempt:0,}"
Apr 24 23:59:23.320853 containerd[1974]: time="2026-04-24T23:59:23.320711985Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Apr 24 23:59:23.321064 containerd[1974]: time="2026-04-24T23:59:23.320870032Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Apr 24 23:59:23.321064 containerd[1974]: time="2026-04-24T23:59:23.320913102Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Apr 24 23:59:23.321180 containerd[1974]: time="2026-04-24T23:59:23.321060754Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Apr 24 23:59:23.340882 kubelet[3188]: I0424 23:59:23.340016 3188 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmw5n\" (UniqueName: \"kubernetes.io/projected/14c44779-9834-47c6-838b-cb506af23877-kube-api-access-rmw5n\") pod \"tigera-operator-6cf4cccc57-kpv67\" (UID: \"14c44779-9834-47c6-838b-cb506af23877\") " pod="tigera-operator/tigera-operator-6cf4cccc57-kpv67"
Apr 24 23:59:23.340882 kubelet[3188]: I0424 23:59:23.340065 3188 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/14c44779-9834-47c6-838b-cb506af23877-var-lib-calico\") pod \"tigera-operator-6cf4cccc57-kpv67\" (UID: \"14c44779-9834-47c6-838b-cb506af23877\") " pod="tigera-operator/tigera-operator-6cf4cccc57-kpv67"
Apr 24 23:59:23.350046 systemd[1]: Started cri-containerd-a4d0a7614935e81bc70eac77d5d1f4d2a7402711c3667742a57d222407c1ae2f.scope - libcontainer container a4d0a7614935e81bc70eac77d5d1f4d2a7402711c3667742a57d222407c1ae2f.
Apr 24 23:59:23.374603 containerd[1974]: time="2026-04-24T23:59:23.374554600Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-4knbv,Uid:97ae202f-c560-400f-8a31-bc36888607c7,Namespace:kube-system,Attempt:0,} returns sandbox id \"a4d0a7614935e81bc70eac77d5d1f4d2a7402711c3667742a57d222407c1ae2f\""
Apr 24 23:59:23.382212 containerd[1974]: time="2026-04-24T23:59:23.382051607Z" level=info msg="CreateContainer within sandbox \"a4d0a7614935e81bc70eac77d5d1f4d2a7402711c3667742a57d222407c1ae2f\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}"
Apr 24 23:59:23.407288 containerd[1974]: time="2026-04-24T23:59:23.407230671Z" level=info msg="CreateContainer within sandbox \"a4d0a7614935e81bc70eac77d5d1f4d2a7402711c3667742a57d222407c1ae2f\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"313e404ae6d2f9fc752e8eae458a2bb9ac3ad6c9aa18264b57973ff56a9452a1\""
Apr 24 23:59:23.409378 containerd[1974]: time="2026-04-24T23:59:23.408140717Z" level=info msg="StartContainer for \"313e404ae6d2f9fc752e8eae458a2bb9ac3ad6c9aa18264b57973ff56a9452a1\""
Apr 24 23:59:23.438995 systemd[1]: Started cri-containerd-313e404ae6d2f9fc752e8eae458a2bb9ac3ad6c9aa18264b57973ff56a9452a1.scope - libcontainer container 313e404ae6d2f9fc752e8eae458a2bb9ac3ad6c9aa18264b57973ff56a9452a1.
Apr 24 23:59:23.477836 containerd[1974]: time="2026-04-24T23:59:23.477703420Z" level=info msg="StartContainer for \"313e404ae6d2f9fc752e8eae458a2bb9ac3ad6c9aa18264b57973ff56a9452a1\" returns successfully"
Apr 24 23:59:23.563204 containerd[1974]: time="2026-04-24T23:59:23.562578179Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-6cf4cccc57-kpv67,Uid:14c44779-9834-47c6-838b-cb506af23877,Namespace:tigera-operator,Attempt:0,}"
Apr 24 23:59:23.607535 containerd[1974]: time="2026-04-24T23:59:23.607324280Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Apr 24 23:59:23.607535 containerd[1974]: time="2026-04-24T23:59:23.607370150Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Apr 24 23:59:23.607535 containerd[1974]: time="2026-04-24T23:59:23.607380832Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Apr 24 23:59:23.607535 containerd[1974]: time="2026-04-24T23:59:23.607452713Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Apr 24 23:59:23.629061 systemd[1]: Started cri-containerd-f8e424934444519693af946c7b498f7378a7c57527eea5f714afb93db5a846fc.scope - libcontainer container f8e424934444519693af946c7b498f7378a7c57527eea5f714afb93db5a846fc.
Apr 24 23:59:23.679034 containerd[1974]: time="2026-04-24T23:59:23.678995387Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-6cf4cccc57-kpv67,Uid:14c44779-9834-47c6-838b-cb506af23877,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"f8e424934444519693af946c7b498f7378a7c57527eea5f714afb93db5a846fc\""
Apr 24 23:59:23.682193 containerd[1974]: time="2026-04-24T23:59:23.682135095Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.40.7\""
Apr 24 23:59:24.047987 update_engine[1956]: I20260424 23:59:24.047896 1956 update_attempter.cc:509] Updating boot flags...
Apr 24 23:59:24.097826 kernel: BTRFS warning: duplicate device /dev/nvme0n1p3 devid 1 generation 31 scanned by (udev-worker) (3498)
Apr 24 23:59:24.163696 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3430936821.mount: Deactivated successfully.
Apr 24 23:59:24.895145 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2694813886.mount: Deactivated successfully.
Apr 24 23:59:26.424053 containerd[1974]: time="2026-04-24T23:59:26.423998353Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.40.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 24 23:59:26.425328 containerd[1974]: time="2026-04-24T23:59:26.425263083Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.40.7: active requests=0, bytes read=40846156"
Apr 24 23:59:26.426822 containerd[1974]: time="2026-04-24T23:59:26.426750174Z" level=info msg="ImageCreate event name:\"sha256:de04da31b5feb10fd313c39b7ac72d47ce9b5b8eb06161142e2e2283059a52c2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 24 23:59:26.429687 containerd[1974]: time="2026-04-24T23:59:26.429648152Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:53260704fc6e638633b243729411222e01e1898647352a6e1a09cc046887973a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 24 23:59:26.431041 containerd[1974]: time="2026-04-24T23:59:26.430412832Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.40.7\" with image id \"sha256:de04da31b5feb10fd313c39b7ac72d47ce9b5b8eb06161142e2e2283059a52c2\", repo tag \"quay.io/tigera/operator:v1.40.7\", repo digest \"quay.io/tigera/operator@sha256:53260704fc6e638633b243729411222e01e1898647352a6e1a09cc046887973a\", size \"40842151\" in 2.748228616s"
Apr 24 23:59:26.431041 containerd[1974]: time="2026-04-24T23:59:26.430454409Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.40.7\" returns image reference \"sha256:de04da31b5feb10fd313c39b7ac72d47ce9b5b8eb06161142e2e2283059a52c2\""
Apr 24 23:59:26.441077 containerd[1974]: time="2026-04-24T23:59:26.440578935Z" level=info msg="CreateContainer within sandbox \"f8e424934444519693af946c7b498f7378a7c57527eea5f714afb93db5a846fc\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}"
Apr 24 23:59:26.461854 containerd[1974]: time="2026-04-24T23:59:26.461812792Z" level=info msg="CreateContainer within sandbox \"f8e424934444519693af946c7b498f7378a7c57527eea5f714afb93db5a846fc\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"5ac310361b7b1fccffe86a33a8f822fc67184c861964a9fad6873e4268348be6\""
Apr 24 23:59:26.462717 containerd[1974]: time="2026-04-24T23:59:26.462676769Z" level=info msg="StartContainer for \"5ac310361b7b1fccffe86a33a8f822fc67184c861964a9fad6873e4268348be6\""
Apr 24 23:59:26.500878 systemd[1]: run-containerd-runc-k8s.io-5ac310361b7b1fccffe86a33a8f822fc67184c861964a9fad6873e4268348be6-runc.Fm552g.mount: Deactivated successfully.
Apr 24 23:59:26.509067 systemd[1]: Started cri-containerd-5ac310361b7b1fccffe86a33a8f822fc67184c861964a9fad6873e4268348be6.scope - libcontainer container 5ac310361b7b1fccffe86a33a8f822fc67184c861964a9fad6873e4268348be6.
Apr 24 23:59:26.545728 containerd[1974]: time="2026-04-24T23:59:26.545420057Z" level=info msg="StartContainer for \"5ac310361b7b1fccffe86a33a8f822fc67184c861964a9fad6873e4268348be6\" returns successfully"
Apr 24 23:59:26.592758 kubelet[3188]: I0424 23:59:26.592687 3188 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="kube-system/kube-proxy-4knbv" podStartSLOduration=4.592631578 podStartE2EDuration="4.592631578s" podCreationTimestamp="2026-04-24 23:59:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 23:59:23.583876784 +0000 UTC m=+6.253005535" watchObservedRunningTime="2026-04-24 23:59:26.592631578 +0000 UTC m=+9.261760319"
Apr 24 23:59:27.805927 kubelet[3188]: I0424 23:59:27.805862 3188 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="tigera-operator/tigera-operator-6cf4cccc57-kpv67" podStartSLOduration=2.052969248 podStartE2EDuration="4.805848198s" podCreationTimestamp="2026-04-24 23:59:23 +0000 UTC" firstStartedPulling="2026-04-24 23:59:23.681381534 +0000 UTC m=+6.350510260" lastFinishedPulling="2026-04-24 23:59:26.43426049 +0000 UTC m=+9.103389210" observedRunningTime="2026-04-24 23:59:26.59461775 +0000 UTC m=+9.263746492" watchObservedRunningTime="2026-04-24 23:59:27.805848198 +0000 UTC m=+10.474976939"
Apr 24 23:59:33.712924 sudo[2309]: pam_unix(sudo:session): session closed for user root
Apr 24 23:59:33.876468 sshd[2306]: pam_unix(sshd:session): session closed for user core
Apr 24 23:59:33.882028 systemd-logind[1954]: Session 7 logged out. Waiting for processes to exit.
Apr 24 23:59:33.885154 systemd[1]: sshd@6-172.31.31.110:22-4.175.71.9:41602.service: Deactivated successfully.
Apr 24 23:59:33.891263 systemd[1]: session-7.scope: Deactivated successfully.
Apr 24 23:59:33.892892 systemd[1]: session-7.scope: Consumed 4.091s CPU time, 154.6M memory peak, 0B memory swap peak.
Apr 24 23:59:33.896118 systemd-logind[1954]: Removed session 7.
Apr 24 23:59:35.136731 systemd[1]: Created slice kubepods-besteffort-podcc3f2b23_8d92_48da_9f0a_fa4c65caaac6.slice - libcontainer container kubepods-besteffort-podcc3f2b23_8d92_48da_9f0a_fa4c65caaac6.slice.
Apr 24 23:59:35.225764 kubelet[3188]: I0424 23:59:35.225717 3188 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cc3f2b23-8d92-48da-9f0a-fa4c65caaac6-tigera-ca-bundle\") pod \"calico-typha-6cfc9f9dbb-kqn49\" (UID: \"cc3f2b23-8d92-48da-9f0a-fa4c65caaac6\") " pod="calico-system/calico-typha-6cfc9f9dbb-kqn49"
Apr 24 23:59:35.226899 kubelet[3188]: I0424 23:59:35.226707 3188 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wht2g\" (UniqueName: \"kubernetes.io/projected/cc3f2b23-8d92-48da-9f0a-fa4c65caaac6-kube-api-access-wht2g\") pod \"calico-typha-6cfc9f9dbb-kqn49\" (UID: \"cc3f2b23-8d92-48da-9f0a-fa4c65caaac6\") " pod="calico-system/calico-typha-6cfc9f9dbb-kqn49"
Apr 24 23:59:35.226899 kubelet[3188]: I0424 23:59:35.226755 3188 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/cc3f2b23-8d92-48da-9f0a-fa4c65caaac6-typha-certs\") pod \"calico-typha-6cfc9f9dbb-kqn49\" (UID: \"cc3f2b23-8d92-48da-9f0a-fa4c65caaac6\") " pod="calico-system/calico-typha-6cfc9f9dbb-kqn49"
Apr 24 23:59:35.239836 systemd[1]: Created slice kubepods-besteffort-pod37c61967_22ba_4457_998d_9a458b1a6e40.slice - libcontainer container kubepods-besteffort-pod37c61967_22ba_4457_998d_9a458b1a6e40.slice.
Apr 24 23:59:35.329589 kubelet[3188]: I0424 23:59:35.328511 3188 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/37c61967-22ba-4457-998d-9a458b1a6e40-policysync\") pod \"calico-node-hfhdv\" (UID: \"37c61967-22ba-4457-998d-9a458b1a6e40\") " pod="calico-system/calico-node-hfhdv"
Apr 24 23:59:35.329589 kubelet[3188]: I0424 23:59:35.328624 3188 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/37c61967-22ba-4457-998d-9a458b1a6e40-xtables-lock\") pod \"calico-node-hfhdv\" (UID: \"37c61967-22ba-4457-998d-9a458b1a6e40\") " pod="calico-system/calico-node-hfhdv"
Apr 24 23:59:35.329589 kubelet[3188]: I0424 23:59:35.328648 3188 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/37c61967-22ba-4457-998d-9a458b1a6e40-cni-bin-dir\") pod \"calico-node-hfhdv\" (UID: \"37c61967-22ba-4457-998d-9a458b1a6e40\") " pod="calico-system/calico-node-hfhdv"
Apr 24 23:59:35.329589 kubelet[3188]: I0424 23:59:35.328667 3188 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/37c61967-22ba-4457-998d-9a458b1a6e40-cni-log-dir\") pod \"calico-node-hfhdv\" (UID: \"37c61967-22ba-4457-998d-9a458b1a6e40\") " pod="calico-system/calico-node-hfhdv"
Apr 24 23:59:35.329589 kubelet[3188]: I0424 23:59:35.328696 3188 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/37c61967-22ba-4457-998d-9a458b1a6e40-lib-modules\") pod \"calico-node-hfhdv\" (UID: \"37c61967-22ba-4457-998d-9a458b1a6e40\") " pod="calico-system/calico-node-hfhdv"
Apr 24 23:59:35.330238 kubelet[3188]: I0424 23:59:35.328717 3188 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/37c61967-22ba-4457-998d-9a458b1a6e40-sys-fs\") pod \"calico-node-hfhdv\" (UID: \"37c61967-22ba-4457-998d-9a458b1a6e40\") " pod="calico-system/calico-node-hfhdv"
Apr 24 23:59:35.330238 kubelet[3188]: I0424 23:59:35.328753 3188 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bpffs\" (UniqueName: \"kubernetes.io/host-path/37c61967-22ba-4457-998d-9a458b1a6e40-bpffs\") pod \"calico-node-hfhdv\" (UID: \"37c61967-22ba-4457-998d-9a458b1a6e40\") " pod="calico-system/calico-node-hfhdv"
Apr 24 23:59:35.330238 kubelet[3188]: I0424 23:59:35.328810 3188 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/37c61967-22ba-4457-998d-9a458b1a6e40-node-certs\") pod \"calico-node-hfhdv\" (UID: \"37c61967-22ba-4457-998d-9a458b1a6e40\") " pod="calico-system/calico-node-hfhdv"
Apr 24 23:59:35.330238 kubelet[3188]: I0424 23:59:35.328835 3188 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/37c61967-22ba-4457-998d-9a458b1a6e40-cni-net-dir\") pod \"calico-node-hfhdv\" (UID: \"37c61967-22ba-4457-998d-9a458b1a6e40\") " pod="calico-system/calico-node-hfhdv"
Apr 24 23:59:35.330238 kubelet[3188]: I0424 23:59:35.328856 3188 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/37c61967-22ba-4457-998d-9a458b1a6e40-tigera-ca-bundle\") pod \"calico-node-hfhdv\" (UID: \"37c61967-22ba-4457-998d-9a458b1a6e40\") " pod="calico-system/calico-node-hfhdv"
Apr 24 23:59:35.330464 kubelet[3188]: I0424 23:59:35.328882 3188 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/37c61967-22ba-4457-998d-9a458b1a6e40-var-lib-calico\") pod \"calico-node-hfhdv\" (UID: \"37c61967-22ba-4457-998d-9a458b1a6e40\") " pod="calico-system/calico-node-hfhdv"
Apr 24 23:59:35.330464 kubelet[3188]: I0424 23:59:35.328904 3188 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/37c61967-22ba-4457-998d-9a458b1a6e40-var-run-calico\") pod \"calico-node-hfhdv\" (UID: \"37c61967-22ba-4457-998d-9a458b1a6e40\") " pod="calico-system/calico-node-hfhdv"
Apr 24 23:59:35.330464 kubelet[3188]: I0424 23:59:35.328928 3188 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzqds\" (UniqueName: \"kubernetes.io/projected/37c61967-22ba-4457-998d-9a458b1a6e40-kube-api-access-jzqds\") pod \"calico-node-hfhdv\" (UID: \"37c61967-22ba-4457-998d-9a458b1a6e40\") " pod="calico-system/calico-node-hfhdv"
Apr 24 23:59:35.330464 kubelet[3188]: I0424 23:59:35.328965 3188 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/37c61967-22ba-4457-998d-9a458b1a6e40-flexvol-driver-host\") pod \"calico-node-hfhdv\" (UID: \"37c61967-22ba-4457-998d-9a458b1a6e40\") " pod="calico-system/calico-node-hfhdv"
Apr 24 23:59:35.330464 kubelet[3188]: I0424 23:59:35.328985 3188 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nodeproc\" (UniqueName: \"kubernetes.io/host-path/37c61967-22ba-4457-998d-9a458b1a6e40-nodeproc\") pod \"calico-node-hfhdv\" (UID: \"37c61967-22ba-4457-998d-9a458b1a6e40\") " pod="calico-system/calico-node-hfhdv"
Apr 24 23:59:35.435455 kubelet[3188]: E0424 23:59:35.435421 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Apr 24 23:59:35.436095 kubelet[3188]: W0424 23:59:35.435740 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Apr 24 23:59:35.436095 kubelet[3188]: E0424 23:59:35.435776 3188 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Apr 24 23:59:35.436838 kubelet[3188]: E0424 23:59:35.436612 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Apr 24 23:59:35.436838 kubelet[3188]: W0424 23:59:35.436631 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Apr 24 23:59:35.436838 kubelet[3188]: E0424 23:59:35.436649 3188 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Apr 24 23:59:35.438876 kubelet[3188]: E0424 23:59:35.437420 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Apr 24 23:59:35.438876 kubelet[3188]: W0424 23:59:35.437436 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Apr 24 23:59:35.438876 kubelet[3188]: E0424 23:59:35.437453 3188 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Apr 24 23:59:35.440835 kubelet[3188]: E0424 23:59:35.439896 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Apr 24 23:59:35.440835 kubelet[3188]: W0424 23:59:35.439913 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Apr 24 23:59:35.440835 kubelet[3188]: E0424 23:59:35.439930 3188 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Apr 24 23:59:35.444055 kubelet[3188]: E0424 23:59:35.441174 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Apr 24 23:59:35.444055 kubelet[3188]: W0424 23:59:35.441190 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Apr 24 23:59:35.444055 kubelet[3188]: E0424 23:59:35.441207 3188 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Apr 24 23:59:35.444055 kubelet[3188]: E0424 23:59:35.441461 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Apr 24 23:59:35.444055 kubelet[3188]: W0424 23:59:35.441472 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Apr 24 23:59:35.444055 kubelet[3188]: E0424 23:59:35.441485 3188 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Apr 24 23:59:35.444055 kubelet[3188]: E0424 23:59:35.442970 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Apr 24 23:59:35.444055 kubelet[3188]: W0424 23:59:35.442985 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Apr 24 23:59:35.444055 kubelet[3188]: E0424 23:59:35.442999 3188 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Apr 24 23:59:35.444055 kubelet[3188]: E0424 23:59:35.443223 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Apr 24 23:59:35.444585 kubelet[3188]: W0424 23:59:35.443233 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Apr 24 23:59:35.444585 kubelet[3188]: E0424 23:59:35.443245 3188 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Apr 24 23:59:35.446065 kubelet[3188]: E0424 23:59:35.445367 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Apr 24 23:59:35.446065 kubelet[3188]: W0424 23:59:35.445383 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Apr 24 23:59:35.446065 kubelet[3188]: E0424 23:59:35.445399 3188 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Apr 24 23:59:35.450726 containerd[1974]: time="2026-04-24T23:59:35.450572948Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-6cfc9f9dbb-kqn49,Uid:cc3f2b23-8d92-48da-9f0a-fa4c65caaac6,Namespace:calico-system,Attempt:0,}"
Apr 24 23:59:35.471620 kubelet[3188]: E0424 23:59:35.471499 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Apr 24 23:59:35.471620 kubelet[3188]: W0424 23:59:35.471529 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Apr 24 23:59:35.471620 kubelet[3188]: E0424 23:59:35.471572 3188 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Apr 24 23:59:35.483830 kubelet[3188]: E0424 23:59:35.481963 3188 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-wllpb" podUID="fafabee1-df27-491d-a48c-611faa0cd932"
Apr 24 23:59:35.518492 kubelet[3188]: E0424 23:59:35.518430 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Apr 24 23:59:35.518492 kubelet[3188]: W0424 23:59:35.518458 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Apr 24 23:59:35.518492 kubelet[3188]: E0424 23:59:35.518485 3188 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Apr 24 23:59:35.521835 kubelet[3188]: E0424 23:59:35.520190 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Apr 24 23:59:35.521835 kubelet[3188]: W0424 23:59:35.520209 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Apr 24 23:59:35.521835 kubelet[3188]: E0424 23:59:35.520325 3188 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Apr 24 23:59:35.521835 kubelet[3188]: E0424 23:59:35.521672 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Apr 24 23:59:35.521835 kubelet[3188]: W0424 23:59:35.521685 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Apr 24 23:59:35.521835 kubelet[3188]: E0424 23:59:35.521704 3188 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Apr 24 23:59:35.522987 kubelet[3188]: E0424 23:59:35.522877 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Apr 24 23:59:35.522987 kubelet[3188]: W0424 23:59:35.522891 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Apr 24 23:59:35.522987 kubelet[3188]: E0424 23:59:35.522908 3188 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Apr 24 23:59:35.524721 kubelet[3188]: E0424 23:59:35.523953 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Apr 24 23:59:35.524721 kubelet[3188]: W0424 23:59:35.523965 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Apr 24 23:59:35.524721 kubelet[3188]: E0424 23:59:35.523982 3188 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input" Apr 24 23:59:35.524721 kubelet[3188]: E0424 23:59:35.524558 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:59:35.524721 kubelet[3188]: W0424 23:59:35.524570 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:59:35.524721 kubelet[3188]: E0424 23:59:35.524585 3188 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:59:35.525850 kubelet[3188]: E0424 23:59:35.525163 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:59:35.525850 kubelet[3188]: W0424 23:59:35.525176 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:59:35.525850 kubelet[3188]: E0424 23:59:35.525191 3188 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:59:35.526033 kubelet[3188]: E0424 23:59:35.526007 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:59:35.526033 kubelet[3188]: W0424 23:59:35.526019 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:59:35.526126 kubelet[3188]: E0424 23:59:35.526034 3188 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:59:35.528823 kubelet[3188]: E0424 23:59:35.527018 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:59:35.528823 kubelet[3188]: W0424 23:59:35.527033 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:59:35.528823 kubelet[3188]: E0424 23:59:35.527048 3188 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:59:35.528823 kubelet[3188]: E0424 23:59:35.527277 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:59:35.528823 kubelet[3188]: W0424 23:59:35.527285 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:59:35.528823 kubelet[3188]: E0424 23:59:35.527296 3188 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:59:35.528823 kubelet[3188]: E0424 23:59:35.527493 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:59:35.528823 kubelet[3188]: W0424 23:59:35.527501 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:59:35.528823 kubelet[3188]: E0424 23:59:35.527511 3188 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:59:35.535409 kubelet[3188]: E0424 23:59:35.535360 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:59:35.535827 kubelet[3188]: W0424 23:59:35.535622 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:59:35.535827 kubelet[3188]: E0424 23:59:35.535657 3188 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:59:35.537029 kubelet[3188]: E0424 23:59:35.537003 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:59:35.537029 kubelet[3188]: W0424 23:59:35.537026 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:59:35.537163 kubelet[3188]: E0424 23:59:35.537049 3188 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:59:35.537897 kubelet[3188]: E0424 23:59:35.537871 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:59:35.537897 kubelet[3188]: W0424 23:59:35.537893 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:59:35.538022 kubelet[3188]: E0424 23:59:35.537911 3188 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:59:35.542712 kubelet[3188]: E0424 23:59:35.542679 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:59:35.542712 kubelet[3188]: W0424 23:59:35.542707 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:59:35.543004 kubelet[3188]: E0424 23:59:35.542731 3188 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:59:35.543457 kubelet[3188]: E0424 23:59:35.543434 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:59:35.543528 kubelet[3188]: W0424 23:59:35.543457 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:59:35.543528 kubelet[3188]: E0424 23:59:35.543477 3188 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:59:35.545951 kubelet[3188]: E0424 23:59:35.545818 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:59:35.545951 kubelet[3188]: W0424 23:59:35.545837 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:59:35.545951 kubelet[3188]: E0424 23:59:35.545866 3188 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:59:35.547995 kubelet[3188]: E0424 23:59:35.547447 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:59:35.547995 kubelet[3188]: W0424 23:59:35.547460 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:59:35.547995 kubelet[3188]: E0424 23:59:35.547477 3188 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:59:35.547995 kubelet[3188]: E0424 23:59:35.547759 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:59:35.547995 kubelet[3188]: W0424 23:59:35.547772 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:59:35.547995 kubelet[3188]: E0424 23:59:35.547802 3188 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:59:35.548736 kubelet[3188]: E0424 23:59:35.548688 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:59:35.548736 kubelet[3188]: W0424 23:59:35.548703 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:59:35.552833 kubelet[3188]: E0424 23:59:35.548717 3188 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:59:35.553650 kubelet[3188]: E0424 23:59:35.553591 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:59:35.553650 kubelet[3188]: W0424 23:59:35.553613 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:59:35.553650 kubelet[3188]: E0424 23:59:35.553635 3188 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:59:35.553942 kubelet[3188]: I0424 23:59:35.553665 3188 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fafabee1-df27-491d-a48c-611faa0cd932-kubelet-dir\") pod \"csi-node-driver-wllpb\" (UID: \"fafabee1-df27-491d-a48c-611faa0cd932\") " pod="calico-system/csi-node-driver-wllpb" Apr 24 23:59:35.554847 kubelet[3188]: E0424 23:59:35.554718 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:59:35.554847 kubelet[3188]: W0424 23:59:35.554736 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:59:35.554847 kubelet[3188]: E0424 23:59:35.554752 3188 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:59:35.554847 kubelet[3188]: I0424 23:59:35.554777 3188 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/fafabee1-df27-491d-a48c-611faa0cd932-registration-dir\") pod \"csi-node-driver-wllpb\" (UID: \"fafabee1-df27-491d-a48c-611faa0cd932\") " pod="calico-system/csi-node-driver-wllpb" Apr 24 23:59:35.555074 kubelet[3188]: E0424 23:59:35.555033 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:59:35.555074 kubelet[3188]: W0424 23:59:35.555045 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:59:35.555074 kubelet[3188]: E0424 23:59:35.555059 3188 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:59:35.555202 kubelet[3188]: I0424 23:59:35.555079 3188 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/fafabee1-df27-491d-a48c-611faa0cd932-varrun\") pod \"csi-node-driver-wllpb\" (UID: \"fafabee1-df27-491d-a48c-611faa0cd932\") " pod="calico-system/csi-node-driver-wllpb" Apr 24 23:59:35.555773 kubelet[3188]: E0424 23:59:35.555750 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:59:35.555773 kubelet[3188]: W0424 23:59:35.555773 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:59:35.555908 kubelet[3188]: E0424 23:59:35.555804 3188 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:59:35.555908 kubelet[3188]: I0424 23:59:35.555828 3188 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/fafabee1-df27-491d-a48c-611faa0cd932-socket-dir\") pod \"csi-node-driver-wllpb\" (UID: \"fafabee1-df27-491d-a48c-611faa0cd932\") " pod="calico-system/csi-node-driver-wllpb" Apr 24 23:59:35.556859 kubelet[3188]: E0424 23:59:35.556612 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:59:35.556859 kubelet[3188]: W0424 23:59:35.556632 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:59:35.556859 kubelet[3188]: E0424 23:59:35.556647 3188 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:59:35.556859 kubelet[3188]: I0424 23:59:35.556669 3188 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8ggx\" (UniqueName: \"kubernetes.io/projected/fafabee1-df27-491d-a48c-611faa0cd932-kube-api-access-g8ggx\") pod \"csi-node-driver-wllpb\" (UID: \"fafabee1-df27-491d-a48c-611faa0cd932\") " pod="calico-system/csi-node-driver-wllpb" Apr 24 23:59:35.557666 kubelet[3188]: E0424 23:59:35.557638 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:59:35.557666 kubelet[3188]: W0424 23:59:35.557652 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:59:35.557795 kubelet[3188]: E0424 23:59:35.557667 3188 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:59:35.560119 kubelet[3188]: E0424 23:59:35.560097 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:59:35.560119 kubelet[3188]: W0424 23:59:35.560119 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:59:35.560990 kubelet[3188]: E0424 23:59:35.560965 3188 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:59:35.561754 kubelet[3188]: E0424 23:59:35.561738 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:59:35.561754 kubelet[3188]: W0424 23:59:35.561754 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:59:35.561875 kubelet[3188]: E0424 23:59:35.561771 3188 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:59:35.562739 containerd[1974]: time="2026-04-24T23:59:35.562700196Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-hfhdv,Uid:37c61967-22ba-4457-998d-9a458b1a6e40,Namespace:calico-system,Attempt:0,}" Apr 24 23:59:35.564065 kubelet[3188]: E0424 23:59:35.564038 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:59:35.564065 kubelet[3188]: W0424 23:59:35.564058 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:59:35.564196 kubelet[3188]: E0424 23:59:35.564082 3188 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:59:35.565042 kubelet[3188]: E0424 23:59:35.565023 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:59:35.565153 kubelet[3188]: W0424 23:59:35.565042 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:59:35.565153 kubelet[3188]: E0424 23:59:35.565062 3188 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:59:35.566351 kubelet[3188]: E0424 23:59:35.566332 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:59:35.566444 kubelet[3188]: W0424 23:59:35.566351 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:59:35.566444 kubelet[3188]: E0424 23:59:35.566368 3188 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:59:35.568438 kubelet[3188]: E0424 23:59:35.568414 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:59:35.568438 kubelet[3188]: W0424 23:59:35.568435 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:59:35.568619 kubelet[3188]: E0424 23:59:35.568454 3188 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:59:35.569917 kubelet[3188]: E0424 23:59:35.569869 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:59:35.570135 kubelet[3188]: W0424 23:59:35.570111 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:59:35.570226 kubelet[3188]: E0424 23:59:35.570141 3188 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:59:35.572150 containerd[1974]: time="2026-04-24T23:59:35.569737552Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Apr 24 23:59:35.572150 containerd[1974]: time="2026-04-24T23:59:35.570694444Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Apr 24 23:59:35.572150 containerd[1974]: time="2026-04-24T23:59:35.570724214Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 24 23:59:35.572150 containerd[1974]: time="2026-04-24T23:59:35.570890259Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 24 23:59:35.572425 kubelet[3188]: E0424 23:59:35.571968 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:59:35.572425 kubelet[3188]: W0424 23:59:35.571982 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:59:35.572425 kubelet[3188]: E0424 23:59:35.572002 3188 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:59:35.572425 kubelet[3188]: E0424 23:59:35.572300 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:59:35.572425 kubelet[3188]: W0424 23:59:35.572323 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:59:35.572425 kubelet[3188]: E0424 23:59:35.572339 3188 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:59:35.625030 systemd[1]: Started cri-containerd-00da12dfe3b9aa354b7ee52e17b1b75c198f5f993c7b91895dc7726134085340.scope - libcontainer container 00da12dfe3b9aa354b7ee52e17b1b75c198f5f993c7b91895dc7726134085340. 
Apr 24 23:59:35.658515 kubelet[3188]: E0424 23:59:35.658368 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:59:35.658515 kubelet[3188]: W0424 23:59:35.658515 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:59:35.658712 kubelet[3188]: E0424 23:59:35.658542 3188 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:59:35.661855 kubelet[3188]: E0424 23:59:35.661385 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:59:35.661855 kubelet[3188]: W0424 23:59:35.661419 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:59:35.661855 kubelet[3188]: E0424 23:59:35.661449 3188 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:59:35.662772 kubelet[3188]: E0424 23:59:35.662658 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:59:35.662946 kubelet[3188]: W0424 23:59:35.662779 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:59:35.663002 kubelet[3188]: E0424 23:59:35.662955 3188 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:59:35.664086 kubelet[3188]: E0424 23:59:35.664058 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:59:35.664086 kubelet[3188]: W0424 23:59:35.664075 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:59:35.664274 kubelet[3188]: E0424 23:59:35.664114 3188 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:59:35.665401 kubelet[3188]: E0424 23:59:35.665370 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:59:35.665401 kubelet[3188]: W0424 23:59:35.665384 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:59:35.666158 kubelet[3188]: E0424 23:59:35.665402 3188 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Apr 24 23:59:35.667598 kubelet[3188]: E0424 23:59:35.667511 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Apr 24 23:59:35.667598 kubelet[3188]: W0424 23:59:35.667528 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Apr 24 23:59:35.667598 kubelet[3188]: E0424 23:59:35.667543 3188 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Apr 24 23:59:35.688778 containerd[1974]: time="2026-04-24T23:59:35.688417015Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Apr 24 23:59:35.688778 containerd[1974]: time="2026-04-24T23:59:35.688638314Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Apr 24 23:59:35.688778 containerd[1974]: time="2026-04-24T23:59:35.688678903Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Apr 24 23:59:35.690987 containerd[1974]: time="2026-04-24T23:59:35.690934126Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Apr 24 23:59:35.749016 systemd[1]: Started cri-containerd-e839515a5d527e392b4aa2c3a5cfb951c4e1bcb7ad480c87497cc26dd8464e19.scope - libcontainer container e839515a5d527e392b4aa2c3a5cfb951c4e1bcb7ad480c87497cc26dd8464e19.
Apr 24 23:59:35.757005 containerd[1974]: time="2026-04-24T23:59:35.755120815Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-6cfc9f9dbb-kqn49,Uid:cc3f2b23-8d92-48da-9f0a-fa4c65caaac6,Namespace:calico-system,Attempt:0,} returns sandbox id \"00da12dfe3b9aa354b7ee52e17b1b75c198f5f993c7b91895dc7726134085340\""
Apr 24 23:59:35.762023 containerd[1974]: time="2026-04-24T23:59:35.761984626Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.31.4\""
Apr 24 23:59:35.818309 containerd[1974]: time="2026-04-24T23:59:35.818265154Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-hfhdv,Uid:37c61967-22ba-4457-998d-9a458b1a6e40,Namespace:calico-system,Attempt:0,} returns sandbox id \"e839515a5d527e392b4aa2c3a5cfb951c4e1bcb7ad480c87497cc26dd8464e19\""
Apr 24 23:59:37.295951 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2090428563.mount: Deactivated successfully.
Apr 24 23:59:37.532259 kubelet[3188]: E0424 23:59:37.531568 3188 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-wllpb" podUID="fafabee1-df27-491d-a48c-611faa0cd932"
Apr 24 23:59:37.874849 kubelet[3188]: E0424 23:59:37.874809 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Apr 24 23:59:37.874849 kubelet[3188]: W0424 23:59:37.874838 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Apr 24 23:59:37.875153 kubelet[3188]: E0424 23:59:37.874882 3188 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Apr 24 23:59:37.881210 kubelet[3188]: E0424 23:59:37.881101 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Apr 24 23:59:37.881210 kubelet[3188]: W0424 23:59:37.881113 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Apr 24 23:59:37.881210 kubelet[3188]: E0424 23:59:37.881127 3188 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Apr 24 23:59:39.080027 containerd[1974]: time="2026-04-24T23:59:39.079970094Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 24 23:59:39.081807 containerd[1974]: time="2026-04-24T23:59:39.081596973Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.31.4: active requests=0, bytes read=36107596"
Apr 24 23:59:39.083407 containerd[1974]: time="2026-04-24T23:59:39.083196851Z" level=info msg="ImageCreate event name:\"sha256:46766605472b59b9c16342b2cc74da11f598baa9ba6d1e8b07b3f8ab4f29c55b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 24 23:59:39.086180 containerd[1974]: time="2026-04-24T23:59:39.086124634Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:d9396cfcd63dfcf72a65903042e473bb0bafc0cceb56bd71cd84078498a87130\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 24 23:59:39.087510 containerd[1974]: time="2026-04-24T23:59:39.086806988Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.31.4\" with image id \"sha256:46766605472b59b9c16342b2cc74da11f598baa9ba6d1e8b07b3f8ab4f29c55b\", repo tag \"ghcr.io/flatcar/calico/typha:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:d9396cfcd63dfcf72a65903042e473bb0bafc0cceb56bd71cd84078498a87130\", size \"36107450\" in 3.32475352s"
Apr 24 23:59:39.087510 containerd[1974]: time="2026-04-24T23:59:39.086847709Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.31.4\" returns image reference \"sha256:46766605472b59b9c16342b2cc74da11f598baa9ba6d1e8b07b3f8ab4f29c55b\""
Apr 24 23:59:39.088483 containerd[1974]: time="2026-04-24T23:59:39.088458754Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\""
Apr 24 23:59:39.124202 containerd[1974]: time="2026-04-24T23:59:39.124159031Z" level=info msg="CreateContainer within sandbox \"00da12dfe3b9aa354b7ee52e17b1b75c198f5f993c7b91895dc7726134085340\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}"
Apr 24 23:59:39.153608 containerd[1974]: time="2026-04-24T23:59:39.153451270Z" level=info msg="CreateContainer within sandbox \"00da12dfe3b9aa354b7ee52e17b1b75c198f5f993c7b91895dc7726134085340\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"724a121b7381da6806b8e9484d0c44c579f77d90f1db67b45f7ec3b61a3044f0\""
Apr 24 23:59:39.155007 containerd[1974]: time="2026-04-24T23:59:39.154217432Z" level=info msg="StartContainer for \"724a121b7381da6806b8e9484d0c44c579f77d90f1db67b45f7ec3b61a3044f0\""
Apr 24 23:59:39.207976 systemd[1]: Started cri-containerd-724a121b7381da6806b8e9484d0c44c579f77d90f1db67b45f7ec3b61a3044f0.scope - libcontainer container 724a121b7381da6806b8e9484d0c44c579f77d90f1db67b45f7ec3b61a3044f0.
Apr 24 23:59:39.255690 containerd[1974]: time="2026-04-24T23:59:39.255641100Z" level=info msg="StartContainer for \"724a121b7381da6806b8e9484d0c44c579f77d90f1db67b45f7ec3b61a3044f0\" returns successfully"
Apr 24 23:59:39.535775 kubelet[3188]: E0424 23:59:39.535408 3188 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-wllpb" podUID="fafabee1-df27-491d-a48c-611faa0cd932"
Apr 24 23:59:39.682067 kubelet[3188]: I0424 23:59:39.681998 3188 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="calico-system/calico-typha-6cfc9f9dbb-kqn49" podStartSLOduration=1.3540013659999999 podStartE2EDuration="4.681981037s" podCreationTimestamp="2026-04-24 23:59:35 +0000 UTC" firstStartedPulling="2026-04-24 23:59:35.760309548 +0000 UTC m=+18.429438282" lastFinishedPulling="2026-04-24 23:59:39.08828922 +0000 UTC m=+21.757417953" observedRunningTime="2026-04-24 23:59:39.667716622 +0000 UTC m=+22.336845364" watchObservedRunningTime="2026-04-24 23:59:39.681981037 +0000 UTC m=+22.351109776"
Apr 24 23:59:39.693574 kubelet[3188]: E0424 23:59:39.693560 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Apr 24 23:59:39.693574 kubelet[3188]: W0424 23:59:39.693572 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Apr 24 23:59:39.693731 kubelet[3188]: E0424 23:59:39.693590 3188 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Apr 24 23:59:39.698830 kubelet[3188]: E0424 23:59:39.698622 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Apr 24 23:59:39.698830 kubelet[3188]: W0424 23:59:39.698635 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Apr 24 23:59:39.698830 kubelet[3188]: E0424 23:59:39.698650 3188 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:59:39.699636 kubelet[3188]: E0424 23:59:39.699139 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:59:39.699636 kubelet[3188]: W0424 23:59:39.699151 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:59:39.699636 kubelet[3188]: E0424 23:59:39.699164 3188 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:59:39.699636 kubelet[3188]: E0424 23:59:39.699379 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:59:39.699636 kubelet[3188]: W0424 23:59:39.699388 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:59:39.699636 kubelet[3188]: E0424 23:59:39.699398 3188 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:59:39.699970 kubelet[3188]: E0424 23:59:39.699694 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:59:39.699970 kubelet[3188]: W0424 23:59:39.699705 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:59:39.699970 kubelet[3188]: E0424 23:59:39.699729 3188 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:59:39.700224 kubelet[3188]: E0424 23:59:39.700008 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:59:39.700224 kubelet[3188]: W0424 23:59:39.700019 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:59:39.700224 kubelet[3188]: E0424 23:59:39.700055 3188 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:59:39.700466 kubelet[3188]: E0424 23:59:39.700319 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:59:39.700466 kubelet[3188]: W0424 23:59:39.700330 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:59:39.700466 kubelet[3188]: E0424 23:59:39.700364 3188 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:59:39.700823 kubelet[3188]: E0424 23:59:39.700776 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:59:39.700993 kubelet[3188]: W0424 23:59:39.700822 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:59:39.700993 kubelet[3188]: E0424 23:59:39.700857 3188 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:59:39.701283 kubelet[3188]: E0424 23:59:39.701261 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:59:39.701450 kubelet[3188]: W0424 23:59:39.701284 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:59:39.701450 kubelet[3188]: E0424 23:59:39.701299 3188 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:59:39.701667 kubelet[3188]: E0424 23:59:39.701619 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:59:39.701667 kubelet[3188]: W0424 23:59:39.701642 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:59:39.701667 kubelet[3188]: E0424 23:59:39.701656 3188 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:59:39.702213 kubelet[3188]: E0424 23:59:39.702137 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:59:39.702213 kubelet[3188]: W0424 23:59:39.702152 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:59:39.702213 kubelet[3188]: E0424 23:59:39.702167 3188 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:59:39.703666 kubelet[3188]: E0424 23:59:39.703637 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:59:39.703666 kubelet[3188]: W0424 23:59:39.703657 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:59:39.703825 kubelet[3188]: E0424 23:59:39.703672 3188 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:59:39.704110 kubelet[3188]: E0424 23:59:39.704093 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:59:39.704110 kubelet[3188]: W0424 23:59:39.704110 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:59:39.704230 kubelet[3188]: E0424 23:59:39.704125 3188 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:59:39.706556 kubelet[3188]: E0424 23:59:39.706533 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:59:39.706556 kubelet[3188]: W0424 23:59:39.706553 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:59:39.706678 kubelet[3188]: E0424 23:59:39.706571 3188 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:59:39.707019 kubelet[3188]: E0424 23:59:39.706901 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:59:39.707019 kubelet[3188]: W0424 23:59:39.706915 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:59:39.707019 kubelet[3188]: E0424 23:59:39.706930 3188 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:59:39.707260 kubelet[3188]: E0424 23:59:39.707248 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:59:39.707441 kubelet[3188]: W0424 23:59:39.707336 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:59:39.707441 kubelet[3188]: E0424 23:59:39.707354 3188 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:59:39.708655 kubelet[3188]: E0424 23:59:39.708521 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:59:39.708655 kubelet[3188]: W0424 23:59:39.708537 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:59:39.708655 kubelet[3188]: E0424 23:59:39.708550 3188 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:59:39.709985 kubelet[3188]: E0424 23:59:39.709854 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:59:39.709985 kubelet[3188]: W0424 23:59:39.709868 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:59:39.709985 kubelet[3188]: E0424 23:59:39.709883 3188 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:59:39.711226 kubelet[3188]: E0424 23:59:39.710955 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:59:39.711226 kubelet[3188]: W0424 23:59:39.710970 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:59:39.711226 kubelet[3188]: E0424 23:59:39.710994 3188 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:59:39.713814 kubelet[3188]: E0424 23:59:39.711849 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:59:39.713814 kubelet[3188]: W0424 23:59:39.711863 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:59:39.713814 kubelet[3188]: E0424 23:59:39.711876 3188 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:59:39.714759 kubelet[3188]: E0424 23:59:39.714481 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:59:39.714759 kubelet[3188]: W0424 23:59:39.714497 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:59:39.714759 kubelet[3188]: E0424 23:59:39.714514 3188 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:59:39.715936 kubelet[3188]: E0424 23:59:39.715921 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:59:39.716199 kubelet[3188]: W0424 23:59:39.716030 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:59:39.716199 kubelet[3188]: E0424 23:59:39.716103 3188 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:59:39.717132 kubelet[3188]: E0424 23:59:39.717118 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:59:39.717242 kubelet[3188]: W0424 23:59:39.717228 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:59:39.717336 kubelet[3188]: E0424 23:59:39.717322 3188 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:59:39.718572 kubelet[3188]: E0424 23:59:39.718557 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:59:39.718994 kubelet[3188]: W0424 23:59:39.718708 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:59:39.718994 kubelet[3188]: E0424 23:59:39.718730 3188 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:59:39.719570 kubelet[3188]: E0424 23:59:39.719554 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:59:39.719763 kubelet[3188]: W0424 23:59:39.719746 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:59:39.720015 kubelet[3188]: E0424 23:59:39.719998 3188 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:59:40.604441 containerd[1974]: time="2026-04-24T23:59:40.604387345Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 24 23:59:40.610266 containerd[1974]: time="2026-04-24T23:59:40.610197445Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4: active requests=0, bytes read=4630250" Apr 24 23:59:40.611839 containerd[1974]: time="2026-04-24T23:59:40.611757445Z" level=info msg="ImageCreate event name:\"sha256:a6ea0cf732d820506ae9f1d7e7433a14009026b894fbbb8f346b9a5f5335c47e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 24 23:59:40.614677 containerd[1974]: time="2026-04-24T23:59:40.614562223Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:5fa3492ac4dfef9cc34fe70a51289118e1f715a89133ea730eef81ad789dadbc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 24 23:59:40.615614 containerd[1974]: time="2026-04-24T23:59:40.615409155Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\" with image id \"sha256:a6ea0cf732d820506ae9f1d7e7433a14009026b894fbbb8f346b9a5f5335c47e\", repo tag 
\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:5fa3492ac4dfef9cc34fe70a51289118e1f715a89133ea730eef81ad789dadbc\", size \"6186255\" in 1.526805039s" Apr 24 23:59:40.615614 containerd[1974]: time="2026-04-24T23:59:40.615452036Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\" returns image reference \"sha256:a6ea0cf732d820506ae9f1d7e7433a14009026b894fbbb8f346b9a5f5335c47e\"" Apr 24 23:59:40.623174 containerd[1974]: time="2026-04-24T23:59:40.622782308Z" level=info msg="CreateContainer within sandbox \"e839515a5d527e392b4aa2c3a5cfb951c4e1bcb7ad480c87497cc26dd8464e19\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Apr 24 23:59:40.642248 containerd[1974]: time="2026-04-24T23:59:40.642085719Z" level=info msg="CreateContainer within sandbox \"e839515a5d527e392b4aa2c3a5cfb951c4e1bcb7ad480c87497cc26dd8464e19\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"6f1fc132f65b53da9debfed65d113afe586e8dc831e2b8198d9599aa55b6097d\"" Apr 24 23:59:40.644807 containerd[1974]: time="2026-04-24T23:59:40.642975902Z" level=info msg="StartContainer for \"6f1fc132f65b53da9debfed65d113afe586e8dc831e2b8198d9599aa55b6097d\"" Apr 24 23:59:40.688011 systemd[1]: Started cri-containerd-6f1fc132f65b53da9debfed65d113afe586e8dc831e2b8198d9599aa55b6097d.scope - libcontainer container 6f1fc132f65b53da9debfed65d113afe586e8dc831e2b8198d9599aa55b6097d. 
Apr 24 23:59:40.706881 kubelet[3188]: E0424 23:59:40.706842 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:59:40.707325 kubelet[3188]: W0424 23:59:40.707301 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:59:40.707438 kubelet[3188]: E0424 23:59:40.707423 3188 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:59:40.709236 kubelet[3188]: E0424 23:59:40.709217 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:59:40.709373 kubelet[3188]: W0424 23:59:40.709355 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:59:40.709495 kubelet[3188]: E0424 23:59:40.709478 3188 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:59:40.711017 kubelet[3188]: E0424 23:59:40.710984 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:59:40.711157 kubelet[3188]: W0424 23:59:40.711142 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:59:40.711249 kubelet[3188]: E0424 23:59:40.711236 3188 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:59:40.711772 kubelet[3188]: E0424 23:59:40.711759 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:59:40.711904 kubelet[3188]: W0424 23:59:40.711890 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:59:40.711995 kubelet[3188]: E0424 23:59:40.711983 3188 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:59:40.712402 kubelet[3188]: E0424 23:59:40.712372 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:59:40.712681 kubelet[3188]: W0424 23:59:40.712531 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:59:40.712681 kubelet[3188]: E0424 23:59:40.712563 3188 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:59:40.714394 kubelet[3188]: E0424 23:59:40.714235 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:59:40.714697 kubelet[3188]: W0424 23:59:40.714494 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:59:40.714697 kubelet[3188]: E0424 23:59:40.714516 3188 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:59:40.718257 kubelet[3188]: E0424 23:59:40.717668 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:59:40.718257 kubelet[3188]: W0424 23:59:40.717683 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:59:40.718257 kubelet[3188]: E0424 23:59:40.717699 3188 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:59:40.718257 kubelet[3188]: E0424 23:59:40.718014 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:59:40.718257 kubelet[3188]: W0424 23:59:40.718027 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:59:40.718257 kubelet[3188]: E0424 23:59:40.718042 3188 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:59:40.719350 kubelet[3188]: E0424 23:59:40.718559 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:59:40.719350 kubelet[3188]: W0424 23:59:40.718779 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:59:40.719350 kubelet[3188]: E0424 23:59:40.718941 3188 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:59:40.720442 kubelet[3188]: E0424 23:59:40.720246 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:59:40.720818 kubelet[3188]: W0424 23:59:40.720548 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:59:40.720818 kubelet[3188]: E0424 23:59:40.720569 3188 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:59:40.723982 kubelet[3188]: E0424 23:59:40.723356 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:59:40.723982 kubelet[3188]: W0424 23:59:40.723374 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:59:40.723982 kubelet[3188]: E0424 23:59:40.723395 3188 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:59:40.723982 kubelet[3188]: E0424 23:59:40.723614 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:59:40.723982 kubelet[3188]: W0424 23:59:40.723623 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:59:40.723982 kubelet[3188]: E0424 23:59:40.723635 3188 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:59:40.723982 kubelet[3188]: E0424 23:59:40.723870 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:59:40.723982 kubelet[3188]: W0424 23:59:40.723879 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:59:40.723982 kubelet[3188]: E0424 23:59:40.723892 3188 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:59:40.725581 kubelet[3188]: E0424 23:59:40.724762 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:59:40.725581 kubelet[3188]: W0424 23:59:40.724778 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:59:40.725581 kubelet[3188]: E0424 23:59:40.724822 3188 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:59:40.725581 kubelet[3188]: E0424 23:59:40.725320 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:59:40.725581 kubelet[3188]: W0424 23:59:40.725333 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:59:40.725581 kubelet[3188]: E0424 23:59:40.725350 3188 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:59:40.726146 kubelet[3188]: E0424 23:59:40.726027 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:59:40.726146 kubelet[3188]: W0424 23:59:40.726041 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:59:40.726146 kubelet[3188]: E0424 23:59:40.726055 3188 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:59:40.726673 kubelet[3188]: E0424 23:59:40.726563 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:59:40.726673 kubelet[3188]: W0424 23:59:40.726578 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:59:40.726673 kubelet[3188]: E0424 23:59:40.726593 3188 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:59:40.727259 kubelet[3188]: E0424 23:59:40.727151 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:59:40.727259 kubelet[3188]: W0424 23:59:40.727165 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:59:40.727259 kubelet[3188]: E0424 23:59:40.727179 3188 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:59:40.728019 kubelet[3188]: E0424 23:59:40.727863 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:59:40.728019 kubelet[3188]: W0424 23:59:40.727879 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:59:40.728019 kubelet[3188]: E0424 23:59:40.727894 3188 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:59:40.728510 kubelet[3188]: E0424 23:59:40.728482 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:59:40.728617 kubelet[3188]: W0424 23:59:40.728604 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:59:40.728731 kubelet[3188]: E0424 23:59:40.728717 3188 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:59:40.729278 kubelet[3188]: E0424 23:59:40.729265 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:59:40.729408 kubelet[3188]: W0424 23:59:40.729376 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:59:40.729512 kubelet[3188]: E0424 23:59:40.729499 3188 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:59:40.730199 kubelet[3188]: E0424 23:59:40.730171 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:59:40.730370 kubelet[3188]: W0424 23:59:40.730283 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:59:40.730370 kubelet[3188]: E0424 23:59:40.730301 3188 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:59:40.731092 kubelet[3188]: E0424 23:59:40.730889 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:59:40.731092 kubelet[3188]: W0424 23:59:40.730920 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:59:40.731092 kubelet[3188]: E0424 23:59:40.730937 3188 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:59:40.731735 kubelet[3188]: E0424 23:59:40.731721 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:59:40.732010 kubelet[3188]: W0424 23:59:40.731830 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:59:40.732010 kubelet[3188]: E0424 23:59:40.731846 3188 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:59:40.732446 kubelet[3188]: E0424 23:59:40.732338 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:59:40.732446 kubelet[3188]: W0424 23:59:40.732353 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:59:40.732446 kubelet[3188]: E0424 23:59:40.732367 3188 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:59:40.732984 kubelet[3188]: E0424 23:59:40.732843 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:59:40.732984 kubelet[3188]: W0424 23:59:40.732857 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:59:40.732984 kubelet[3188]: E0424 23:59:40.732895 3188 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:59:40.733923 kubelet[3188]: E0424 23:59:40.733405 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:59:40.733923 kubelet[3188]: W0424 23:59:40.733420 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:59:40.733923 kubelet[3188]: E0424 23:59:40.733434 3188 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:59:40.734307 kubelet[3188]: E0424 23:59:40.734292 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:59:40.734520 kubelet[3188]: W0424 23:59:40.734505 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:59:40.734615 kubelet[3188]: E0424 23:59:40.734602 3188 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:59:40.735021 kubelet[3188]: E0424 23:59:40.735008 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:59:40.735124 kubelet[3188]: W0424 23:59:40.735111 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:59:40.735219 kubelet[3188]: E0424 23:59:40.735192 3188 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:59:40.736139 kubelet[3188]: E0424 23:59:40.736123 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:59:40.736380 kubelet[3188]: W0424 23:59:40.736328 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:59:40.736380 kubelet[3188]: E0424 23:59:40.736348 3188 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:59:40.737494 kubelet[3188]: E0424 23:59:40.737317 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:59:40.737494 kubelet[3188]: W0424 23:59:40.737331 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:59:40.737494 kubelet[3188]: E0424 23:59:40.737345 3188 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:59:40.739008 kubelet[3188]: E0424 23:59:40.738587 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:59:40.739008 kubelet[3188]: W0424 23:59:40.738601 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:59:40.739008 kubelet[3188]: E0424 23:59:40.738613 3188 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:59:40.741088 kubelet[3188]: E0424 23:59:40.740868 3188 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:59:40.741088 kubelet[3188]: W0424 23:59:40.740883 3188 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:59:40.741088 kubelet[3188]: E0424 23:59:40.740896 3188 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:59:40.746511 containerd[1974]: time="2026-04-24T23:59:40.746473396Z" level=info msg="StartContainer for \"6f1fc132f65b53da9debfed65d113afe586e8dc831e2b8198d9599aa55b6097d\" returns successfully" Apr 24 23:59:40.763475 systemd[1]: cri-containerd-6f1fc132f65b53da9debfed65d113afe586e8dc831e2b8198d9599aa55b6097d.scope: Deactivated successfully. 
Apr 24 23:59:40.955322 containerd[1974]: time="2026-04-24T23:59:40.927323662Z" level=info msg="shim disconnected" id=6f1fc132f65b53da9debfed65d113afe586e8dc831e2b8198d9599aa55b6097d namespace=k8s.io Apr 24 23:59:40.955597 containerd[1974]: time="2026-04-24T23:59:40.955327977Z" level=warning msg="cleaning up after shim disconnected" id=6f1fc132f65b53da9debfed65d113afe586e8dc831e2b8198d9599aa55b6097d namespace=k8s.io Apr 24 23:59:40.955597 containerd[1974]: time="2026-04-24T23:59:40.955348252Z" level=info msg="cleaning up dead shim" namespace=k8s.io Apr 24 23:59:40.969587 containerd[1974]: time="2026-04-24T23:59:40.969532480Z" level=warning msg="cleanup warnings time=\"2026-04-24T23:59:40Z\" level=warning msg=\"failed to remove runc container\" error=\"runc did not terminate successfully: exit status 255: \" runtime=io.containerd.runc.v2\n" namespace=k8s.io Apr 24 23:59:41.101563 systemd[1]: run-containerd-runc-k8s.io-6f1fc132f65b53da9debfed65d113afe586e8dc831e2b8198d9599aa55b6097d-runc.TnoCHg.mount: Deactivated successfully. Apr 24 23:59:41.101696 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-6f1fc132f65b53da9debfed65d113afe586e8dc831e2b8198d9599aa55b6097d-rootfs.mount: Deactivated successfully. 
Apr 24 23:59:41.532314 kubelet[3188]: E0424 23:59:41.531001 3188 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-wllpb" podUID="fafabee1-df27-491d-a48c-611faa0cd932" Apr 24 23:59:41.664865 containerd[1974]: time="2026-04-24T23:59:41.664824522Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.31.4\"" Apr 24 23:59:43.531617 kubelet[3188]: E0424 23:59:43.530369 3188 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-wllpb" podUID="fafabee1-df27-491d-a48c-611faa0cd932" Apr 24 23:59:45.532263 kubelet[3188]: E0424 23:59:45.530761 3188 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-wllpb" podUID="fafabee1-df27-491d-a48c-611faa0cd932" Apr 24 23:59:47.535672 kubelet[3188]: E0424 23:59:47.535618 3188 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-wllpb" podUID="fafabee1-df27-491d-a48c-611faa0cd932" Apr 24 23:59:49.530593 kubelet[3188]: E0424 23:59:49.530542 3188 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" 
pod="calico-system/csi-node-driver-wllpb" podUID="fafabee1-df27-491d-a48c-611faa0cd932" Apr 24 23:59:51.082936 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2726031525.mount: Deactivated successfully. Apr 24 23:59:51.134657 containerd[1974]: time="2026-04-24T23:59:51.127161841Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 24 23:59:51.134657 containerd[1974]: time="2026-04-24T23:59:51.134494505Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.31.4: active requests=0, bytes read=159838564" Apr 24 23:59:51.137071 containerd[1974]: time="2026-04-24T23:59:51.137034867Z" level=info msg="ImageCreate event name:\"sha256:e6536b93706eda782f82ebadcac3559cb61801d09f982cc0533a134e6a8e1acf\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 24 23:59:51.139398 containerd[1974]: time="2026-04-24T23:59:51.138846230Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:22b9d32dc7480c96272121d5682d53424c6e58653c60fa869b61a1758a11d77f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 24 23:59:51.139893 containerd[1974]: time="2026-04-24T23:59:51.139859732Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.31.4\" with image id \"sha256:e6536b93706eda782f82ebadcac3559cb61801d09f982cc0533a134e6a8e1acf\", repo tag \"ghcr.io/flatcar/calico/node:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/node@sha256:22b9d32dc7480c96272121d5682d53424c6e58653c60fa869b61a1758a11d77f\", size \"159838426\" in 9.474989154s" Apr 24 23:59:51.140023 containerd[1974]: time="2026-04-24T23:59:51.140003473Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.31.4\" returns image reference \"sha256:e6536b93706eda782f82ebadcac3559cb61801d09f982cc0533a134e6a8e1acf\"" Apr 24 23:59:51.160748 containerd[1974]: time="2026-04-24T23:59:51.160705381Z" level=info msg="CreateContainer within sandbox 
\"e839515a5d527e392b4aa2c3a5cfb951c4e1bcb7ad480c87497cc26dd8464e19\" for container &ContainerMetadata{Name:ebpf-bootstrap,Attempt:0,}" Apr 24 23:59:51.195361 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2109372474.mount: Deactivated successfully. Apr 24 23:59:51.201859 containerd[1974]: time="2026-04-24T23:59:51.201767385Z" level=info msg="CreateContainer within sandbox \"e839515a5d527e392b4aa2c3a5cfb951c4e1bcb7ad480c87497cc26dd8464e19\" for &ContainerMetadata{Name:ebpf-bootstrap,Attempt:0,} returns container id \"38bf63ba0a65642da0383535304b9682cecf84138c5d6368bbeb608eb8f1508e\"" Apr 24 23:59:51.202811 containerd[1974]: time="2026-04-24T23:59:51.202747362Z" level=info msg="StartContainer for \"38bf63ba0a65642da0383535304b9682cecf84138c5d6368bbeb608eb8f1508e\"" Apr 24 23:59:51.256119 systemd[1]: Started cri-containerd-38bf63ba0a65642da0383535304b9682cecf84138c5d6368bbeb608eb8f1508e.scope - libcontainer container 38bf63ba0a65642da0383535304b9682cecf84138c5d6368bbeb608eb8f1508e. Apr 24 23:59:51.306611 containerd[1974]: time="2026-04-24T23:59:51.306445639Z" level=info msg="StartContainer for \"38bf63ba0a65642da0383535304b9682cecf84138c5d6368bbeb608eb8f1508e\" returns successfully" Apr 24 23:59:51.355656 systemd[1]: cri-containerd-38bf63ba0a65642da0383535304b9682cecf84138c5d6368bbeb608eb8f1508e.scope: Deactivated successfully. 
Apr 24 23:59:51.533741 kubelet[3188]: E0424 23:59:51.533689 3188 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-wllpb" podUID="fafabee1-df27-491d-a48c-611faa0cd932" Apr 24 23:59:51.788940 containerd[1974]: time="2026-04-24T23:59:51.788774353Z" level=info msg="shim disconnected" id=38bf63ba0a65642da0383535304b9682cecf84138c5d6368bbeb608eb8f1508e namespace=k8s.io Apr 24 23:59:51.788940 containerd[1974]: time="2026-04-24T23:59:51.788863726Z" level=warning msg="cleaning up after shim disconnected" id=38bf63ba0a65642da0383535304b9682cecf84138c5d6368bbeb608eb8f1508e namespace=k8s.io Apr 24 23:59:51.788940 containerd[1974]: time="2026-04-24T23:59:51.788877186Z" level=info msg="cleaning up dead shim" namespace=k8s.io Apr 24 23:59:51.804382 containerd[1974]: time="2026-04-24T23:59:51.804312758Z" level=warning msg="cleanup warnings time=\"2026-04-24T23:59:51Z\" level=warning msg=\"failed to remove runc container\" error=\"runc did not terminate successfully: exit status 255: \" runtime=io.containerd.runc.v2\n" namespace=k8s.io Apr 24 23:59:52.082110 systemd[1]: run-containerd-runc-k8s.io-38bf63ba0a65642da0383535304b9682cecf84138c5d6368bbeb608eb8f1508e-runc.3ytY1i.mount: Deactivated successfully. Apr 24 23:59:52.082773 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-38bf63ba0a65642da0383535304b9682cecf84138c5d6368bbeb608eb8f1508e-rootfs.mount: Deactivated successfully. 
Apr 24 23:59:52.704231 containerd[1974]: time="2026-04-24T23:59:52.704188897Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.31.4\"" Apr 24 23:59:53.531844 kubelet[3188]: E0424 23:59:53.531255 3188 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-wllpb" podUID="fafabee1-df27-491d-a48c-611faa0cd932" Apr 24 23:59:55.531800 kubelet[3188]: E0424 23:59:55.531724 3188 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-wllpb" podUID="fafabee1-df27-491d-a48c-611faa0cd932" Apr 24 23:59:55.941371 containerd[1974]: time="2026-04-24T23:59:55.941322849Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 24 23:59:55.943213 containerd[1974]: time="2026-04-24T23:59:55.943156699Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.31.4: active requests=0, bytes read=70611671" Apr 24 23:59:55.945371 containerd[1974]: time="2026-04-24T23:59:55.944285636Z" level=info msg="ImageCreate event name:\"sha256:c433a27dd94ce9242338eece49f11629412dd42552fed314746fcf16ea958b2b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 24 23:59:55.946974 containerd[1974]: time="2026-04-24T23:59:55.946941156Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:f1c5d9a6df01061c5faec4c4b59fb9ba69f8f5164b51e01ea8daa8e373111a04\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 24 23:59:55.947943 containerd[1974]: time="2026-04-24T23:59:55.947907018Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.31.4\" 
with image id \"sha256:c433a27dd94ce9242338eece49f11629412dd42552fed314746fcf16ea958b2b\", repo tag \"ghcr.io/flatcar/calico/cni:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:f1c5d9a6df01061c5faec4c4b59fb9ba69f8f5164b51e01ea8daa8e373111a04\", size \"72167716\" in 3.24366706s" Apr 24 23:59:55.948087 containerd[1974]: time="2026-04-24T23:59:55.948062763Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.31.4\" returns image reference \"sha256:c433a27dd94ce9242338eece49f11629412dd42552fed314746fcf16ea958b2b\"" Apr 24 23:59:55.959678 containerd[1974]: time="2026-04-24T23:59:55.959620965Z" level=info msg="CreateContainer within sandbox \"e839515a5d527e392b4aa2c3a5cfb951c4e1bcb7ad480c87497cc26dd8464e19\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Apr 24 23:59:55.976424 containerd[1974]: time="2026-04-24T23:59:55.976371094Z" level=info msg="CreateContainer within sandbox \"e839515a5d527e392b4aa2c3a5cfb951c4e1bcb7ad480c87497cc26dd8464e19\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"222c0296fe305986304e44cea9ff8bb36851533f94b2a1784c40622c145afa84\"" Apr 24 23:59:55.977140 containerd[1974]: time="2026-04-24T23:59:55.977045086Z" level=info msg="StartContainer for \"222c0296fe305986304e44cea9ff8bb36851533f94b2a1784c40622c145afa84\"" Apr 24 23:59:56.022014 systemd[1]: Started cri-containerd-222c0296fe305986304e44cea9ff8bb36851533f94b2a1784c40622c145afa84.scope - libcontainer container 222c0296fe305986304e44cea9ff8bb36851533f94b2a1784c40622c145afa84. Apr 24 23:59:56.062740 containerd[1974]: time="2026-04-24T23:59:56.062692763Z" level=info msg="StartContainer for \"222c0296fe305986304e44cea9ff8bb36851533f94b2a1784c40622c145afa84\" returns successfully" Apr 24 23:59:57.075620 systemd[1]: cri-containerd-222c0296fe305986304e44cea9ff8bb36851533f94b2a1784c40622c145afa84.scope: Deactivated successfully. 
Apr 24 23:59:57.112749 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-222c0296fe305986304e44cea9ff8bb36851533f94b2a1784c40622c145afa84-rootfs.mount: Deactivated successfully. Apr 24 23:59:57.121237 containerd[1974]: time="2026-04-24T23:59:57.121163979Z" level=info msg="shim disconnected" id=222c0296fe305986304e44cea9ff8bb36851533f94b2a1784c40622c145afa84 namespace=k8s.io Apr 24 23:59:57.123428 containerd[1974]: time="2026-04-24T23:59:57.121558762Z" level=warning msg="cleaning up after shim disconnected" id=222c0296fe305986304e44cea9ff8bb36851533f94b2a1784c40622c145afa84 namespace=k8s.io Apr 24 23:59:57.123428 containerd[1974]: time="2026-04-24T23:59:57.121579516Z" level=info msg="cleaning up dead shim" namespace=k8s.io Apr 24 23:59:57.131371 kubelet[3188]: I0424 23:59:57.121286 3188 kubelet_node_status.go:427] "Fast updating node status as it just became ready" Apr 24 23:59:57.141718 containerd[1974]: time="2026-04-24T23:59:57.141094255Z" level=warning msg="cleanup warnings time=\"2026-04-24T23:59:57Z\" level=warning msg=\"failed to remove runc container\" error=\"runc did not terminate successfully: exit status 255: \" runtime=io.containerd.runc.v2\n" namespace=k8s.io Apr 24 23:59:57.367906 kubelet[3188]: I0424 23:59:57.367749 3188 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/40ce18e5-a962-4a82-921b-38508661dc53-whisker-backend-key-pair\") pod \"whisker-6c4c456bdb-82zbl\" (UID: \"40ce18e5-a962-4a82-921b-38508661dc53\") " pod="calico-system/whisker-6c4c456bdb-82zbl" Apr 24 23:59:57.368461 kubelet[3188]: I0424 23:59:57.368114 3188 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b17085d6-14a5-4b5e-81f9-314340899d1e-goldmane-ca-bundle\") pod \"goldmane-9f7667bb8-vg777\" (UID: \"b17085d6-14a5-4b5e-81f9-314340899d1e\") " 
pod="calico-system/goldmane-9f7667bb8-vg777" Apr 24 23:59:57.368461 kubelet[3188]: I0424 23:59:57.368149 3188 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/40ce18e5-a962-4a82-921b-38508661dc53-nginx-config\") pod \"whisker-6c4c456bdb-82zbl\" (UID: \"40ce18e5-a962-4a82-921b-38508661dc53\") " pod="calico-system/whisker-6c4c456bdb-82zbl" Apr 24 23:59:57.368461 kubelet[3188]: I0424 23:59:57.368177 3188 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sp9cb\" (UniqueName: \"kubernetes.io/projected/f1ad7ed5-b378-4647-9ebf-2546b6a4e4a2-kube-api-access-sp9cb\") pod \"calico-apiserver-848b4486d7-brf5p\" (UID: \"f1ad7ed5-b378-4647-9ebf-2546b6a4e4a2\") " pod="calico-system/calico-apiserver-848b4486d7-brf5p" Apr 24 23:59:57.368461 kubelet[3188]: I0424 23:59:57.368212 3188 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b17085d6-14a5-4b5e-81f9-314340899d1e-config\") pod \"goldmane-9f7667bb8-vg777\" (UID: \"b17085d6-14a5-4b5e-81f9-314340899d1e\") " pod="calico-system/goldmane-9f7667bb8-vg777" Apr 24 23:59:57.368461 kubelet[3188]: I0424 23:59:57.368239 3188 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knddk\" (UniqueName: \"kubernetes.io/projected/40ce18e5-a962-4a82-921b-38508661dc53-kube-api-access-knddk\") pod \"whisker-6c4c456bdb-82zbl\" (UID: \"40ce18e5-a962-4a82-921b-38508661dc53\") " pod="calico-system/whisker-6c4c456bdb-82zbl" Apr 24 23:59:57.368654 kubelet[3188]: I0424 23:59:57.368268 3188 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mj4m2\" (UniqueName: \"kubernetes.io/projected/e3d6ca38-0ac1-41b2-beef-170f1942102f-kube-api-access-mj4m2\") pod 
\"calico-kube-controllers-7564df694c-4d42l\" (UID: \"e3d6ca38-0ac1-41b2-beef-170f1942102f\") " pod="calico-system/calico-kube-controllers-7564df694c-4d42l" Apr 24 23:59:57.368654 kubelet[3188]: I0424 23:59:57.368292 3188 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/b17085d6-14a5-4b5e-81f9-314340899d1e-goldmane-key-pair\") pod \"goldmane-9f7667bb8-vg777\" (UID: \"b17085d6-14a5-4b5e-81f9-314340899d1e\") " pod="calico-system/goldmane-9f7667bb8-vg777" Apr 24 23:59:57.368654 kubelet[3188]: I0424 23:59:57.368311 3188 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/651a4b24-dccb-49b4-b5f3-a4291d6e49f0-config-volume\") pod \"coredns-7d764666f9-zk545\" (UID: \"651a4b24-dccb-49b4-b5f3-a4291d6e49f0\") " pod="kube-system/coredns-7d764666f9-zk545" Apr 24 23:59:57.368654 kubelet[3188]: I0424 23:59:57.368329 3188 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vq48c\" (UniqueName: \"kubernetes.io/projected/b17085d6-14a5-4b5e-81f9-314340899d1e-kube-api-access-vq48c\") pod \"goldmane-9f7667bb8-vg777\" (UID: \"b17085d6-14a5-4b5e-81f9-314340899d1e\") " pod="calico-system/goldmane-9f7667bb8-vg777" Apr 24 23:59:57.368654 kubelet[3188]: I0424 23:59:57.368361 3188 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7sb27\" (UniqueName: \"kubernetes.io/projected/d9ba9417-7859-4ec0-8a74-659edbbec7c4-kube-api-access-7sb27\") pod \"calico-apiserver-848b4486d7-dvwgq\" (UID: \"d9ba9417-7859-4ec0-8a74-659edbbec7c4\") " pod="calico-system/calico-apiserver-848b4486d7-dvwgq" Apr 24 23:59:57.370526 kubelet[3188]: I0424 23:59:57.370344 3188 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" 
(UniqueName: \"kubernetes.io/secret/f1ad7ed5-b378-4647-9ebf-2546b6a4e4a2-calico-apiserver-certs\") pod \"calico-apiserver-848b4486d7-brf5p\" (UID: \"f1ad7ed5-b378-4647-9ebf-2546b6a4e4a2\") " pod="calico-system/calico-apiserver-848b4486d7-brf5p" Apr 24 23:59:57.370526 kubelet[3188]: I0424 23:59:57.370393 3188 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/d9ba9417-7859-4ec0-8a74-659edbbec7c4-calico-apiserver-certs\") pod \"calico-apiserver-848b4486d7-dvwgq\" (UID: \"d9ba9417-7859-4ec0-8a74-659edbbec7c4\") " pod="calico-system/calico-apiserver-848b4486d7-dvwgq" Apr 24 23:59:57.370526 kubelet[3188]: I0424 23:59:57.370421 3188 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8259ae9f-09c1-45c5-a26a-9fb19e805b35-config-volume\") pod \"coredns-7d764666f9-q5wxg\" (UID: \"8259ae9f-09c1-45c5-a26a-9fb19e805b35\") " pod="kube-system/coredns-7d764666f9-q5wxg" Apr 24 23:59:57.370526 kubelet[3188]: I0424 23:59:57.370446 3188 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwzc4\" (UniqueName: \"kubernetes.io/projected/651a4b24-dccb-49b4-b5f3-a4291d6e49f0-kube-api-access-nwzc4\") pod \"coredns-7d764666f9-zk545\" (UID: \"651a4b24-dccb-49b4-b5f3-a4291d6e49f0\") " pod="kube-system/coredns-7d764666f9-zk545" Apr 24 23:59:57.370886 kubelet[3188]: I0424 23:59:57.370762 3188 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e3d6ca38-0ac1-41b2-beef-170f1942102f-tigera-ca-bundle\") pod \"calico-kube-controllers-7564df694c-4d42l\" (UID: \"e3d6ca38-0ac1-41b2-beef-170f1942102f\") " pod="calico-system/calico-kube-controllers-7564df694c-4d42l" Apr 24 23:59:57.370886 kubelet[3188]: I0424 23:59:57.370831 3188 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/40ce18e5-a962-4a82-921b-38508661dc53-whisker-ca-bundle\") pod \"whisker-6c4c456bdb-82zbl\" (UID: \"40ce18e5-a962-4a82-921b-38508661dc53\") " pod="calico-system/whisker-6c4c456bdb-82zbl" Apr 24 23:59:57.370886 kubelet[3188]: I0424 23:59:57.370869 3188 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67pln\" (UniqueName: \"kubernetes.io/projected/8259ae9f-09c1-45c5-a26a-9fb19e805b35-kube-api-access-67pln\") pod \"coredns-7d764666f9-q5wxg\" (UID: \"8259ae9f-09c1-45c5-a26a-9fb19e805b35\") " pod="kube-system/coredns-7d764666f9-q5wxg" Apr 24 23:59:57.397290 systemd[1]: Created slice kubepods-burstable-pod8259ae9f_09c1_45c5_a26a_9fb19e805b35.slice - libcontainer container kubepods-burstable-pod8259ae9f_09c1_45c5_a26a_9fb19e805b35.slice. Apr 24 23:59:57.405672 systemd[1]: Created slice kubepods-burstable-pod651a4b24_dccb_49b4_b5f3_a4291d6e49f0.slice - libcontainer container kubepods-burstable-pod651a4b24_dccb_49b4_b5f3_a4291d6e49f0.slice. Apr 24 23:59:57.415264 systemd[1]: Created slice kubepods-besteffort-podf1ad7ed5_b378_4647_9ebf_2546b6a4e4a2.slice - libcontainer container kubepods-besteffort-podf1ad7ed5_b378_4647_9ebf_2546b6a4e4a2.slice. Apr 24 23:59:57.430448 systemd[1]: Created slice kubepods-besteffort-pod40ce18e5_a962_4a82_921b_38508661dc53.slice - libcontainer container kubepods-besteffort-pod40ce18e5_a962_4a82_921b_38508661dc53.slice. Apr 24 23:59:57.441290 systemd[1]: Created slice kubepods-besteffort-pode3d6ca38_0ac1_41b2_beef_170f1942102f.slice - libcontainer container kubepods-besteffort-pode3d6ca38_0ac1_41b2_beef_170f1942102f.slice. Apr 24 23:59:57.449428 systemd[1]: Created slice kubepods-besteffort-podb17085d6_14a5_4b5e_81f9_314340899d1e.slice - libcontainer container kubepods-besteffort-podb17085d6_14a5_4b5e_81f9_314340899d1e.slice. 
Apr 24 23:59:57.465046 systemd[1]: Created slice kubepods-besteffort-podd9ba9417_7859_4ec0_8a74_659edbbec7c4.slice - libcontainer container kubepods-besteffort-podd9ba9417_7859_4ec0_8a74_659edbbec7c4.slice. Apr 24 23:59:57.570984 systemd[1]: Created slice kubepods-besteffort-podfafabee1_df27_491d_a48c_611faa0cd932.slice - libcontainer container kubepods-besteffort-podfafabee1_df27_491d_a48c_611faa0cd932.slice. Apr 24 23:59:57.580726 containerd[1974]: time="2026-04-24T23:59:57.580688143Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-wllpb,Uid:fafabee1-df27-491d-a48c-611faa0cd932,Namespace:calico-system,Attempt:0,}" Apr 24 23:59:57.712648 containerd[1974]: time="2026-04-24T23:59:57.712597053Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7d764666f9-q5wxg,Uid:8259ae9f-09c1-45c5-a26a-9fb19e805b35,Namespace:kube-system,Attempt:0,}" Apr 24 23:59:57.724154 containerd[1974]: time="2026-04-24T23:59:57.724115531Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7d764666f9-zk545,Uid:651a4b24-dccb-49b4-b5f3-a4291d6e49f0,Namespace:kube-system,Attempt:0,}" Apr 24 23:59:57.726611 containerd[1974]: time="2026-04-24T23:59:57.725986349Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-848b4486d7-brf5p,Uid:f1ad7ed5-b378-4647-9ebf-2546b6a4e4a2,Namespace:calico-system,Attempt:0,}" Apr 24 23:59:57.744253 containerd[1974]: time="2026-04-24T23:59:57.743149410Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-6c4c456bdb-82zbl,Uid:40ce18e5-a962-4a82-921b-38508661dc53,Namespace:calico-system,Attempt:0,}" Apr 24 23:59:57.756815 containerd[1974]: time="2026-04-24T23:59:57.756499904Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7564df694c-4d42l,Uid:e3d6ca38-0ac1-41b2-beef-170f1942102f,Namespace:calico-system,Attempt:0,}" Apr 24 23:59:57.765204 containerd[1974]: time="2026-04-24T23:59:57.765155186Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:goldmane-9f7667bb8-vg777,Uid:b17085d6-14a5-4b5e-81f9-314340899d1e,Namespace:calico-system,Attempt:0,}" Apr 24 23:59:57.779808 containerd[1974]: time="2026-04-24T23:59:57.779756084Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-848b4486d7-dvwgq,Uid:d9ba9417-7859-4ec0-8a74-659edbbec7c4,Namespace:calico-system,Attempt:0,}" Apr 24 23:59:57.782529 containerd[1974]: time="2026-04-24T23:59:57.782391806Z" level=info msg="CreateContainer within sandbox \"e839515a5d527e392b4aa2c3a5cfb951c4e1bcb7ad480c87497cc26dd8464e19\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Apr 24 23:59:57.945698 containerd[1974]: time="2026-04-24T23:59:57.945507862Z" level=info msg="CreateContainer within sandbox \"e839515a5d527e392b4aa2c3a5cfb951c4e1bcb7ad480c87497cc26dd8464e19\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"19b22f5fcafa74b196e5a6566598d3258ab5724f41bf8ad07aa927cccda220a2\"" Apr 24 23:59:57.948664 containerd[1974]: time="2026-04-24T23:59:57.948442278Z" level=info msg="StartContainer for \"19b22f5fcafa74b196e5a6566598d3258ab5724f41bf8ad07aa927cccda220a2\"" Apr 24 23:59:58.112028 systemd[1]: Started cri-containerd-19b22f5fcafa74b196e5a6566598d3258ab5724f41bf8ad07aa927cccda220a2.scope - libcontainer container 19b22f5fcafa74b196e5a6566598d3258ab5724f41bf8ad07aa927cccda220a2. 
Apr 24 23:59:58.320830 containerd[1974]: time="2026-04-24T23:59:58.319473632Z" level=info msg="StartContainer for \"19b22f5fcafa74b196e5a6566598d3258ab5724f41bf8ad07aa927cccda220a2\" returns successfully" Apr 24 23:59:58.396835 containerd[1974]: time="2026-04-24T23:59:58.396747668Z" level=error msg="Failed to destroy network for sandbox \"a74229bca0141ebbfacbfc3444d797adc52ee7068e24e94e1b8ede0b05b79cb9\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 24 23:59:58.402109 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-a74229bca0141ebbfacbfc3444d797adc52ee7068e24e94e1b8ede0b05b79cb9-shm.mount: Deactivated successfully. Apr 24 23:59:58.425186 containerd[1974]: time="2026-04-24T23:59:58.424472122Z" level=error msg="Failed to destroy network for sandbox \"2ce3b1122d54a7daa4a347d9e605f0fec8802b4c393fbb7e2fa28d979eca17ef\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 24 23:59:58.430114 containerd[1974]: time="2026-04-24T23:59:58.430056627Z" level=error msg="encountered an error cleaning up failed sandbox \"a74229bca0141ebbfacbfc3444d797adc52ee7068e24e94e1b8ede0b05b79cb9\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 24 23:59:58.430358 containerd[1974]: time="2026-04-24T23:59:58.430325754Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-6c4c456bdb-82zbl,Uid:40ce18e5-a962-4a82-921b-38508661dc53,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"a74229bca0141ebbfacbfc3444d797adc52ee7068e24e94e1b8ede0b05b79cb9\": plugin 
type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 24 23:59:58.432113 containerd[1974]: time="2026-04-24T23:59:58.431359900Z" level=error msg="Failed to destroy network for sandbox \"6c23267bc67df8fde82032bbee6b76532d9447c49eff630a05a6bc648e99c3a2\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 24 23:59:58.437525 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-2ce3b1122d54a7daa4a347d9e605f0fec8802b4c393fbb7e2fa28d979eca17ef-shm.mount: Deactivated successfully. Apr 24 23:59:58.438074 containerd[1974]: time="2026-04-24T23:59:58.438034234Z" level=error msg="encountered an error cleaning up failed sandbox \"6c23267bc67df8fde82032bbee6b76532d9447c49eff630a05a6bc648e99c3a2\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 24 23:59:58.438227 containerd[1974]: time="2026-04-24T23:59:58.438201797Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7564df694c-4d42l,Uid:e3d6ca38-0ac1-41b2-beef-170f1942102f,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"6c23267bc67df8fde82032bbee6b76532d9447c49eff630a05a6bc648e99c3a2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 24 23:59:58.438885 kubelet[3188]: E0424 23:59:58.438695 3188 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"a74229bca0141ebbfacbfc3444d797adc52ee7068e24e94e1b8ede0b05b79cb9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 24 23:59:58.440231 kubelet[3188]: E0424 23:59:58.439824 3188 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a74229bca0141ebbfacbfc3444d797adc52ee7068e24e94e1b8ede0b05b79cb9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-6c4c456bdb-82zbl" Apr 24 23:59:58.440231 kubelet[3188]: E0424 23:59:58.439873 3188 kuberuntime_manager.go:1558] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a74229bca0141ebbfacbfc3444d797adc52ee7068e24e94e1b8ede0b05b79cb9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-6c4c456bdb-82zbl" Apr 24 23:59:58.440380 containerd[1974]: time="2026-04-24T23:59:58.438610917Z" level=error msg="encountered an error cleaning up failed sandbox \"2ce3b1122d54a7daa4a347d9e605f0fec8802b4c393fbb7e2fa28d979eca17ef\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 24 23:59:58.440380 containerd[1974]: time="2026-04-24T23:59:58.440053853Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-wllpb,Uid:fafabee1-df27-491d-a48c-611faa0cd932,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox 
\"2ce3b1122d54a7daa4a347d9e605f0fec8802b4c393fbb7e2fa28d979eca17ef\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 24 23:59:58.440380 containerd[1974]: time="2026-04-24T23:59:58.438619119Z" level=error msg="Failed to destroy network for sandbox \"bebeb009b41842948801598b7bcb38ad25d4878ba127c98bfdec429caa35f703\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 24 23:59:58.441933 kubelet[3188]: E0424 23:59:58.439954 3188 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-6c4c456bdb-82zbl_calico-system(40ce18e5-a962-4a82-921b-38508661dc53)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-6c4c456bdb-82zbl_calico-system(40ce18e5-a962-4a82-921b-38508661dc53)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"a74229bca0141ebbfacbfc3444d797adc52ee7068e24e94e1b8ede0b05b79cb9\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-6c4c456bdb-82zbl" podUID="40ce18e5-a962-4a82-921b-38508661dc53" Apr 24 23:59:58.443916 kubelet[3188]: E0424 23:59:58.443338 3188 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2ce3b1122d54a7daa4a347d9e605f0fec8802b4c393fbb7e2fa28d979eca17ef\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 24 23:59:58.443916 kubelet[3188]: E0424 23:59:58.443411 3188 log.go:32] "RunPodSandbox from runtime service 
failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6c23267bc67df8fde82032bbee6b76532d9447c49eff630a05a6bc648e99c3a2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 24 23:59:58.443916 kubelet[3188]: E0424 23:59:58.443448 3188 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6c23267bc67df8fde82032bbee6b76532d9447c49eff630a05a6bc648e99c3a2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-7564df694c-4d42l" Apr 24 23:59:58.443916 kubelet[3188]: E0424 23:59:58.443476 3188 kuberuntime_manager.go:1558] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6c23267bc67df8fde82032bbee6b76532d9447c49eff630a05a6bc648e99c3a2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-7564df694c-4d42l" Apr 24 23:59:58.444155 kubelet[3188]: E0424 23:59:58.443535 3188 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-7564df694c-4d42l_calico-system(e3d6ca38-0ac1-41b2-beef-170f1942102f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-7564df694c-4d42l_calico-system(e3d6ca38-0ac1-41b2-beef-170f1942102f)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"6c23267bc67df8fde82032bbee6b76532d9447c49eff630a05a6bc648e99c3a2\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or 
directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-7564df694c-4d42l" podUID="e3d6ca38-0ac1-41b2-beef-170f1942102f" Apr 24 23:59:58.444155 kubelet[3188]: E0424 23:59:58.443690 3188 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2ce3b1122d54a7daa4a347d9e605f0fec8802b4c393fbb7e2fa28d979eca17ef\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-wllpb" Apr 24 23:59:58.444155 kubelet[3188]: E0424 23:59:58.443715 3188 kuberuntime_manager.go:1558] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2ce3b1122d54a7daa4a347d9e605f0fec8802b4c393fbb7e2fa28d979eca17ef\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-wllpb" Apr 24 23:59:58.444340 kubelet[3188]: E0424 23:59:58.443859 3188 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-wllpb_calico-system(fafabee1-df27-491d-a48c-611faa0cd932)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-wllpb_calico-system(fafabee1-df27-491d-a48c-611faa0cd932)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"2ce3b1122d54a7daa4a347d9e605f0fec8802b4c393fbb7e2fa28d979eca17ef\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-wllpb" podUID="fafabee1-df27-491d-a48c-611faa0cd932" Apr 24 23:59:58.444713 
containerd[1974]: time="2026-04-24T23:59:58.444679486Z" level=error msg="encountered an error cleaning up failed sandbox \"bebeb009b41842948801598b7bcb38ad25d4878ba127c98bfdec429caa35f703\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 24 23:59:58.444977 containerd[1974]: time="2026-04-24T23:59:58.444945241Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7d764666f9-q5wxg,Uid:8259ae9f-09c1-45c5-a26a-9fb19e805b35,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"bebeb009b41842948801598b7bcb38ad25d4878ba127c98bfdec429caa35f703\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 24 23:59:58.445515 kubelet[3188]: E0424 23:59:58.445318 3188 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bebeb009b41842948801598b7bcb38ad25d4878ba127c98bfdec429caa35f703\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 24 23:59:58.445515 kubelet[3188]: E0424 23:59:58.445356 3188 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bebeb009b41842948801598b7bcb38ad25d4878ba127c98bfdec429caa35f703\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7d764666f9-q5wxg" Apr 24 23:59:58.445515 kubelet[3188]: E0424 23:59:58.445373 3188 kuberuntime_manager.go:1558] 
"CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bebeb009b41842948801598b7bcb38ad25d4878ba127c98bfdec429caa35f703\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7d764666f9-q5wxg" Apr 24 23:59:58.445754 kubelet[3188]: E0424 23:59:58.445432 3188 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7d764666f9-q5wxg_kube-system(8259ae9f-09c1-45c5-a26a-9fb19e805b35)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7d764666f9-q5wxg_kube-system(8259ae9f-09c1-45c5-a26a-9fb19e805b35)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"bebeb009b41842948801598b7bcb38ad25d4878ba127c98bfdec429caa35f703\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7d764666f9-q5wxg" podUID="8259ae9f-09c1-45c5-a26a-9fb19e805b35" Apr 24 23:59:58.448076 containerd[1974]: time="2026-04-24T23:59:58.447296389Z" level=error msg="Failed to destroy network for sandbox \"2ee4d775ce5c79150773d1ce80c8df3b1c5c39e88a99f47d9ced5460c65af0d9\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 24 23:59:58.448076 containerd[1974]: time="2026-04-24T23:59:58.447635084Z" level=error msg="encountered an error cleaning up failed sandbox \"2ee4d775ce5c79150773d1ce80c8df3b1c5c39e88a99f47d9ced5460c65af0d9\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and 
has mounted /var/lib/calico/" Apr 24 23:59:58.448076 containerd[1974]: time="2026-04-24T23:59:58.447685729Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7d764666f9-zk545,Uid:651a4b24-dccb-49b4-b5f3-a4291d6e49f0,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"2ee4d775ce5c79150773d1ce80c8df3b1c5c39e88a99f47d9ced5460c65af0d9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 24 23:59:58.448428 kubelet[3188]: E0424 23:59:58.447858 3188 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2ee4d775ce5c79150773d1ce80c8df3b1c5c39e88a99f47d9ced5460c65af0d9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 24 23:59:58.448428 kubelet[3188]: E0424 23:59:58.447919 3188 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2ee4d775ce5c79150773d1ce80c8df3b1c5c39e88a99f47d9ced5460c65af0d9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7d764666f9-zk545" Apr 24 23:59:58.448428 kubelet[3188]: E0424 23:59:58.447940 3188 kuberuntime_manager.go:1558] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2ee4d775ce5c79150773d1ce80c8df3b1c5c39e88a99f47d9ced5460c65af0d9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7d764666f9-zk545" 
Apr 24 23:59:58.448581 kubelet[3188]: E0424 23:59:58.448007 3188 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7d764666f9-zk545_kube-system(651a4b24-dccb-49b4-b5f3-a4291d6e49f0)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7d764666f9-zk545_kube-system(651a4b24-dccb-49b4-b5f3-a4291d6e49f0)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"2ee4d775ce5c79150773d1ce80c8df3b1c5c39e88a99f47d9ced5460c65af0d9\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7d764666f9-zk545" podUID="651a4b24-dccb-49b4-b5f3-a4291d6e49f0" Apr 24 23:59:58.450848 containerd[1974]: time="2026-04-24T23:59:58.449047224Z" level=error msg="Failed to destroy network for sandbox \"a57e8fbabf45599afcad53283ca073fec4143ba9c8fd2c1081470b3c251d3dbe\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 24 23:59:58.450848 containerd[1974]: time="2026-04-24T23:59:58.450345641Z" level=error msg="encountered an error cleaning up failed sandbox \"a57e8fbabf45599afcad53283ca073fec4143ba9c8fd2c1081470b3c251d3dbe\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 24 23:59:58.450848 containerd[1974]: time="2026-04-24T23:59:58.450672146Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-9f7667bb8-vg777,Uid:b17085d6-14a5-4b5e-81f9-314340899d1e,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"a57e8fbabf45599afcad53283ca073fec4143ba9c8fd2c1081470b3c251d3dbe\": 
plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 24 23:59:58.450518 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-6c23267bc67df8fde82032bbee6b76532d9447c49eff630a05a6bc648e99c3a2-shm.mount: Deactivated successfully. Apr 24 23:59:58.450807 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-bebeb009b41842948801598b7bcb38ad25d4878ba127c98bfdec429caa35f703-shm.mount: Deactivated successfully. Apr 24 23:59:58.451690 kubelet[3188]: E0424 23:59:58.451331 3188 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a57e8fbabf45599afcad53283ca073fec4143ba9c8fd2c1081470b3c251d3dbe\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 24 23:59:58.451690 kubelet[3188]: E0424 23:59:58.451368 3188 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a57e8fbabf45599afcad53283ca073fec4143ba9c8fd2c1081470b3c251d3dbe\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-9f7667bb8-vg777" Apr 24 23:59:58.451690 kubelet[3188]: E0424 23:59:58.451394 3188 kuberuntime_manager.go:1558] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a57e8fbabf45599afcad53283ca073fec4143ba9c8fd2c1081470b3c251d3dbe\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-9f7667bb8-vg777" Apr 24 
23:59:58.451860 kubelet[3188]: E0424 23:59:58.451441 3188 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-9f7667bb8-vg777_calico-system(b17085d6-14a5-4b5e-81f9-314340899d1e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-9f7667bb8-vg777_calico-system(b17085d6-14a5-4b5e-81f9-314340899d1e)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"a57e8fbabf45599afcad53283ca073fec4143ba9c8fd2c1081470b3c251d3dbe\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-9f7667bb8-vg777" podUID="b17085d6-14a5-4b5e-81f9-314340899d1e" Apr 24 23:59:58.459570 containerd[1974]: time="2026-04-24T23:59:58.459038980Z" level=error msg="Failed to destroy network for sandbox \"47634a8599af1eac2e1de64fbc38d40803a22e32f2699057edc0a4d8b0f8fe2b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 24 23:59:58.459570 containerd[1974]: time="2026-04-24T23:59:58.459438092Z" level=error msg="encountered an error cleaning up failed sandbox \"47634a8599af1eac2e1de64fbc38d40803a22e32f2699057edc0a4d8b0f8fe2b\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 24 23:59:58.459570 containerd[1974]: time="2026-04-24T23:59:58.459504149Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-848b4486d7-brf5p,Uid:f1ad7ed5-b378-4647-9ebf-2546b6a4e4a2,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox 
\"47634a8599af1eac2e1de64fbc38d40803a22e32f2699057edc0a4d8b0f8fe2b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 24 23:59:58.460294 kubelet[3188]: E0424 23:59:58.460039 3188 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"47634a8599af1eac2e1de64fbc38d40803a22e32f2699057edc0a4d8b0f8fe2b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 24 23:59:58.460294 kubelet[3188]: E0424 23:59:58.460093 3188 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"47634a8599af1eac2e1de64fbc38d40803a22e32f2699057edc0a4d8b0f8fe2b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-848b4486d7-brf5p" Apr 24 23:59:58.460294 kubelet[3188]: E0424 23:59:58.460116 3188 kuberuntime_manager.go:1558] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"47634a8599af1eac2e1de64fbc38d40803a22e32f2699057edc0a4d8b0f8fe2b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-848b4486d7-brf5p" Apr 24 23:59:58.460503 kubelet[3188]: E0424 23:59:58.460198 3188 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-848b4486d7-brf5p_calico-system(f1ad7ed5-b378-4647-9ebf-2546b6a4e4a2)\" with CreatePodSandboxError: \"Failed to create sandbox for 
pod \\\"calico-apiserver-848b4486d7-brf5p_calico-system(f1ad7ed5-b378-4647-9ebf-2546b6a4e4a2)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"47634a8599af1eac2e1de64fbc38d40803a22e32f2699057edc0a4d8b0f8fe2b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-apiserver-848b4486d7-brf5p" podUID="f1ad7ed5-b378-4647-9ebf-2546b6a4e4a2" Apr 24 23:59:58.460963 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-a57e8fbabf45599afcad53283ca073fec4143ba9c8fd2c1081470b3c251d3dbe-shm.mount: Deactivated successfully. Apr 24 23:59:58.461107 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-2ee4d775ce5c79150773d1ce80c8df3b1c5c39e88a99f47d9ced5460c65af0d9-shm.mount: Deactivated successfully. Apr 24 23:59:58.468914 containerd[1974]: time="2026-04-24T23:59:58.468740337Z" level=error msg="Failed to destroy network for sandbox \"3941356acab4b724b296dce95d38ff7956ae5900b2224867cf4d9d02ad8ef6d9\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 24 23:59:58.469423 containerd[1974]: time="2026-04-24T23:59:58.469297619Z" level=error msg="encountered an error cleaning up failed sandbox \"3941356acab4b724b296dce95d38ff7956ae5900b2224867cf4d9d02ad8ef6d9\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 24 23:59:58.469423 containerd[1974]: time="2026-04-24T23:59:58.469367104Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-848b4486d7-dvwgq,Uid:d9ba9417-7859-4ec0-8a74-659edbbec7c4,Namespace:calico-system,Attempt:0,} failed, error" 
error="failed to setup network for sandbox \"3941356acab4b724b296dce95d38ff7956ae5900b2224867cf4d9d02ad8ef6d9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 24 23:59:58.470110 kubelet[3188]: E0424 23:59:58.469908 3188 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3941356acab4b724b296dce95d38ff7956ae5900b2224867cf4d9d02ad8ef6d9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 24 23:59:58.470110 kubelet[3188]: E0424 23:59:58.469967 3188 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3941356acab4b724b296dce95d38ff7956ae5900b2224867cf4d9d02ad8ef6d9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-848b4486d7-dvwgq" Apr 24 23:59:58.470110 kubelet[3188]: E0424 23:59:58.469983 3188 kuberuntime_manager.go:1558] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3941356acab4b724b296dce95d38ff7956ae5900b2224867cf4d9d02ad8ef6d9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-848b4486d7-dvwgq" Apr 24 23:59:58.470328 kubelet[3188]: E0424 23:59:58.470049 3188 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-848b4486d7-dvwgq_calico-system(d9ba9417-7859-4ec0-8a74-659edbbec7c4)\" with 
CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-848b4486d7-dvwgq_calico-system(d9ba9417-7859-4ec0-8a74-659edbbec7c4)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"3941356acab4b724b296dce95d38ff7956ae5900b2224867cf4d9d02ad8ef6d9\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-apiserver-848b4486d7-dvwgq" podUID="d9ba9417-7859-4ec0-8a74-659edbbec7c4" Apr 24 23:59:58.742419 kubelet[3188]: I0424 23:59:58.741834 3188 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bebeb009b41842948801598b7bcb38ad25d4878ba127c98bfdec429caa35f703" Apr 24 23:59:58.773435 containerd[1974]: time="2026-04-24T23:59:58.773373401Z" level=info msg="StopPodSandbox for \"bebeb009b41842948801598b7bcb38ad25d4878ba127c98bfdec429caa35f703\"" Apr 24 23:59:58.776491 containerd[1974]: time="2026-04-24T23:59:58.776426532Z" level=info msg="Ensure that sandbox bebeb009b41842948801598b7bcb38ad25d4878ba127c98bfdec429caa35f703 in task-service has been cleanup successfully" Apr 24 23:59:58.785716 kubelet[3188]: I0424 23:59:58.783694 3188 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a57e8fbabf45599afcad53283ca073fec4143ba9c8fd2c1081470b3c251d3dbe" Apr 24 23:59:58.786686 containerd[1974]: time="2026-04-24T23:59:58.786610689Z" level=info msg="StopPodSandbox for \"a57e8fbabf45599afcad53283ca073fec4143ba9c8fd2c1081470b3c251d3dbe\"" Apr 24 23:59:58.787122 containerd[1974]: time="2026-04-24T23:59:58.787082287Z" level=info msg="Ensure that sandbox a57e8fbabf45599afcad53283ca073fec4143ba9c8fd2c1081470b3c251d3dbe in task-service has been cleanup successfully" Apr 24 23:59:58.804556 kubelet[3188]: I0424 23:59:58.804521 3188 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="47634a8599af1eac2e1de64fbc38d40803a22e32f2699057edc0a4d8b0f8fe2b" Apr 24 23:59:58.817438 containerd[1974]: time="2026-04-24T23:59:58.816430768Z" level=info msg="StopPodSandbox for \"47634a8599af1eac2e1de64fbc38d40803a22e32f2699057edc0a4d8b0f8fe2b\"" Apr 24 23:59:58.817438 containerd[1974]: time="2026-04-24T23:59:58.817234697Z" level=info msg="Ensure that sandbox 47634a8599af1eac2e1de64fbc38d40803a22e32f2699057edc0a4d8b0f8fe2b in task-service has been cleanup successfully" Apr 24 23:59:58.836478 kubelet[3188]: I0424 23:59:58.836446 3188 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2ce3b1122d54a7daa4a347d9e605f0fec8802b4c393fbb7e2fa28d979eca17ef" Apr 24 23:59:58.846299 containerd[1974]: time="2026-04-24T23:59:58.846257010Z" level=info msg="StopPodSandbox for \"2ce3b1122d54a7daa4a347d9e605f0fec8802b4c393fbb7e2fa28d979eca17ef\"" Apr 24 23:59:58.847808 containerd[1974]: time="2026-04-24T23:59:58.846636058Z" level=info msg="Ensure that sandbox 2ce3b1122d54a7daa4a347d9e605f0fec8802b4c393fbb7e2fa28d979eca17ef in task-service has been cleanup successfully" Apr 24 23:59:58.847915 kubelet[3188]: I0424 23:59:58.847036 3188 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3941356acab4b724b296dce95d38ff7956ae5900b2224867cf4d9d02ad8ef6d9" Apr 24 23:59:58.855984 containerd[1974]: time="2026-04-24T23:59:58.855935219Z" level=info msg="StopPodSandbox for \"3941356acab4b724b296dce95d38ff7956ae5900b2224867cf4d9d02ad8ef6d9\"" Apr 24 23:59:58.856185 containerd[1974]: time="2026-04-24T23:59:58.856155649Z" level=info msg="Ensure that sandbox 3941356acab4b724b296dce95d38ff7956ae5900b2224867cf4d9d02ad8ef6d9 in task-service has been cleanup successfully" Apr 24 23:59:58.857944 kubelet[3188]: I0424 23:59:58.857902 3188 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6c23267bc67df8fde82032bbee6b76532d9447c49eff630a05a6bc648e99c3a2" Apr 24 23:59:58.865285 containerd[1974]: 
time="2026-04-24T23:59:58.865144103Z" level=info msg="StopPodSandbox for \"6c23267bc67df8fde82032bbee6b76532d9447c49eff630a05a6bc648e99c3a2\"" Apr 24 23:59:58.865408 containerd[1974]: time="2026-04-24T23:59:58.865376356Z" level=info msg="Ensure that sandbox 6c23267bc67df8fde82032bbee6b76532d9447c49eff630a05a6bc648e99c3a2 in task-service has been cleanup successfully" Apr 24 23:59:58.892765 kubelet[3188]: I0424 23:59:58.892685 3188 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a74229bca0141ebbfacbfc3444d797adc52ee7068e24e94e1b8ede0b05b79cb9" Apr 24 23:59:58.898589 containerd[1974]: time="2026-04-24T23:59:58.898146768Z" level=info msg="StopPodSandbox for \"a74229bca0141ebbfacbfc3444d797adc52ee7068e24e94e1b8ede0b05b79cb9\"" Apr 24 23:59:58.898589 containerd[1974]: time="2026-04-24T23:59:58.898385532Z" level=info msg="Ensure that sandbox a74229bca0141ebbfacbfc3444d797adc52ee7068e24e94e1b8ede0b05b79cb9 in task-service has been cleanup successfully" Apr 24 23:59:58.899594 kubelet[3188]: I0424 23:59:58.899436 3188 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2ee4d775ce5c79150773d1ce80c8df3b1c5c39e88a99f47d9ced5460c65af0d9" Apr 24 23:59:58.903184 containerd[1974]: time="2026-04-24T23:59:58.903013752Z" level=info msg="StopPodSandbox for \"2ee4d775ce5c79150773d1ce80c8df3b1c5c39e88a99f47d9ced5460c65af0d9\"" Apr 24 23:59:58.903597 containerd[1974]: time="2026-04-24T23:59:58.903557441Z" level=info msg="Ensure that sandbox 2ee4d775ce5c79150773d1ce80c8df3b1c5c39e88a99f47d9ced5460c65af0d9 in task-service has been cleanup successfully" Apr 24 23:59:59.010841 containerd[1974]: time="2026-04-24T23:59:59.009834998Z" level=error msg="StopPodSandbox for \"a74229bca0141ebbfacbfc3444d797adc52ee7068e24e94e1b8ede0b05b79cb9\" failed" error="failed to destroy network for sandbox \"a74229bca0141ebbfacbfc3444d797adc52ee7068e24e94e1b8ede0b05b79cb9\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: 
no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 24 23:59:59.012397 kubelet[3188]: E0424 23:59:59.010132 3188 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"a74229bca0141ebbfacbfc3444d797adc52ee7068e24e94e1b8ede0b05b79cb9\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="a74229bca0141ebbfacbfc3444d797adc52ee7068e24e94e1b8ede0b05b79cb9" Apr 24 23:59:59.012397 kubelet[3188]: E0424 23:59:59.010199 3188 kuberuntime_manager.go:1881] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"a74229bca0141ebbfacbfc3444d797adc52ee7068e24e94e1b8ede0b05b79cb9"} Apr 24 23:59:59.012397 kubelet[3188]: E0424 23:59:59.010270 3188 kuberuntime_manager.go:1422] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"40ce18e5-a962-4a82-921b-38508661dc53\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"a74229bca0141ebbfacbfc3444d797adc52ee7068e24e94e1b8ede0b05b79cb9\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Apr 24 23:59:59.012397 kubelet[3188]: E0424 23:59:59.010307 3188 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"40ce18e5-a962-4a82-921b-38508661dc53\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"a74229bca0141ebbfacbfc3444d797adc52ee7068e24e94e1b8ede0b05b79cb9\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" 
pod="calico-system/whisker-6c4c456bdb-82zbl" podUID="40ce18e5-a962-4a82-921b-38508661dc53" Apr 24 23:59:59.031731 containerd[1974]: time="2026-04-24T23:59:59.030912477Z" level=error msg="StopPodSandbox for \"6c23267bc67df8fde82032bbee6b76532d9447c49eff630a05a6bc648e99c3a2\" failed" error="failed to destroy network for sandbox \"6c23267bc67df8fde82032bbee6b76532d9447c49eff630a05a6bc648e99c3a2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 24 23:59:59.032331 kubelet[3188]: E0424 23:59:59.031272 3188 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"6c23267bc67df8fde82032bbee6b76532d9447c49eff630a05a6bc648e99c3a2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="6c23267bc67df8fde82032bbee6b76532d9447c49eff630a05a6bc648e99c3a2" Apr 24 23:59:59.032331 kubelet[3188]: E0424 23:59:59.031321 3188 kuberuntime_manager.go:1881] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"6c23267bc67df8fde82032bbee6b76532d9447c49eff630a05a6bc648e99c3a2"} Apr 24 23:59:59.032331 kubelet[3188]: E0424 23:59:59.031364 3188 kuberuntime_manager.go:1422] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"e3d6ca38-0ac1-41b2-beef-170f1942102f\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"6c23267bc67df8fde82032bbee6b76532d9447c49eff630a05a6bc648e99c3a2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Apr 24 23:59:59.032331 kubelet[3188]: E0424 23:59:59.031397 3188 pod_workers.go:1324] "Error syncing pod, 
skipping" err="failed to \"KillPodSandbox\" for \"e3d6ca38-0ac1-41b2-beef-170f1942102f\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"6c23267bc67df8fde82032bbee6b76532d9447c49eff630a05a6bc648e99c3a2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-7564df694c-4d42l" podUID="e3d6ca38-0ac1-41b2-beef-170f1942102f" Apr 24 23:59:59.056477 containerd[1974]: time="2026-04-24T23:59:59.056411368Z" level=error msg="StopPodSandbox for \"bebeb009b41842948801598b7bcb38ad25d4878ba127c98bfdec429caa35f703\" failed" error="failed to destroy network for sandbox \"bebeb009b41842948801598b7bcb38ad25d4878ba127c98bfdec429caa35f703\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 24 23:59:59.056841 kubelet[3188]: E0424 23:59:59.056751 3188 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"bebeb009b41842948801598b7bcb38ad25d4878ba127c98bfdec429caa35f703\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="bebeb009b41842948801598b7bcb38ad25d4878ba127c98bfdec429caa35f703" Apr 24 23:59:59.056841 kubelet[3188]: E0424 23:59:59.056827 3188 kuberuntime_manager.go:1881] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"bebeb009b41842948801598b7bcb38ad25d4878ba127c98bfdec429caa35f703"} Apr 24 23:59:59.058870 kubelet[3188]: E0424 23:59:59.056872 3188 kuberuntime_manager.go:1422] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"8259ae9f-09c1-45c5-a26a-9fb19e805b35\" 
with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"bebeb009b41842948801598b7bcb38ad25d4878ba127c98bfdec429caa35f703\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Apr 24 23:59:59.058870 kubelet[3188]: E0424 23:59:59.056911 3188 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"8259ae9f-09c1-45c5-a26a-9fb19e805b35\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"bebeb009b41842948801598b7bcb38ad25d4878ba127c98bfdec429caa35f703\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7d764666f9-q5wxg" podUID="8259ae9f-09c1-45c5-a26a-9fb19e805b35" Apr 24 23:59:59.092705 containerd[1974]: time="2026-04-24T23:59:59.092647515Z" level=error msg="StopPodSandbox for \"a57e8fbabf45599afcad53283ca073fec4143ba9c8fd2c1081470b3c251d3dbe\" failed" error="failed to destroy network for sandbox \"a57e8fbabf45599afcad53283ca073fec4143ba9c8fd2c1081470b3c251d3dbe\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 24 23:59:59.094861 kubelet[3188]: E0424 23:59:59.093609 3188 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"a57e8fbabf45599afcad53283ca073fec4143ba9c8fd2c1081470b3c251d3dbe\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
podSandboxID="a57e8fbabf45599afcad53283ca073fec4143ba9c8fd2c1081470b3c251d3dbe" Apr 24 23:59:59.094861 kubelet[3188]: E0424 23:59:59.093663 3188 kuberuntime_manager.go:1881] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"a57e8fbabf45599afcad53283ca073fec4143ba9c8fd2c1081470b3c251d3dbe"} Apr 24 23:59:59.094861 kubelet[3188]: E0424 23:59:59.093705 3188 kuberuntime_manager.go:1422] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"b17085d6-14a5-4b5e-81f9-314340899d1e\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"a57e8fbabf45599afcad53283ca073fec4143ba9c8fd2c1081470b3c251d3dbe\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Apr 24 23:59:59.094861 kubelet[3188]: E0424 23:59:59.093743 3188 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"b17085d6-14a5-4b5e-81f9-314340899d1e\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"a57e8fbabf45599afcad53283ca073fec4143ba9c8fd2c1081470b3c251d3dbe\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-9f7667bb8-vg777" podUID="b17085d6-14a5-4b5e-81f9-314340899d1e" Apr 24 23:59:59.097304 containerd[1974]: time="2026-04-24T23:59:59.097138626Z" level=error msg="StopPodSandbox for \"47634a8599af1eac2e1de64fbc38d40803a22e32f2699057edc0a4d8b0f8fe2b\" failed" error="failed to destroy network for sandbox \"47634a8599af1eac2e1de64fbc38d40803a22e32f2699057edc0a4d8b0f8fe2b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/" Apr 24 23:59:59.097418 kubelet[3188]: E0424 23:59:59.097361 3188 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"47634a8599af1eac2e1de64fbc38d40803a22e32f2699057edc0a4d8b0f8fe2b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="47634a8599af1eac2e1de64fbc38d40803a22e32f2699057edc0a4d8b0f8fe2b" Apr 24 23:59:59.097418 kubelet[3188]: E0424 23:59:59.097406 3188 kuberuntime_manager.go:1881] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"47634a8599af1eac2e1de64fbc38d40803a22e32f2699057edc0a4d8b0f8fe2b"} Apr 24 23:59:59.097525 kubelet[3188]: E0424 23:59:59.097452 3188 kuberuntime_manager.go:1422] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"f1ad7ed5-b378-4647-9ebf-2546b6a4e4a2\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"47634a8599af1eac2e1de64fbc38d40803a22e32f2699057edc0a4d8b0f8fe2b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Apr 24 23:59:59.097525 kubelet[3188]: E0424 23:59:59.097490 3188 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"f1ad7ed5-b378-4647-9ebf-2546b6a4e4a2\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"47634a8599af1eac2e1de64fbc38d40803a22e32f2699057edc0a4d8b0f8fe2b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-apiserver-848b4486d7-brf5p" podUID="f1ad7ed5-b378-4647-9ebf-2546b6a4e4a2" Apr 24 
23:59:59.107030 containerd[1974]: time="2026-04-24T23:59:59.106961232Z" level=error msg="StopPodSandbox for \"2ee4d775ce5c79150773d1ce80c8df3b1c5c39e88a99f47d9ced5460c65af0d9\" failed" error="failed to destroy network for sandbox \"2ee4d775ce5c79150773d1ce80c8df3b1c5c39e88a99f47d9ced5460c65af0d9\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 24 23:59:59.107388 kubelet[3188]: E0424 23:59:59.107342 3188 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2ee4d775ce5c79150773d1ce80c8df3b1c5c39e88a99f47d9ced5460c65af0d9\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2ee4d775ce5c79150773d1ce80c8df3b1c5c39e88a99f47d9ced5460c65af0d9" Apr 24 23:59:59.107488 kubelet[3188]: E0424 23:59:59.107416 3188 kuberuntime_manager.go:1881] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"2ee4d775ce5c79150773d1ce80c8df3b1c5c39e88a99f47d9ced5460c65af0d9"} Apr 24 23:59:59.107488 kubelet[3188]: E0424 23:59:59.107475 3188 kuberuntime_manager.go:1422] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"651a4b24-dccb-49b4-b5f3-a4291d6e49f0\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2ee4d775ce5c79150773d1ce80c8df3b1c5c39e88a99f47d9ced5460c65af0d9\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Apr 24 23:59:59.107642 kubelet[3188]: E0424 23:59:59.107514 3188 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"651a4b24-dccb-49b4-b5f3-a4291d6e49f0\" with 
KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2ee4d775ce5c79150773d1ce80c8df3b1c5c39e88a99f47d9ced5460c65af0d9\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7d764666f9-zk545" podUID="651a4b24-dccb-49b4-b5f3-a4291d6e49f0" Apr 24 23:59:59.123496 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-3941356acab4b724b296dce95d38ff7956ae5900b2224867cf4d9d02ad8ef6d9-shm.mount: Deactivated successfully. Apr 24 23:59:59.123647 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-47634a8599af1eac2e1de64fbc38d40803a22e32f2699057edc0a4d8b0f8fe2b-shm.mount: Deactivated successfully. Apr 24 23:59:59.132547 containerd[1974]: time="2026-04-24T23:59:59.132493222Z" level=error msg="StopPodSandbox for \"3941356acab4b724b296dce95d38ff7956ae5900b2224867cf4d9d02ad8ef6d9\" failed" error="failed to destroy network for sandbox \"3941356acab4b724b296dce95d38ff7956ae5900b2224867cf4d9d02ad8ef6d9\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 24 23:59:59.133150 kubelet[3188]: E0424 23:59:59.132964 3188 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"3941356acab4b724b296dce95d38ff7956ae5900b2224867cf4d9d02ad8ef6d9\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="3941356acab4b724b296dce95d38ff7956ae5900b2224867cf4d9d02ad8ef6d9" Apr 24 23:59:59.133150 kubelet[3188]: E0424 23:59:59.133023 3188 kuberuntime_manager.go:1881] "Failed to stop sandbox" 
podSandboxID={"Type":"containerd","ID":"3941356acab4b724b296dce95d38ff7956ae5900b2224867cf4d9d02ad8ef6d9"} Apr 24 23:59:59.133150 kubelet[3188]: E0424 23:59:59.133062 3188 kuberuntime_manager.go:1422] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d9ba9417-7859-4ec0-8a74-659edbbec7c4\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"3941356acab4b724b296dce95d38ff7956ae5900b2224867cf4d9d02ad8ef6d9\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Apr 24 23:59:59.133150 kubelet[3188]: E0424 23:59:59.133100 3188 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d9ba9417-7859-4ec0-8a74-659edbbec7c4\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"3941356acab4b724b296dce95d38ff7956ae5900b2224867cf4d9d02ad8ef6d9\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-apiserver-848b4486d7-dvwgq" podUID="d9ba9417-7859-4ec0-8a74-659edbbec7c4" Apr 24 23:59:59.139326 containerd[1974]: time="2026-04-24T23:59:59.139269097Z" level=error msg="StopPodSandbox for \"2ce3b1122d54a7daa4a347d9e605f0fec8802b4c393fbb7e2fa28d979eca17ef\" failed" error="failed to destroy network for sandbox \"2ce3b1122d54a7daa4a347d9e605f0fec8802b4c393fbb7e2fa28d979eca17ef\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 24 23:59:59.139580 kubelet[3188]: E0424 23:59:59.139525 3188 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for 
sandbox \"2ce3b1122d54a7daa4a347d9e605f0fec8802b4c393fbb7e2fa28d979eca17ef\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2ce3b1122d54a7daa4a347d9e605f0fec8802b4c393fbb7e2fa28d979eca17ef" Apr 24 23:59:59.139686 kubelet[3188]: E0424 23:59:59.139577 3188 kuberuntime_manager.go:1881] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"2ce3b1122d54a7daa4a347d9e605f0fec8802b4c393fbb7e2fa28d979eca17ef"} Apr 24 23:59:59.139686 kubelet[3188]: E0424 23:59:59.139618 3188 kuberuntime_manager.go:1422] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"fafabee1-df27-491d-a48c-611faa0cd932\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2ce3b1122d54a7daa4a347d9e605f0fec8802b4c393fbb7e2fa28d979eca17ef\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Apr 24 23:59:59.139686 kubelet[3188]: E0424 23:59:59.139656 3188 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"fafabee1-df27-491d-a48c-611faa0cd932\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2ce3b1122d54a7daa4a347d9e605f0fec8802b4c393fbb7e2fa28d979eca17ef\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-wllpb" podUID="fafabee1-df27-491d-a48c-611faa0cd932" Apr 24 23:59:59.360694 kubelet[3188]: I0424 23:59:59.360200 3188 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="calico-system/calico-node-hfhdv" podStartSLOduration=2.428214394 
podStartE2EDuration="24.354130709s" podCreationTimestamp="2026-04-24 23:59:35 +0000 UTC" firstStartedPulling="2026-04-24 23:59:35.819910999 +0000 UTC m=+18.489039718" lastFinishedPulling="2026-04-24 23:59:57.745827298 +0000 UTC m=+40.414956033" observedRunningTime="2026-04-24 23:59:58.802962253 +0000 UTC m=+41.472090994" watchObservedRunningTime="2026-04-24 23:59:59.354130709 +0000 UTC m=+42.023259461" Apr 24 23:59:59.903804 containerd[1974]: time="2026-04-24T23:59:59.903475768Z" level=info msg="StopPodSandbox for \"a74229bca0141ebbfacbfc3444d797adc52ee7068e24e94e1b8ede0b05b79cb9\"" Apr 25 00:00:00.153691 containerd[1974]: 2026-04-25 00:00:00.039 [INFO][4629] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="a74229bca0141ebbfacbfc3444d797adc52ee7068e24e94e1b8ede0b05b79cb9" Apr 25 00:00:00.153691 containerd[1974]: 2026-04-25 00:00:00.041 [INFO][4629] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="a74229bca0141ebbfacbfc3444d797adc52ee7068e24e94e1b8ede0b05b79cb9" iface="eth0" netns="/var/run/netns/cni-0da4512a-e4ec-e5cb-71da-5f19a3c96a49" Apr 25 00:00:00.153691 containerd[1974]: 2026-04-25 00:00:00.042 [INFO][4629] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="a74229bca0141ebbfacbfc3444d797adc52ee7068e24e94e1b8ede0b05b79cb9" iface="eth0" netns="/var/run/netns/cni-0da4512a-e4ec-e5cb-71da-5f19a3c96a49" Apr 25 00:00:00.153691 containerd[1974]: 2026-04-25 00:00:00.044 [INFO][4629] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="a74229bca0141ebbfacbfc3444d797adc52ee7068e24e94e1b8ede0b05b79cb9" iface="eth0" netns="/var/run/netns/cni-0da4512a-e4ec-e5cb-71da-5f19a3c96a49" Apr 25 00:00:00.153691 containerd[1974]: 2026-04-25 00:00:00.044 [INFO][4629] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="a74229bca0141ebbfacbfc3444d797adc52ee7068e24e94e1b8ede0b05b79cb9" Apr 25 00:00:00.153691 containerd[1974]: 2026-04-25 00:00:00.045 [INFO][4629] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="a74229bca0141ebbfacbfc3444d797adc52ee7068e24e94e1b8ede0b05b79cb9" Apr 25 00:00:00.153691 containerd[1974]: 2026-04-25 00:00:00.139 [INFO][4650] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="a74229bca0141ebbfacbfc3444d797adc52ee7068e24e94e1b8ede0b05b79cb9" HandleID="k8s-pod-network.a74229bca0141ebbfacbfc3444d797adc52ee7068e24e94e1b8ede0b05b79cb9" Workload="ip--172--31--31--110-k8s-whisker--6c4c456bdb--82zbl-eth0" Apr 25 00:00:00.153691 containerd[1974]: 2026-04-25 00:00:00.139 [INFO][4650] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 25 00:00:00.153691 containerd[1974]: 2026-04-25 00:00:00.139 [INFO][4650] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 25 00:00:00.153691 containerd[1974]: 2026-04-25 00:00:00.146 [WARNING][4650] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="a74229bca0141ebbfacbfc3444d797adc52ee7068e24e94e1b8ede0b05b79cb9" HandleID="k8s-pod-network.a74229bca0141ebbfacbfc3444d797adc52ee7068e24e94e1b8ede0b05b79cb9" Workload="ip--172--31--31--110-k8s-whisker--6c4c456bdb--82zbl-eth0" Apr 25 00:00:00.153691 containerd[1974]: 2026-04-25 00:00:00.146 [INFO][4650] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="a74229bca0141ebbfacbfc3444d797adc52ee7068e24e94e1b8ede0b05b79cb9" HandleID="k8s-pod-network.a74229bca0141ebbfacbfc3444d797adc52ee7068e24e94e1b8ede0b05b79cb9" Workload="ip--172--31--31--110-k8s-whisker--6c4c456bdb--82zbl-eth0" Apr 25 00:00:00.153691 containerd[1974]: 2026-04-25 00:00:00.148 [INFO][4650] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 25 00:00:00.153691 containerd[1974]: 2026-04-25 00:00:00.151 [INFO][4629] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="a74229bca0141ebbfacbfc3444d797adc52ee7068e24e94e1b8ede0b05b79cb9" Apr 25 00:00:00.155874 containerd[1974]: time="2026-04-25T00:00:00.155686425Z" level=info msg="TearDown network for sandbox \"a74229bca0141ebbfacbfc3444d797adc52ee7068e24e94e1b8ede0b05b79cb9\" successfully" Apr 25 00:00:00.155874 containerd[1974]: time="2026-04-25T00:00:00.155739447Z" level=info msg="StopPodSandbox for \"a74229bca0141ebbfacbfc3444d797adc52ee7068e24e94e1b8ede0b05b79cb9\" returns successfully" Apr 25 00:00:00.158553 systemd[1]: run-netns-cni\x2d0da4512a\x2de4ec\x2de5cb\x2d71da\x2d5f19a3c96a49.mount: Deactivated successfully. Apr 25 00:00:00.183319 systemd[1]: Started logrotate.service - Rotate and Compress System Logs. Apr 25 00:00:00.241909 systemd[1]: logrotate.service: Deactivated successfully. 
Apr 25 00:00:00.297466 kubelet[3188]: I0425 00:00:00.296984 3188 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"kubernetes.io/projected/40ce18e5-a962-4a82-921b-38508661dc53-kube-api-access-knddk\" (UniqueName: \"kubernetes.io/projected/40ce18e5-a962-4a82-921b-38508661dc53-kube-api-access-knddk\") pod \"40ce18e5-a962-4a82-921b-38508661dc53\" (UID: \"40ce18e5-a962-4a82-921b-38508661dc53\") " Apr 25 00:00:00.297466 kubelet[3188]: I0425 00:00:00.297073 3188 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"kubernetes.io/configmap/40ce18e5-a962-4a82-921b-38508661dc53-nginx-config\" (UniqueName: \"kubernetes.io/configmap/40ce18e5-a962-4a82-921b-38508661dc53-nginx-config\") pod \"40ce18e5-a962-4a82-921b-38508661dc53\" (UID: \"40ce18e5-a962-4a82-921b-38508661dc53\") " Apr 25 00:00:00.297466 kubelet[3188]: I0425 00:00:00.297113 3188 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"kubernetes.io/configmap/40ce18e5-a962-4a82-921b-38508661dc53-whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/40ce18e5-a962-4a82-921b-38508661dc53-whisker-ca-bundle\") pod \"40ce18e5-a962-4a82-921b-38508661dc53\" (UID: \"40ce18e5-a962-4a82-921b-38508661dc53\") " Apr 25 00:00:00.297466 kubelet[3188]: I0425 00:00:00.297149 3188 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"kubernetes.io/secret/40ce18e5-a962-4a82-921b-38508661dc53-whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/40ce18e5-a962-4a82-921b-38508661dc53-whisker-backend-key-pair\") pod \"40ce18e5-a962-4a82-921b-38508661dc53\" (UID: \"40ce18e5-a962-4a82-921b-38508661dc53\") " Apr 25 00:00:00.307595 systemd[1]: var-lib-kubelet-pods-40ce18e5\x2da962\x2d4a82\x2d921b\x2d38508661dc53-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. 
Apr 25 00:00:00.307955 systemd[1]: var-lib-kubelet-pods-40ce18e5\x2da962\x2d4a82\x2d921b\x2d38508661dc53-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dknddk.mount: Deactivated successfully. Apr 25 00:00:00.320028 kubelet[3188]: I0425 00:00:00.319967 3188 operation_generator.go:779] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40ce18e5-a962-4a82-921b-38508661dc53-whisker-backend-key-pair" pod "40ce18e5-a962-4a82-921b-38508661dc53" (UID: "40ce18e5-a962-4a82-921b-38508661dc53"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 25 00:00:00.320178 kubelet[3188]: I0425 00:00:00.320092 3188 operation_generator.go:779] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40ce18e5-a962-4a82-921b-38508661dc53-kube-api-access-knddk" pod "40ce18e5-a962-4a82-921b-38508661dc53" (UID: "40ce18e5-a962-4a82-921b-38508661dc53"). InnerVolumeSpecName "kube-api-access-knddk". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 25 00:00:00.320178 kubelet[3188]: I0425 00:00:00.320118 3188 operation_generator.go:779] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40ce18e5-a962-4a82-921b-38508661dc53-nginx-config" pod "40ce18e5-a962-4a82-921b-38508661dc53" (UID: "40ce18e5-a962-4a82-921b-38508661dc53"). InnerVolumeSpecName "nginx-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 25 00:00:00.322091 kubelet[3188]: I0425 00:00:00.322041 3188 operation_generator.go:779] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40ce18e5-a962-4a82-921b-38508661dc53-whisker-ca-bundle" pod "40ce18e5-a962-4a82-921b-38508661dc53" (UID: "40ce18e5-a962-4a82-921b-38508661dc53"). InnerVolumeSpecName "whisker-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 25 00:00:00.397734 kubelet[3188]: I0425 00:00:00.397682 3188 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/40ce18e5-a962-4a82-921b-38508661dc53-whisker-backend-key-pair\") on node \"ip-172-31-31-110\" DevicePath \"\"" Apr 25 00:00:00.397734 kubelet[3188]: I0425 00:00:00.397722 3188 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-knddk\" (UniqueName: \"kubernetes.io/projected/40ce18e5-a962-4a82-921b-38508661dc53-kube-api-access-knddk\") on node \"ip-172-31-31-110\" DevicePath \"\"" Apr 25 00:00:00.397734 kubelet[3188]: I0425 00:00:00.397736 3188 reconciler_common.go:299] "Volume detached for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/40ce18e5-a962-4a82-921b-38508661dc53-nginx-config\") on node \"ip-172-31-31-110\" DevicePath \"\"" Apr 25 00:00:00.397734 kubelet[3188]: I0425 00:00:00.397747 3188 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/40ce18e5-a962-4a82-921b-38508661dc53-whisker-ca-bundle\") on node \"ip-172-31-31-110\" DevicePath \"\"" Apr 25 00:00:01.090305 systemd[1]: run-containerd-runc-k8s.io-19b22f5fcafa74b196e5a6566598d3258ab5724f41bf8ad07aa927cccda220a2-runc.CctJdb.mount: Deactivated successfully. Apr 25 00:00:01.097562 systemd[1]: Removed slice kubepods-besteffort-pod40ce18e5_a962_4a82_921b_38508661dc53.slice - libcontainer container kubepods-besteffort-pod40ce18e5_a962_4a82_921b_38508661dc53.slice. Apr 25 00:00:01.337303 systemd[1]: Created slice kubepods-besteffort-podeb91e8be_9b00_4d67_95fb_456564e752d1.slice - libcontainer container kubepods-besteffort-podeb91e8be_9b00_4d67_95fb_456564e752d1.slice. 
Apr 25 00:00:01.401353 kubelet[3188]: I0425 00:00:01.400428 3188 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhrqr\" (UniqueName: \"kubernetes.io/projected/eb91e8be-9b00-4d67-95fb-456564e752d1-kube-api-access-qhrqr\") pod \"whisker-86bbf7df67-bhhtx\" (UID: \"eb91e8be-9b00-4d67-95fb-456564e752d1\") " pod="calico-system/whisker-86bbf7df67-bhhtx" Apr 25 00:00:01.405370 kubelet[3188]: I0425 00:00:01.404948 3188 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/eb91e8be-9b00-4d67-95fb-456564e752d1-whisker-backend-key-pair\") pod \"whisker-86bbf7df67-bhhtx\" (UID: \"eb91e8be-9b00-4d67-95fb-456564e752d1\") " pod="calico-system/whisker-86bbf7df67-bhhtx" Apr 25 00:00:01.405370 kubelet[3188]: I0425 00:00:01.405011 3188 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eb91e8be-9b00-4d67-95fb-456564e752d1-whisker-ca-bundle\") pod \"whisker-86bbf7df67-bhhtx\" (UID: \"eb91e8be-9b00-4d67-95fb-456564e752d1\") " pod="calico-system/whisker-86bbf7df67-bhhtx" Apr 25 00:00:01.405370 kubelet[3188]: I0425 00:00:01.405106 3188 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/eb91e8be-9b00-4d67-95fb-456564e752d1-nginx-config\") pod \"whisker-86bbf7df67-bhhtx\" (UID: \"eb91e8be-9b00-4d67-95fb-456564e752d1\") " pod="calico-system/whisker-86bbf7df67-bhhtx" Apr 25 00:00:01.586885 kubelet[3188]: I0425 00:00:01.586535 3188 kubelet_volumes.go:161] "Cleaned up orphaned pod volumes dir" podUID="40ce18e5-a962-4a82-921b-38508661dc53" path="/var/lib/kubelet/pods/40ce18e5-a962-4a82-921b-38508661dc53/volumes" Apr 25 00:00:01.651530 containerd[1974]: time="2026-04-25T00:00:01.651387495Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:whisker-86bbf7df67-bhhtx,Uid:eb91e8be-9b00-4d67-95fb-456564e752d1,Namespace:calico-system,Attempt:0,}" Apr 25 00:00:02.176822 kernel: calico-node[4717]: memfd_create() called without MFD_EXEC or MFD_NOEXEC_SEAL set Apr 25 00:00:02.664911 systemd-networkd[1892]: cali5c989fdc60a: Link UP Apr 25 00:00:02.666383 systemd-networkd[1892]: cali5c989fdc60a: Gained carrier Apr 25 00:00:02.760413 (udev-worker)[4835]: Network interface NamePolicy= disabled on kernel command line. Apr 25 00:00:02.899378 containerd[1974]: 2026-04-25 00:00:02.005 [INFO][4809] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--31--110-k8s-whisker--86bbf7df67--bhhtx-eth0 whisker-86bbf7df67- calico-system eb91e8be-9b00-4d67-95fb-456564e752d1 910 0 2026-04-25 00:00:01 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:86bbf7df67 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ip-172-31-31-110 whisker-86bbf7df67-bhhtx eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali5c989fdc60a [] [] }} ContainerID="c68d3d68f333f5c0a0cb1c01289d2419b26d31e6dddf157a2a55c79369353d4d" Namespace="calico-system" Pod="whisker-86bbf7df67-bhhtx" WorkloadEndpoint="ip--172--31--31--110-k8s-whisker--86bbf7df67--bhhtx-" Apr 25 00:00:02.899378 containerd[1974]: 2026-04-25 00:00:02.005 [INFO][4809] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="c68d3d68f333f5c0a0cb1c01289d2419b26d31e6dddf157a2a55c79369353d4d" Namespace="calico-system" Pod="whisker-86bbf7df67-bhhtx" WorkloadEndpoint="ip--172--31--31--110-k8s-whisker--86bbf7df67--bhhtx-eth0" Apr 25 00:00:02.899378 containerd[1974]: 2026-04-25 00:00:02.215 [INFO][4825] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="c68d3d68f333f5c0a0cb1c01289d2419b26d31e6dddf157a2a55c79369353d4d" 
HandleID="k8s-pod-network.c68d3d68f333f5c0a0cb1c01289d2419b26d31e6dddf157a2a55c79369353d4d" Workload="ip--172--31--31--110-k8s-whisker--86bbf7df67--bhhtx-eth0" Apr 25 00:00:02.899378 containerd[1974]: 2026-04-25 00:00:02.254 [INFO][4825] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="c68d3d68f333f5c0a0cb1c01289d2419b26d31e6dddf157a2a55c79369353d4d" HandleID="k8s-pod-network.c68d3d68f333f5c0a0cb1c01289d2419b26d31e6dddf157a2a55c79369353d4d" Workload="ip--172--31--31--110-k8s-whisker--86bbf7df67--bhhtx-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000123ea0), Attrs:map[string]string{"namespace":"calico-system", "node":"ip-172-31-31-110", "pod":"whisker-86bbf7df67-bhhtx", "timestamp":"2026-04-25 00:00:02.215920941 +0000 UTC"}, Hostname:"ip-172-31-31-110", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc0003fb340)} Apr 25 00:00:02.899378 containerd[1974]: 2026-04-25 00:00:02.254 [INFO][4825] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 25 00:00:02.899378 containerd[1974]: 2026-04-25 00:00:02.254 [INFO][4825] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Apr 25 00:00:02.899378 containerd[1974]: 2026-04-25 00:00:02.254 [INFO][4825] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-31-110' Apr 25 00:00:02.899378 containerd[1974]: 2026-04-25 00:00:02.279 [INFO][4825] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.c68d3d68f333f5c0a0cb1c01289d2419b26d31e6dddf157a2a55c79369353d4d" host="ip-172-31-31-110" Apr 25 00:00:02.899378 containerd[1974]: 2026-04-25 00:00:02.365 [INFO][4825] ipam/ipam.go 409: Looking up existing affinities for host host="ip-172-31-31-110" Apr 25 00:00:02.899378 containerd[1974]: 2026-04-25 00:00:02.391 [INFO][4825] ipam/ipam.go 526: Trying affinity for 192.168.121.64/26 host="ip-172-31-31-110" Apr 25 00:00:02.899378 containerd[1974]: 2026-04-25 00:00:02.400 [INFO][4825] ipam/ipam.go 160: Attempting to load block cidr=192.168.121.64/26 host="ip-172-31-31-110" Apr 25 00:00:02.899378 containerd[1974]: 2026-04-25 00:00:02.417 [INFO][4825] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.121.64/26 host="ip-172-31-31-110" Apr 25 00:00:02.899378 containerd[1974]: 2026-04-25 00:00:02.418 [INFO][4825] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.121.64/26 handle="k8s-pod-network.c68d3d68f333f5c0a0cb1c01289d2419b26d31e6dddf157a2a55c79369353d4d" host="ip-172-31-31-110" Apr 25 00:00:02.899378 containerd[1974]: 2026-04-25 00:00:02.427 [INFO][4825] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.c68d3d68f333f5c0a0cb1c01289d2419b26d31e6dddf157a2a55c79369353d4d Apr 25 00:00:02.899378 containerd[1974]: 2026-04-25 00:00:02.477 [INFO][4825] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.121.64/26 handle="k8s-pod-network.c68d3d68f333f5c0a0cb1c01289d2419b26d31e6dddf157a2a55c79369353d4d" host="ip-172-31-31-110" Apr 25 00:00:02.899378 containerd[1974]: 2026-04-25 00:00:02.499 [INFO][4825] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.121.65/26] block=192.168.121.64/26 
handle="k8s-pod-network.c68d3d68f333f5c0a0cb1c01289d2419b26d31e6dddf157a2a55c79369353d4d" host="ip-172-31-31-110" Apr 25 00:00:02.899378 containerd[1974]: 2026-04-25 00:00:02.500 [INFO][4825] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.121.65/26] handle="k8s-pod-network.c68d3d68f333f5c0a0cb1c01289d2419b26d31e6dddf157a2a55c79369353d4d" host="ip-172-31-31-110" Apr 25 00:00:02.899378 containerd[1974]: 2026-04-25 00:00:02.500 [INFO][4825] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 25 00:00:02.899378 containerd[1974]: 2026-04-25 00:00:02.500 [INFO][4825] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.121.65/26] IPv6=[] ContainerID="c68d3d68f333f5c0a0cb1c01289d2419b26d31e6dddf157a2a55c79369353d4d" HandleID="k8s-pod-network.c68d3d68f333f5c0a0cb1c01289d2419b26d31e6dddf157a2a55c79369353d4d" Workload="ip--172--31--31--110-k8s-whisker--86bbf7df67--bhhtx-eth0" Apr 25 00:00:02.916565 containerd[1974]: 2026-04-25 00:00:02.504 [INFO][4809] cni-plugin/k8s.go 418: Populated endpoint ContainerID="c68d3d68f333f5c0a0cb1c01289d2419b26d31e6dddf157a2a55c79369353d4d" Namespace="calico-system" Pod="whisker-86bbf7df67-bhhtx" WorkloadEndpoint="ip--172--31--31--110-k8s-whisker--86bbf7df67--bhhtx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--31--110-k8s-whisker--86bbf7df67--bhhtx-eth0", GenerateName:"whisker-86bbf7df67-", Namespace:"calico-system", SelfLink:"", UID:"eb91e8be-9b00-4d67-95fb-456564e752d1", ResourceVersion:"910", Generation:0, CreationTimestamp:time.Date(2026, time.April, 25, 0, 0, 1, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"86bbf7df67", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, 
Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-31-110", ContainerID:"", Pod:"whisker-86bbf7df67-bhhtx", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.121.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali5c989fdc60a", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 25 00:00:02.916565 containerd[1974]: 2026-04-25 00:00:02.505 [INFO][4809] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.121.65/32] ContainerID="c68d3d68f333f5c0a0cb1c01289d2419b26d31e6dddf157a2a55c79369353d4d" Namespace="calico-system" Pod="whisker-86bbf7df67-bhhtx" WorkloadEndpoint="ip--172--31--31--110-k8s-whisker--86bbf7df67--bhhtx-eth0" Apr 25 00:00:02.916565 containerd[1974]: 2026-04-25 00:00:02.505 [INFO][4809] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali5c989fdc60a ContainerID="c68d3d68f333f5c0a0cb1c01289d2419b26d31e6dddf157a2a55c79369353d4d" Namespace="calico-system" Pod="whisker-86bbf7df67-bhhtx" WorkloadEndpoint="ip--172--31--31--110-k8s-whisker--86bbf7df67--bhhtx-eth0" Apr 25 00:00:02.916565 containerd[1974]: 2026-04-25 00:00:02.651 [INFO][4809] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="c68d3d68f333f5c0a0cb1c01289d2419b26d31e6dddf157a2a55c79369353d4d" Namespace="calico-system" Pod="whisker-86bbf7df67-bhhtx" WorkloadEndpoint="ip--172--31--31--110-k8s-whisker--86bbf7df67--bhhtx-eth0" Apr 25 00:00:02.916565 containerd[1974]: 2026-04-25 00:00:02.684 [INFO][4809] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="c68d3d68f333f5c0a0cb1c01289d2419b26d31e6dddf157a2a55c79369353d4d" 
Namespace="calico-system" Pod="whisker-86bbf7df67-bhhtx" WorkloadEndpoint="ip--172--31--31--110-k8s-whisker--86bbf7df67--bhhtx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--31--110-k8s-whisker--86bbf7df67--bhhtx-eth0", GenerateName:"whisker-86bbf7df67-", Namespace:"calico-system", SelfLink:"", UID:"eb91e8be-9b00-4d67-95fb-456564e752d1", ResourceVersion:"910", Generation:0, CreationTimestamp:time.Date(2026, time.April, 25, 0, 0, 1, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"86bbf7df67", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-31-110", ContainerID:"c68d3d68f333f5c0a0cb1c01289d2419b26d31e6dddf157a2a55c79369353d4d", Pod:"whisker-86bbf7df67-bhhtx", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.121.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali5c989fdc60a", MAC:"2e:82:f4:fc:29:51", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 25 00:00:02.916565 containerd[1974]: 2026-04-25 00:00:02.804 [INFO][4809] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="c68d3d68f333f5c0a0cb1c01289d2419b26d31e6dddf157a2a55c79369353d4d" Namespace="calico-system" Pod="whisker-86bbf7df67-bhhtx" WorkloadEndpoint="ip--172--31--31--110-k8s-whisker--86bbf7df67--bhhtx-eth0" Apr 25 00:00:04.128999 
systemd-networkd[1892]: cali5c989fdc60a: Gained IPv6LL Apr 25 00:00:04.563134 containerd[1974]: time="2026-04-25T00:00:04.543269598Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Apr 25 00:00:04.563134 containerd[1974]: time="2026-04-25T00:00:04.562899206Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Apr 25 00:00:04.563134 containerd[1974]: time="2026-04-25T00:00:04.562929520Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 25 00:00:04.563134 containerd[1974]: time="2026-04-25T00:00:04.563061739Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 25 00:00:04.729944 systemd[1]: run-containerd-runc-k8s.io-c68d3d68f333f5c0a0cb1c01289d2419b26d31e6dddf157a2a55c79369353d4d-runc.RgrYBE.mount: Deactivated successfully. Apr 25 00:00:04.738064 systemd[1]: Started cri-containerd-c68d3d68f333f5c0a0cb1c01289d2419b26d31e6dddf157a2a55c79369353d4d.scope - libcontainer container c68d3d68f333f5c0a0cb1c01289d2419b26d31e6dddf157a2a55c79369353d4d. Apr 25 00:00:05.028329 systemd-networkd[1892]: vxlan.calico: Link UP Apr 25 00:00:05.028340 systemd-networkd[1892]: vxlan.calico: Gained carrier Apr 25 00:00:05.030522 (udev-worker)[4834]: Network interface NamePolicy= disabled on kernel command line. 
Apr 25 00:00:05.034058 containerd[1974]: time="2026-04-25T00:00:05.031420901Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-86bbf7df67-bhhtx,Uid:eb91e8be-9b00-4d67-95fb-456564e752d1,Namespace:calico-system,Attempt:0,} returns sandbox id \"c68d3d68f333f5c0a0cb1c01289d2419b26d31e6dddf157a2a55c79369353d4d\"" Apr 25 00:00:05.082913 containerd[1974]: time="2026-04-25T00:00:05.082589601Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.31.4\"" Apr 25 00:00:06.882904 systemd-networkd[1892]: vxlan.calico: Gained IPv6LL Apr 25 00:00:09.364726 ntpd[1945]: Listen normally on 8 vxlan.calico 192.168.121.64:123 Apr 25 00:00:09.364859 ntpd[1945]: Listen normally on 9 cali5c989fdc60a [fe80::ecee:eeff:feee:eeee%4]:123 Apr 25 00:00:09.368549 ntpd[1945]: 25 Apr 00:00:09 ntpd[1945]: Listen normally on 8 vxlan.calico 192.168.121.64:123 Apr 25 00:00:09.368549 ntpd[1945]: 25 Apr 00:00:09 ntpd[1945]: Listen normally on 9 cali5c989fdc60a [fe80::ecee:eeff:feee:eeee%4]:123 Apr 25 00:00:09.368549 ntpd[1945]: 25 Apr 00:00:09 ntpd[1945]: Listen normally on 10 vxlan.calico [fe80::64d2:2aff:feb5:1e4e%5]:123 Apr 25 00:00:09.364924 ntpd[1945]: Listen normally on 10 vxlan.calico [fe80::64d2:2aff:feb5:1e4e%5]:123 Apr 25 00:00:10.538606 containerd[1974]: time="2026-04-25T00:00:10.538558283Z" level=info msg="StopPodSandbox for \"bebeb009b41842948801598b7bcb38ad25d4878ba127c98bfdec429caa35f703\"" Apr 25 00:00:10.540812 containerd[1974]: time="2026-04-25T00:00:10.539558023Z" level=info msg="StopPodSandbox for \"a57e8fbabf45599afcad53283ca073fec4143ba9c8fd2c1081470b3c251d3dbe\"" Apr 25 00:00:10.790696 containerd[1974]: 2026-04-25 00:00:10.672 [INFO][4991] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="bebeb009b41842948801598b7bcb38ad25d4878ba127c98bfdec429caa35f703" Apr 25 00:00:10.790696 containerd[1974]: 2026-04-25 00:00:10.673 [INFO][4991] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. 
ContainerID="bebeb009b41842948801598b7bcb38ad25d4878ba127c98bfdec429caa35f703" iface="eth0" netns="/var/run/netns/cni-96308f8f-8eaf-701d-c38c-a6f7a20b033f" Apr 25 00:00:10.790696 containerd[1974]: 2026-04-25 00:00:10.673 [INFO][4991] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="bebeb009b41842948801598b7bcb38ad25d4878ba127c98bfdec429caa35f703" iface="eth0" netns="/var/run/netns/cni-96308f8f-8eaf-701d-c38c-a6f7a20b033f" Apr 25 00:00:10.790696 containerd[1974]: 2026-04-25 00:00:10.674 [INFO][4991] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="bebeb009b41842948801598b7bcb38ad25d4878ba127c98bfdec429caa35f703" iface="eth0" netns="/var/run/netns/cni-96308f8f-8eaf-701d-c38c-a6f7a20b033f" Apr 25 00:00:10.790696 containerd[1974]: 2026-04-25 00:00:10.674 [INFO][4991] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="bebeb009b41842948801598b7bcb38ad25d4878ba127c98bfdec429caa35f703" Apr 25 00:00:10.790696 containerd[1974]: 2026-04-25 00:00:10.674 [INFO][4991] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="bebeb009b41842948801598b7bcb38ad25d4878ba127c98bfdec429caa35f703" Apr 25 00:00:10.790696 containerd[1974]: 2026-04-25 00:00:10.766 [INFO][5014] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="bebeb009b41842948801598b7bcb38ad25d4878ba127c98bfdec429caa35f703" HandleID="k8s-pod-network.bebeb009b41842948801598b7bcb38ad25d4878ba127c98bfdec429caa35f703" Workload="ip--172--31--31--110-k8s-coredns--7d764666f9--q5wxg-eth0" Apr 25 00:00:10.790696 containerd[1974]: 2026-04-25 00:00:10.766 [INFO][5014] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 25 00:00:10.790696 containerd[1974]: 2026-04-25 00:00:10.766 [INFO][5014] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 25 00:00:10.790696 containerd[1974]: 2026-04-25 00:00:10.778 [WARNING][5014] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="bebeb009b41842948801598b7bcb38ad25d4878ba127c98bfdec429caa35f703" HandleID="k8s-pod-network.bebeb009b41842948801598b7bcb38ad25d4878ba127c98bfdec429caa35f703" Workload="ip--172--31--31--110-k8s-coredns--7d764666f9--q5wxg-eth0" Apr 25 00:00:10.790696 containerd[1974]: 2026-04-25 00:00:10.778 [INFO][5014] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="bebeb009b41842948801598b7bcb38ad25d4878ba127c98bfdec429caa35f703" HandleID="k8s-pod-network.bebeb009b41842948801598b7bcb38ad25d4878ba127c98bfdec429caa35f703" Workload="ip--172--31--31--110-k8s-coredns--7d764666f9--q5wxg-eth0" Apr 25 00:00:10.790696 containerd[1974]: 2026-04-25 00:00:10.782 [INFO][5014] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 25 00:00:10.790696 containerd[1974]: 2026-04-25 00:00:10.785 [INFO][4991] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="bebeb009b41842948801598b7bcb38ad25d4878ba127c98bfdec429caa35f703" Apr 25 00:00:10.790696 containerd[1974]: time="2026-04-25T00:00:10.790342576Z" level=info msg="TearDown network for sandbox \"bebeb009b41842948801598b7bcb38ad25d4878ba127c98bfdec429caa35f703\" successfully" Apr 25 00:00:10.790696 containerd[1974]: time="2026-04-25T00:00:10.790379045Z" level=info msg="StopPodSandbox for \"bebeb009b41842948801598b7bcb38ad25d4878ba127c98bfdec429caa35f703\" returns successfully" Apr 25 00:00:10.799321 systemd[1]: run-netns-cni\x2d96308f8f\x2d8eaf\x2d701d\x2dc38c\x2da6f7a20b033f.mount: Deactivated successfully. Apr 25 00:00:10.807619 containerd[1974]: 2026-04-25 00:00:10.660 [INFO][4992] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="a57e8fbabf45599afcad53283ca073fec4143ba9c8fd2c1081470b3c251d3dbe" Apr 25 00:00:10.807619 containerd[1974]: 2026-04-25 00:00:10.660 [INFO][4992] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. 
ContainerID="a57e8fbabf45599afcad53283ca073fec4143ba9c8fd2c1081470b3c251d3dbe" iface="eth0" netns="/var/run/netns/cni-fd71d1d6-271d-e4f7-57cf-96f5900e5a42" Apr 25 00:00:10.807619 containerd[1974]: 2026-04-25 00:00:10.661 [INFO][4992] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="a57e8fbabf45599afcad53283ca073fec4143ba9c8fd2c1081470b3c251d3dbe" iface="eth0" netns="/var/run/netns/cni-fd71d1d6-271d-e4f7-57cf-96f5900e5a42" Apr 25 00:00:10.807619 containerd[1974]: 2026-04-25 00:00:10.661 [INFO][4992] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="a57e8fbabf45599afcad53283ca073fec4143ba9c8fd2c1081470b3c251d3dbe" iface="eth0" netns="/var/run/netns/cni-fd71d1d6-271d-e4f7-57cf-96f5900e5a42" Apr 25 00:00:10.807619 containerd[1974]: 2026-04-25 00:00:10.661 [INFO][4992] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="a57e8fbabf45599afcad53283ca073fec4143ba9c8fd2c1081470b3c251d3dbe" Apr 25 00:00:10.807619 containerd[1974]: 2026-04-25 00:00:10.661 [INFO][4992] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="a57e8fbabf45599afcad53283ca073fec4143ba9c8fd2c1081470b3c251d3dbe" Apr 25 00:00:10.807619 containerd[1974]: 2026-04-25 00:00:10.769 [INFO][5011] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="a57e8fbabf45599afcad53283ca073fec4143ba9c8fd2c1081470b3c251d3dbe" HandleID="k8s-pod-network.a57e8fbabf45599afcad53283ca073fec4143ba9c8fd2c1081470b3c251d3dbe" Workload="ip--172--31--31--110-k8s-goldmane--9f7667bb8--vg777-eth0" Apr 25 00:00:10.807619 containerd[1974]: 2026-04-25 00:00:10.769 [INFO][5011] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 25 00:00:10.807619 containerd[1974]: 2026-04-25 00:00:10.781 [INFO][5011] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 25 00:00:10.807619 containerd[1974]: 2026-04-25 00:00:10.792 [WARNING][5011] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="a57e8fbabf45599afcad53283ca073fec4143ba9c8fd2c1081470b3c251d3dbe" HandleID="k8s-pod-network.a57e8fbabf45599afcad53283ca073fec4143ba9c8fd2c1081470b3c251d3dbe" Workload="ip--172--31--31--110-k8s-goldmane--9f7667bb8--vg777-eth0" Apr 25 00:00:10.807619 containerd[1974]: 2026-04-25 00:00:10.792 [INFO][5011] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="a57e8fbabf45599afcad53283ca073fec4143ba9c8fd2c1081470b3c251d3dbe" HandleID="k8s-pod-network.a57e8fbabf45599afcad53283ca073fec4143ba9c8fd2c1081470b3c251d3dbe" Workload="ip--172--31--31--110-k8s-goldmane--9f7667bb8--vg777-eth0" Apr 25 00:00:10.807619 containerd[1974]: 2026-04-25 00:00:10.798 [INFO][5011] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 25 00:00:10.807619 containerd[1974]: 2026-04-25 00:00:10.804 [INFO][4992] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="a57e8fbabf45599afcad53283ca073fec4143ba9c8fd2c1081470b3c251d3dbe" Apr 25 00:00:10.808404 containerd[1974]: time="2026-04-25T00:00:10.807753567Z" level=info msg="TearDown network for sandbox \"a57e8fbabf45599afcad53283ca073fec4143ba9c8fd2c1081470b3c251d3dbe\" successfully" Apr 25 00:00:10.808404 containerd[1974]: time="2026-04-25T00:00:10.807855570Z" level=info msg="StopPodSandbox for \"a57e8fbabf45599afcad53283ca073fec4143ba9c8fd2c1081470b3c251d3dbe\" returns successfully" Apr 25 00:00:10.814413 systemd[1]: run-netns-cni\x2dfd71d1d6\x2d271d\x2de4f7\x2d57cf\x2d96f5900e5a42.mount: Deactivated successfully. 
Apr 25 00:00:10.818197 containerd[1974]: time="2026-04-25T00:00:10.818153936Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7d764666f9-q5wxg,Uid:8259ae9f-09c1-45c5-a26a-9fb19e805b35,Namespace:kube-system,Attempt:1,}" Apr 25 00:00:10.822324 containerd[1974]: time="2026-04-25T00:00:10.822266644Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-9f7667bb8-vg777,Uid:b17085d6-14a5-4b5e-81f9-314340899d1e,Namespace:calico-system,Attempt:1,}" Apr 25 00:00:11.198915 systemd-networkd[1892]: cali0ccb425bc32: Link UP Apr 25 00:00:11.202265 systemd-networkd[1892]: cali0ccb425bc32: Gained carrier Apr 25 00:00:11.205479 (udev-worker)[5068]: Network interface NamePolicy= disabled on kernel command line. Apr 25 00:00:11.235982 containerd[1974]: 2026-04-25 00:00:11.023 [INFO][5026] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--31--110-k8s-goldmane--9f7667bb8--vg777-eth0 goldmane-9f7667bb8- calico-system b17085d6-14a5-4b5e-81f9-314340899d1e 938 0 2026-04-24 23:59:34 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:9f7667bb8 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ip-172-31-31-110 goldmane-9f7667bb8-vg777 eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali0ccb425bc32 [] [] }} ContainerID="8e6bb12e5b853269250e39c8c0eabd6aff9fbce20a201dd5ec424e48129a5c76" Namespace="calico-system" Pod="goldmane-9f7667bb8-vg777" WorkloadEndpoint="ip--172--31--31--110-k8s-goldmane--9f7667bb8--vg777-" Apr 25 00:00:11.235982 containerd[1974]: 2026-04-25 00:00:11.023 [INFO][5026] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="8e6bb12e5b853269250e39c8c0eabd6aff9fbce20a201dd5ec424e48129a5c76" Namespace="calico-system" Pod="goldmane-9f7667bb8-vg777" WorkloadEndpoint="ip--172--31--31--110-k8s-goldmane--9f7667bb8--vg777-eth0" 
Apr 25 00:00:11.235982 containerd[1974]: 2026-04-25 00:00:11.097 [INFO][5053] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="8e6bb12e5b853269250e39c8c0eabd6aff9fbce20a201dd5ec424e48129a5c76" HandleID="k8s-pod-network.8e6bb12e5b853269250e39c8c0eabd6aff9fbce20a201dd5ec424e48129a5c76" Workload="ip--172--31--31--110-k8s-goldmane--9f7667bb8--vg777-eth0" Apr 25 00:00:11.235982 containerd[1974]: 2026-04-25 00:00:11.111 [INFO][5053] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="8e6bb12e5b853269250e39c8c0eabd6aff9fbce20a201dd5ec424e48129a5c76" HandleID="k8s-pod-network.8e6bb12e5b853269250e39c8c0eabd6aff9fbce20a201dd5ec424e48129a5c76" Workload="ip--172--31--31--110-k8s-goldmane--9f7667bb8--vg777-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002ef8a0), Attrs:map[string]string{"namespace":"calico-system", "node":"ip-172-31-31-110", "pod":"goldmane-9f7667bb8-vg777", "timestamp":"2026-04-25 00:00:11.0972546 +0000 UTC"}, Hostname:"ip-172-31-31-110", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc000202f20)} Apr 25 00:00:11.235982 containerd[1974]: 2026-04-25 00:00:11.111 [INFO][5053] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 25 00:00:11.235982 containerd[1974]: 2026-04-25 00:00:11.111 [INFO][5053] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Apr 25 00:00:11.235982 containerd[1974]: 2026-04-25 00:00:11.111 [INFO][5053] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-31-110' Apr 25 00:00:11.235982 containerd[1974]: 2026-04-25 00:00:11.123 [INFO][5053] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.8e6bb12e5b853269250e39c8c0eabd6aff9fbce20a201dd5ec424e48129a5c76" host="ip-172-31-31-110" Apr 25 00:00:11.235982 containerd[1974]: 2026-04-25 00:00:11.138 [INFO][5053] ipam/ipam.go 409: Looking up existing affinities for host host="ip-172-31-31-110" Apr 25 00:00:11.235982 containerd[1974]: 2026-04-25 00:00:11.151 [INFO][5053] ipam/ipam.go 526: Trying affinity for 192.168.121.64/26 host="ip-172-31-31-110" Apr 25 00:00:11.235982 containerd[1974]: 2026-04-25 00:00:11.153 [INFO][5053] ipam/ipam.go 160: Attempting to load block cidr=192.168.121.64/26 host="ip-172-31-31-110" Apr 25 00:00:11.235982 containerd[1974]: 2026-04-25 00:00:11.157 [INFO][5053] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.121.64/26 host="ip-172-31-31-110" Apr 25 00:00:11.235982 containerd[1974]: 2026-04-25 00:00:11.157 [INFO][5053] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.121.64/26 handle="k8s-pod-network.8e6bb12e5b853269250e39c8c0eabd6aff9fbce20a201dd5ec424e48129a5c76" host="ip-172-31-31-110" Apr 25 00:00:11.235982 containerd[1974]: 2026-04-25 00:00:11.161 [INFO][5053] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.8e6bb12e5b853269250e39c8c0eabd6aff9fbce20a201dd5ec424e48129a5c76 Apr 25 00:00:11.235982 containerd[1974]: 2026-04-25 00:00:11.172 [INFO][5053] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.121.64/26 handle="k8s-pod-network.8e6bb12e5b853269250e39c8c0eabd6aff9fbce20a201dd5ec424e48129a5c76" host="ip-172-31-31-110" Apr 25 00:00:11.235982 containerd[1974]: 2026-04-25 00:00:11.185 [INFO][5053] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.121.66/26] block=192.168.121.64/26 
handle="k8s-pod-network.8e6bb12e5b853269250e39c8c0eabd6aff9fbce20a201dd5ec424e48129a5c76" host="ip-172-31-31-110" Apr 25 00:00:11.235982 containerd[1974]: 2026-04-25 00:00:11.186 [INFO][5053] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.121.66/26] handle="k8s-pod-network.8e6bb12e5b853269250e39c8c0eabd6aff9fbce20a201dd5ec424e48129a5c76" host="ip-172-31-31-110" Apr 25 00:00:11.235982 containerd[1974]: 2026-04-25 00:00:11.186 [INFO][5053] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 25 00:00:11.235982 containerd[1974]: 2026-04-25 00:00:11.187 [INFO][5053] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.121.66/26] IPv6=[] ContainerID="8e6bb12e5b853269250e39c8c0eabd6aff9fbce20a201dd5ec424e48129a5c76" HandleID="k8s-pod-network.8e6bb12e5b853269250e39c8c0eabd6aff9fbce20a201dd5ec424e48129a5c76" Workload="ip--172--31--31--110-k8s-goldmane--9f7667bb8--vg777-eth0" Apr 25 00:00:11.237995 containerd[1974]: 2026-04-25 00:00:11.193 [INFO][5026] cni-plugin/k8s.go 418: Populated endpoint ContainerID="8e6bb12e5b853269250e39c8c0eabd6aff9fbce20a201dd5ec424e48129a5c76" Namespace="calico-system" Pod="goldmane-9f7667bb8-vg777" WorkloadEndpoint="ip--172--31--31--110-k8s-goldmane--9f7667bb8--vg777-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--31--110-k8s-goldmane--9f7667bb8--vg777-eth0", GenerateName:"goldmane-9f7667bb8-", Namespace:"calico-system", SelfLink:"", UID:"b17085d6-14a5-4b5e-81f9-314340899d1e", ResourceVersion:"938", Generation:0, CreationTimestamp:time.Date(2026, time.April, 24, 23, 59, 34, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"9f7667bb8", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", 
"projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-31-110", ContainerID:"", Pod:"goldmane-9f7667bb8-vg777", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.121.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali0ccb425bc32", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 25 00:00:11.237995 containerd[1974]: 2026-04-25 00:00:11.193 [INFO][5026] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.121.66/32] ContainerID="8e6bb12e5b853269250e39c8c0eabd6aff9fbce20a201dd5ec424e48129a5c76" Namespace="calico-system" Pod="goldmane-9f7667bb8-vg777" WorkloadEndpoint="ip--172--31--31--110-k8s-goldmane--9f7667bb8--vg777-eth0" Apr 25 00:00:11.237995 containerd[1974]: 2026-04-25 00:00:11.193 [INFO][5026] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali0ccb425bc32 ContainerID="8e6bb12e5b853269250e39c8c0eabd6aff9fbce20a201dd5ec424e48129a5c76" Namespace="calico-system" Pod="goldmane-9f7667bb8-vg777" WorkloadEndpoint="ip--172--31--31--110-k8s-goldmane--9f7667bb8--vg777-eth0" Apr 25 00:00:11.237995 containerd[1974]: 2026-04-25 00:00:11.201 [INFO][5026] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="8e6bb12e5b853269250e39c8c0eabd6aff9fbce20a201dd5ec424e48129a5c76" Namespace="calico-system" Pod="goldmane-9f7667bb8-vg777" WorkloadEndpoint="ip--172--31--31--110-k8s-goldmane--9f7667bb8--vg777-eth0" Apr 25 00:00:11.237995 containerd[1974]: 2026-04-25 00:00:11.205 [INFO][5026] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint 
ContainerID="8e6bb12e5b853269250e39c8c0eabd6aff9fbce20a201dd5ec424e48129a5c76" Namespace="calico-system" Pod="goldmane-9f7667bb8-vg777" WorkloadEndpoint="ip--172--31--31--110-k8s-goldmane--9f7667bb8--vg777-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--31--110-k8s-goldmane--9f7667bb8--vg777-eth0", GenerateName:"goldmane-9f7667bb8-", Namespace:"calico-system", SelfLink:"", UID:"b17085d6-14a5-4b5e-81f9-314340899d1e", ResourceVersion:"938", Generation:0, CreationTimestamp:time.Date(2026, time.April, 24, 23, 59, 34, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"9f7667bb8", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-31-110", ContainerID:"8e6bb12e5b853269250e39c8c0eabd6aff9fbce20a201dd5ec424e48129a5c76", Pod:"goldmane-9f7667bb8-vg777", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.121.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali0ccb425bc32", MAC:"2e:89:c4:33:0a:13", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 25 00:00:11.237995 containerd[1974]: 2026-04-25 00:00:11.226 [INFO][5026] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="8e6bb12e5b853269250e39c8c0eabd6aff9fbce20a201dd5ec424e48129a5c76" Namespace="calico-system" Pod="goldmane-9f7667bb8-vg777" 
WorkloadEndpoint="ip--172--31--31--110-k8s-goldmane--9f7667bb8--vg777-eth0" Apr 25 00:00:11.351281 systemd-networkd[1892]: calib407ef0a420: Link UP Apr 25 00:00:11.354257 systemd-networkd[1892]: calib407ef0a420: Gained carrier Apr 25 00:00:11.394940 containerd[1974]: time="2026-04-25T00:00:11.394154795Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Apr 25 00:00:11.394940 containerd[1974]: time="2026-04-25T00:00:11.394230484Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Apr 25 00:00:11.394940 containerd[1974]: time="2026-04-25T00:00:11.394254047Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 25 00:00:11.394940 containerd[1974]: time="2026-04-25T00:00:11.394368675Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 25 00:00:11.396072 containerd[1974]: 2026-04-25 00:00:11.037 [INFO][5041] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--31--110-k8s-coredns--7d764666f9--q5wxg-eth0 coredns-7d764666f9- kube-system 8259ae9f-09c1-45c5-a26a-9fb19e805b35 939 0 2026-04-24 23:59:23 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7d764666f9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ip-172-31-31-110 coredns-7d764666f9-q5wxg eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calib407ef0a420 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 } {liveness-probe TCP 8080 0 } {readiness-probe TCP 8181 0 }] [] }} ContainerID="595666868a10545027a0ef04ca20123530e33581d543e85ef4f4821b2478f889" Namespace="kube-system" Pod="coredns-7d764666f9-q5wxg" 
WorkloadEndpoint="ip--172--31--31--110-k8s-coredns--7d764666f9--q5wxg-" Apr 25 00:00:11.396072 containerd[1974]: 2026-04-25 00:00:11.038 [INFO][5041] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="595666868a10545027a0ef04ca20123530e33581d543e85ef4f4821b2478f889" Namespace="kube-system" Pod="coredns-7d764666f9-q5wxg" WorkloadEndpoint="ip--172--31--31--110-k8s-coredns--7d764666f9--q5wxg-eth0" Apr 25 00:00:11.396072 containerd[1974]: 2026-04-25 00:00:11.134 [INFO][5058] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="595666868a10545027a0ef04ca20123530e33581d543e85ef4f4821b2478f889" HandleID="k8s-pod-network.595666868a10545027a0ef04ca20123530e33581d543e85ef4f4821b2478f889" Workload="ip--172--31--31--110-k8s-coredns--7d764666f9--q5wxg-eth0" Apr 25 00:00:11.396072 containerd[1974]: 2026-04-25 00:00:11.150 [INFO][5058] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="595666868a10545027a0ef04ca20123530e33581d543e85ef4f4821b2478f889" HandleID="k8s-pod-network.595666868a10545027a0ef04ca20123530e33581d543e85ef4f4821b2478f889" Workload="ip--172--31--31--110-k8s-coredns--7d764666f9--q5wxg-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004fea0), Attrs:map[string]string{"namespace":"kube-system", "node":"ip-172-31-31-110", "pod":"coredns-7d764666f9-q5wxg", "timestamp":"2026-04-25 00:00:11.134917366 +0000 UTC"}, Hostname:"ip-172-31-31-110", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc0003a5080)} Apr 25 00:00:11.396072 containerd[1974]: 2026-04-25 00:00:11.150 [INFO][5058] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 25 00:00:11.396072 containerd[1974]: 2026-04-25 00:00:11.186 [INFO][5058] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Apr 25 00:00:11.396072 containerd[1974]: 2026-04-25 00:00:11.186 [INFO][5058] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-31-110' Apr 25 00:00:11.396072 containerd[1974]: 2026-04-25 00:00:11.220 [INFO][5058] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.595666868a10545027a0ef04ca20123530e33581d543e85ef4f4821b2478f889" host="ip-172-31-31-110" Apr 25 00:00:11.396072 containerd[1974]: 2026-04-25 00:00:11.239 [INFO][5058] ipam/ipam.go 409: Looking up existing affinities for host host="ip-172-31-31-110" Apr 25 00:00:11.396072 containerd[1974]: 2026-04-25 00:00:11.257 [INFO][5058] ipam/ipam.go 526: Trying affinity for 192.168.121.64/26 host="ip-172-31-31-110" Apr 25 00:00:11.396072 containerd[1974]: 2026-04-25 00:00:11.262 [INFO][5058] ipam/ipam.go 160: Attempting to load block cidr=192.168.121.64/26 host="ip-172-31-31-110" Apr 25 00:00:11.396072 containerd[1974]: 2026-04-25 00:00:11.268 [INFO][5058] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.121.64/26 host="ip-172-31-31-110" Apr 25 00:00:11.396072 containerd[1974]: 2026-04-25 00:00:11.268 [INFO][5058] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.121.64/26 handle="k8s-pod-network.595666868a10545027a0ef04ca20123530e33581d543e85ef4f4821b2478f889" host="ip-172-31-31-110" Apr 25 00:00:11.396072 containerd[1974]: 2026-04-25 00:00:11.276 [INFO][5058] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.595666868a10545027a0ef04ca20123530e33581d543e85ef4f4821b2478f889 Apr 25 00:00:11.396072 containerd[1974]: 2026-04-25 00:00:11.298 [INFO][5058] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.121.64/26 handle="k8s-pod-network.595666868a10545027a0ef04ca20123530e33581d543e85ef4f4821b2478f889" host="ip-172-31-31-110" Apr 25 00:00:11.396072 containerd[1974]: 2026-04-25 00:00:11.312 [INFO][5058] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.121.67/26] block=192.168.121.64/26 
handle="k8s-pod-network.595666868a10545027a0ef04ca20123530e33581d543e85ef4f4821b2478f889" host="ip-172-31-31-110" Apr 25 00:00:11.396072 containerd[1974]: 2026-04-25 00:00:11.312 [INFO][5058] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.121.67/26] handle="k8s-pod-network.595666868a10545027a0ef04ca20123530e33581d543e85ef4f4821b2478f889" host="ip-172-31-31-110" Apr 25 00:00:11.396072 containerd[1974]: 2026-04-25 00:00:11.312 [INFO][5058] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 25 00:00:11.396072 containerd[1974]: 2026-04-25 00:00:11.312 [INFO][5058] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.121.67/26] IPv6=[] ContainerID="595666868a10545027a0ef04ca20123530e33581d543e85ef4f4821b2478f889" HandleID="k8s-pod-network.595666868a10545027a0ef04ca20123530e33581d543e85ef4f4821b2478f889" Workload="ip--172--31--31--110-k8s-coredns--7d764666f9--q5wxg-eth0" Apr 25 00:00:11.398061 containerd[1974]: 2026-04-25 00:00:11.325 [INFO][5041] cni-plugin/k8s.go 418: Populated endpoint ContainerID="595666868a10545027a0ef04ca20123530e33581d543e85ef4f4821b2478f889" Namespace="kube-system" Pod="coredns-7d764666f9-q5wxg" WorkloadEndpoint="ip--172--31--31--110-k8s-coredns--7d764666f9--q5wxg-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--31--110-k8s-coredns--7d764666f9--q5wxg-eth0", GenerateName:"coredns-7d764666f9-", Namespace:"kube-system", SelfLink:"", UID:"8259ae9f-09c1-45c5-a26a-9fb19e805b35", ResourceVersion:"939", Generation:0, CreationTimestamp:time.Date(2026, time.April, 24, 23, 59, 23, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7d764666f9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), 
OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-31-110", ContainerID:"", Pod:"coredns-7d764666f9-q5wxg", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.121.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calib407ef0a420", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 25 00:00:11.398061 containerd[1974]: 2026-04-25 00:00:11.325 [INFO][5041] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.121.67/32] ContainerID="595666868a10545027a0ef04ca20123530e33581d543e85ef4f4821b2478f889" Namespace="kube-system" Pod="coredns-7d764666f9-q5wxg" WorkloadEndpoint="ip--172--31--31--110-k8s-coredns--7d764666f9--q5wxg-eth0" Apr 25 00:00:11.398061 containerd[1974]: 2026-04-25 00:00:11.325 [INFO][5041] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calib407ef0a420 ContainerID="595666868a10545027a0ef04ca20123530e33581d543e85ef4f4821b2478f889" Namespace="kube-system" Pod="coredns-7d764666f9-q5wxg" 
WorkloadEndpoint="ip--172--31--31--110-k8s-coredns--7d764666f9--q5wxg-eth0" Apr 25 00:00:11.398061 containerd[1974]: 2026-04-25 00:00:11.356 [INFO][5041] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="595666868a10545027a0ef04ca20123530e33581d543e85ef4f4821b2478f889" Namespace="kube-system" Pod="coredns-7d764666f9-q5wxg" WorkloadEndpoint="ip--172--31--31--110-k8s-coredns--7d764666f9--q5wxg-eth0" Apr 25 00:00:11.398061 containerd[1974]: 2026-04-25 00:00:11.358 [INFO][5041] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="595666868a10545027a0ef04ca20123530e33581d543e85ef4f4821b2478f889" Namespace="kube-system" Pod="coredns-7d764666f9-q5wxg" WorkloadEndpoint="ip--172--31--31--110-k8s-coredns--7d764666f9--q5wxg-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--31--110-k8s-coredns--7d764666f9--q5wxg-eth0", GenerateName:"coredns-7d764666f9-", Namespace:"kube-system", SelfLink:"", UID:"8259ae9f-09c1-45c5-a26a-9fb19e805b35", ResourceVersion:"939", Generation:0, CreationTimestamp:time.Date(2026, time.April, 24, 23, 59, 23, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7d764666f9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-31-110", ContainerID:"595666868a10545027a0ef04ca20123530e33581d543e85ef4f4821b2478f889", Pod:"coredns-7d764666f9-q5wxg", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.121.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", 
IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calib407ef0a420", MAC:"2a:a9:63:a1:8a:15", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 25 00:00:11.398061 containerd[1974]: 2026-04-25 00:00:11.386 [INFO][5041] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="595666868a10545027a0ef04ca20123530e33581d543e85ef4f4821b2478f889" Namespace="kube-system" Pod="coredns-7d764666f9-q5wxg" WorkloadEndpoint="ip--172--31--31--110-k8s-coredns--7d764666f9--q5wxg-eth0" Apr 25 00:00:11.462990 systemd[1]: Started cri-containerd-8e6bb12e5b853269250e39c8c0eabd6aff9fbce20a201dd5ec424e48129a5c76.scope - libcontainer container 8e6bb12e5b853269250e39c8c0eabd6aff9fbce20a201dd5ec424e48129a5c76. Apr 25 00:00:11.501398 containerd[1974]: time="2026-04-25T00:00:11.501284919Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Apr 25 00:00:11.501566 containerd[1974]: time="2026-04-25T00:00:11.501368975Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Apr 25 00:00:11.501566 containerd[1974]: time="2026-04-25T00:00:11.501390748Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 25 00:00:11.501566 containerd[1974]: time="2026-04-25T00:00:11.501506127Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 25 00:00:11.527405 containerd[1974]: time="2026-04-25T00:00:11.527284941Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.31.4: active requests=0, bytes read=6039889" Apr 25 00:00:11.551048 systemd[1]: Started cri-containerd-595666868a10545027a0ef04ca20123530e33581d543e85ef4f4821b2478f889.scope - libcontainer container 595666868a10545027a0ef04ca20123530e33581d543e85ef4f4821b2478f889. Apr 25 00:00:11.560829 containerd[1974]: time="2026-04-25T00:00:11.559149830Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 25 00:00:11.560829 containerd[1974]: time="2026-04-25T00:00:11.559387234Z" level=info msg="StopPodSandbox for \"2ce3b1122d54a7daa4a347d9e605f0fec8802b4c393fbb7e2fa28d979eca17ef\"" Apr 25 00:00:11.561669 containerd[1974]: time="2026-04-25T00:00:11.561640327Z" level=info msg="StopPodSandbox for \"2ee4d775ce5c79150773d1ce80c8df3b1c5c39e88a99f47d9ced5460c65af0d9\"" Apr 25 00:00:11.574370 containerd[1974]: time="2026-04-25T00:00:11.572647748Z" level=info msg="ImageCreate event name:\"sha256:c02b0051502f3aa7f0815d838ea93b53dfb6bd13f185d229260e08200daf7cf7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 25 00:00:11.576561 containerd[1974]: time="2026-04-25T00:00:11.576340944Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.31.4\" with image id \"sha256:c02b0051502f3aa7f0815d838ea93b53dfb6bd13f185d229260e08200daf7cf7\", repo tag 
\"ghcr.io/flatcar/calico/whisker:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:9690cd395efad501f2e0c40ce4969d87b736ae2e5ed454644e7b0fd8f756bfbc\", size \"7595926\" in 6.49367254s" Apr 25 00:00:11.583150 containerd[1974]: time="2026-04-25T00:00:11.583097509Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.31.4\" returns image reference \"sha256:c02b0051502f3aa7f0815d838ea93b53dfb6bd13f185d229260e08200daf7cf7\"" Apr 25 00:00:11.583982 containerd[1974]: time="2026-04-25T00:00:11.577235530Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:9690cd395efad501f2e0c40ce4969d87b736ae2e5ed454644e7b0fd8f756bfbc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 25 00:00:11.605635 containerd[1974]: time="2026-04-25T00:00:11.605481028Z" level=info msg="CreateContainer within sandbox \"c68d3d68f333f5c0a0cb1c01289d2419b26d31e6dddf157a2a55c79369353d4d\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Apr 25 00:00:11.637815 containerd[1974]: time="2026-04-25T00:00:11.637753938Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-9f7667bb8-vg777,Uid:b17085d6-14a5-4b5e-81f9-314340899d1e,Namespace:calico-system,Attempt:1,} returns sandbox id \"8e6bb12e5b853269250e39c8c0eabd6aff9fbce20a201dd5ec424e48129a5c76\"" Apr 25 00:00:11.643382 containerd[1974]: time="2026-04-25T00:00:11.643138593Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.31.4\"" Apr 25 00:00:11.655669 containerd[1974]: time="2026-04-25T00:00:11.655625218Z" level=info msg="CreateContainer within sandbox \"c68d3d68f333f5c0a0cb1c01289d2419b26d31e6dddf157a2a55c79369353d4d\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"1a852a2881083c9f27d94e50fdadfc69f2e2fb638ab53ee6c4a0ab2cdec5da0e\"" Apr 25 00:00:11.659051 containerd[1974]: time="2026-04-25T00:00:11.658457936Z" level=info msg="StartContainer for \"1a852a2881083c9f27d94e50fdadfc69f2e2fb638ab53ee6c4a0ab2cdec5da0e\"" Apr 25 
00:00:11.703484 containerd[1974]: time="2026-04-25T00:00:11.700255284Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7d764666f9-q5wxg,Uid:8259ae9f-09c1-45c5-a26a-9fb19e805b35,Namespace:kube-system,Attempt:1,} returns sandbox id \"595666868a10545027a0ef04ca20123530e33581d543e85ef4f4821b2478f889\"" Apr 25 00:00:11.710565 containerd[1974]: time="2026-04-25T00:00:11.710513519Z" level=info msg="CreateContainer within sandbox \"595666868a10545027a0ef04ca20123530e33581d543e85ef4f4821b2478f889\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Apr 25 00:00:11.755205 systemd[1]: Started cri-containerd-1a852a2881083c9f27d94e50fdadfc69f2e2fb638ab53ee6c4a0ab2cdec5da0e.scope - libcontainer container 1a852a2881083c9f27d94e50fdadfc69f2e2fb638ab53ee6c4a0ab2cdec5da0e. Apr 25 00:00:11.787088 containerd[1974]: time="2026-04-25T00:00:11.787032425Z" level=info msg="CreateContainer within sandbox \"595666868a10545027a0ef04ca20123530e33581d543e85ef4f4821b2478f889\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"5f8903a457712ab9e791f00cc6266f2fb49e4370a96fff8fd92d2ebb8f9283ec\"" Apr 25 00:00:11.789673 containerd[1974]: time="2026-04-25T00:00:11.789638681Z" level=info msg="StartContainer for \"5f8903a457712ab9e791f00cc6266f2fb49e4370a96fff8fd92d2ebb8f9283ec\"" Apr 25 00:00:11.894206 containerd[1974]: 2026-04-25 00:00:11.763 [INFO][5199] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="2ce3b1122d54a7daa4a347d9e605f0fec8802b4c393fbb7e2fa28d979eca17ef" Apr 25 00:00:11.894206 containerd[1974]: 2026-04-25 00:00:11.764 [INFO][5199] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="2ce3b1122d54a7daa4a347d9e605f0fec8802b4c393fbb7e2fa28d979eca17ef" iface="eth0" netns="/var/run/netns/cni-5f04566f-2c06-4e24-69eb-bc795e03ea16" Apr 25 00:00:11.894206 containerd[1974]: 2026-04-25 00:00:11.764 [INFO][5199] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. 
ContainerID="2ce3b1122d54a7daa4a347d9e605f0fec8802b4c393fbb7e2fa28d979eca17ef" iface="eth0" netns="/var/run/netns/cni-5f04566f-2c06-4e24-69eb-bc795e03ea16" Apr 25 00:00:11.894206 containerd[1974]: 2026-04-25 00:00:11.765 [INFO][5199] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="2ce3b1122d54a7daa4a347d9e605f0fec8802b4c393fbb7e2fa28d979eca17ef" iface="eth0" netns="/var/run/netns/cni-5f04566f-2c06-4e24-69eb-bc795e03ea16" Apr 25 00:00:11.894206 containerd[1974]: 2026-04-25 00:00:11.765 [INFO][5199] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="2ce3b1122d54a7daa4a347d9e605f0fec8802b4c393fbb7e2fa28d979eca17ef" Apr 25 00:00:11.894206 containerd[1974]: 2026-04-25 00:00:11.765 [INFO][5199] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="2ce3b1122d54a7daa4a347d9e605f0fec8802b4c393fbb7e2fa28d979eca17ef" Apr 25 00:00:11.894206 containerd[1974]: 2026-04-25 00:00:11.854 [INFO][5242] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="2ce3b1122d54a7daa4a347d9e605f0fec8802b4c393fbb7e2fa28d979eca17ef" HandleID="k8s-pod-network.2ce3b1122d54a7daa4a347d9e605f0fec8802b4c393fbb7e2fa28d979eca17ef" Workload="ip--172--31--31--110-k8s-csi--node--driver--wllpb-eth0" Apr 25 00:00:11.894206 containerd[1974]: 2026-04-25 00:00:11.855 [INFO][5242] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 25 00:00:11.894206 containerd[1974]: 2026-04-25 00:00:11.855 [INFO][5242] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 25 00:00:11.894206 containerd[1974]: 2026-04-25 00:00:11.877 [WARNING][5242] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="2ce3b1122d54a7daa4a347d9e605f0fec8802b4c393fbb7e2fa28d979eca17ef" HandleID="k8s-pod-network.2ce3b1122d54a7daa4a347d9e605f0fec8802b4c393fbb7e2fa28d979eca17ef" Workload="ip--172--31--31--110-k8s-csi--node--driver--wllpb-eth0" Apr 25 00:00:11.894206 containerd[1974]: 2026-04-25 00:00:11.877 [INFO][5242] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="2ce3b1122d54a7daa4a347d9e605f0fec8802b4c393fbb7e2fa28d979eca17ef" HandleID="k8s-pod-network.2ce3b1122d54a7daa4a347d9e605f0fec8802b4c393fbb7e2fa28d979eca17ef" Workload="ip--172--31--31--110-k8s-csi--node--driver--wllpb-eth0" Apr 25 00:00:11.894206 containerd[1974]: 2026-04-25 00:00:11.880 [INFO][5242] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 25 00:00:11.894206 containerd[1974]: 2026-04-25 00:00:11.884 [INFO][5199] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="2ce3b1122d54a7daa4a347d9e605f0fec8802b4c393fbb7e2fa28d979eca17ef" Apr 25 00:00:11.906492 systemd[1]: run-netns-cni\x2d5f04566f\x2d2c06\x2d4e24\x2d69eb\x2dbc795e03ea16.mount: Deactivated successfully. Apr 25 00:00:11.919100 containerd[1974]: time="2026-04-25T00:00:11.916316977Z" level=info msg="StartContainer for \"1a852a2881083c9f27d94e50fdadfc69f2e2fb638ab53ee6c4a0ab2cdec5da0e\" returns successfully" Apr 25 00:00:11.921674 containerd[1974]: time="2026-04-25T00:00:11.921629487Z" level=info msg="TearDown network for sandbox \"2ce3b1122d54a7daa4a347d9e605f0fec8802b4c393fbb7e2fa28d979eca17ef\" successfully" Apr 25 00:00:11.921674 containerd[1974]: time="2026-04-25T00:00:11.921670126Z" level=info msg="StopPodSandbox for \"2ce3b1122d54a7daa4a347d9e605f0fec8802b4c393fbb7e2fa28d979eca17ef\" returns successfully" Apr 25 00:00:11.923064 systemd[1]: Started cri-containerd-5f8903a457712ab9e791f00cc6266f2fb49e4370a96fff8fd92d2ebb8f9283ec.scope - libcontainer container 5f8903a457712ab9e791f00cc6266f2fb49e4370a96fff8fd92d2ebb8f9283ec. 
Apr 25 00:00:11.928245 containerd[1974]: time="2026-04-25T00:00:11.928207712Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-wllpb,Uid:fafabee1-df27-491d-a48c-611faa0cd932,Namespace:calico-system,Attempt:1,}" Apr 25 00:00:11.939135 containerd[1974]: 2026-04-25 00:00:11.763 [INFO][5192] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="2ee4d775ce5c79150773d1ce80c8df3b1c5c39e88a99f47d9ced5460c65af0d9" Apr 25 00:00:11.939135 containerd[1974]: 2026-04-25 00:00:11.764 [INFO][5192] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="2ee4d775ce5c79150773d1ce80c8df3b1c5c39e88a99f47d9ced5460c65af0d9" iface="eth0" netns="/var/run/netns/cni-d741d7e3-dbce-24d2-df7f-9992acf255fc" Apr 25 00:00:11.939135 containerd[1974]: 2026-04-25 00:00:11.766 [INFO][5192] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="2ee4d775ce5c79150773d1ce80c8df3b1c5c39e88a99f47d9ced5460c65af0d9" iface="eth0" netns="/var/run/netns/cni-d741d7e3-dbce-24d2-df7f-9992acf255fc" Apr 25 00:00:11.939135 containerd[1974]: 2026-04-25 00:00:11.767 [INFO][5192] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="2ee4d775ce5c79150773d1ce80c8df3b1c5c39e88a99f47d9ced5460c65af0d9" iface="eth0" netns="/var/run/netns/cni-d741d7e3-dbce-24d2-df7f-9992acf255fc" Apr 25 00:00:11.939135 containerd[1974]: 2026-04-25 00:00:11.768 [INFO][5192] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="2ee4d775ce5c79150773d1ce80c8df3b1c5c39e88a99f47d9ced5460c65af0d9" Apr 25 00:00:11.939135 containerd[1974]: 2026-04-25 00:00:11.768 [INFO][5192] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="2ee4d775ce5c79150773d1ce80c8df3b1c5c39e88a99f47d9ced5460c65af0d9" Apr 25 00:00:11.939135 containerd[1974]: 2026-04-25 00:00:11.878 [INFO][5247] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="2ee4d775ce5c79150773d1ce80c8df3b1c5c39e88a99f47d9ced5460c65af0d9" HandleID="k8s-pod-network.2ee4d775ce5c79150773d1ce80c8df3b1c5c39e88a99f47d9ced5460c65af0d9" Workload="ip--172--31--31--110-k8s-coredns--7d764666f9--zk545-eth0" Apr 25 00:00:11.939135 containerd[1974]: 2026-04-25 00:00:11.879 [INFO][5247] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 25 00:00:11.939135 containerd[1974]: 2026-04-25 00:00:11.882 [INFO][5247] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 25 00:00:11.939135 containerd[1974]: 2026-04-25 00:00:11.922 [WARNING][5247] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="2ee4d775ce5c79150773d1ce80c8df3b1c5c39e88a99f47d9ced5460c65af0d9" HandleID="k8s-pod-network.2ee4d775ce5c79150773d1ce80c8df3b1c5c39e88a99f47d9ced5460c65af0d9" Workload="ip--172--31--31--110-k8s-coredns--7d764666f9--zk545-eth0" Apr 25 00:00:11.939135 containerd[1974]: 2026-04-25 00:00:11.922 [INFO][5247] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="2ee4d775ce5c79150773d1ce80c8df3b1c5c39e88a99f47d9ced5460c65af0d9" HandleID="k8s-pod-network.2ee4d775ce5c79150773d1ce80c8df3b1c5c39e88a99f47d9ced5460c65af0d9" Workload="ip--172--31--31--110-k8s-coredns--7d764666f9--zk545-eth0" Apr 25 00:00:11.939135 containerd[1974]: 2026-04-25 00:00:11.928 [INFO][5247] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 25 00:00:11.939135 containerd[1974]: 2026-04-25 00:00:11.935 [INFO][5192] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="2ee4d775ce5c79150773d1ce80c8df3b1c5c39e88a99f47d9ced5460c65af0d9" Apr 25 00:00:11.941102 containerd[1974]: time="2026-04-25T00:00:11.940646622Z" level=info msg="TearDown network for sandbox \"2ee4d775ce5c79150773d1ce80c8df3b1c5c39e88a99f47d9ced5460c65af0d9\" successfully" Apr 25 00:00:11.941102 containerd[1974]: time="2026-04-25T00:00:11.940690724Z" level=info msg="StopPodSandbox for \"2ee4d775ce5c79150773d1ce80c8df3b1c5c39e88a99f47d9ced5460c65af0d9\" returns successfully" Apr 25 00:00:11.948746 systemd[1]: run-netns-cni\x2dd741d7e3\x2ddbce\x2d24d2\x2ddf7f\x2d9992acf255fc.mount: Deactivated successfully. 
Apr 25 00:00:11.955829 containerd[1974]: time="2026-04-25T00:00:11.955539068Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7d764666f9-zk545,Uid:651a4b24-dccb-49b4-b5f3-a4291d6e49f0,Namespace:kube-system,Attempt:1,}" Apr 25 00:00:12.036143 containerd[1974]: time="2026-04-25T00:00:12.036028334Z" level=info msg="StartContainer for \"5f8903a457712ab9e791f00cc6266f2fb49e4370a96fff8fd92d2ebb8f9283ec\" returns successfully" Apr 25 00:00:12.217644 systemd-networkd[1892]: cali7235609975d: Link UP Apr 25 00:00:12.219320 systemd-networkd[1892]: cali7235609975d: Gained carrier Apr 25 00:00:12.253514 kubelet[3188]: I0425 00:00:12.253423 3188 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="kube-system/coredns-7d764666f9-q5wxg" podStartSLOduration=49.25340178 podStartE2EDuration="49.25340178s" podCreationTimestamp="2026-04-24 23:59:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-25 00:00:12.093667889 +0000 UTC m=+54.762796707" watchObservedRunningTime="2026-04-25 00:00:12.25340178 +0000 UTC m=+54.922530522" Apr 25 00:00:12.257307 containerd[1974]: 2026-04-25 00:00:12.033 [INFO][5290] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--31--110-k8s-csi--node--driver--wllpb-eth0 csi-node-driver- calico-system fafabee1-df27-491d-a48c-611faa0cd932 954 0 2026-04-24 23:59:35 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:589b8b8d94 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ip-172-31-31-110 csi-node-driver-wllpb eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali7235609975d [] [] }} 
ContainerID="24e481870027e26f76e7f978298c1ee774e94009de6f1cb58cda6ee685d34567" Namespace="calico-system" Pod="csi-node-driver-wllpb" WorkloadEndpoint="ip--172--31--31--110-k8s-csi--node--driver--wllpb-" Apr 25 00:00:12.257307 containerd[1974]: 2026-04-25 00:00:12.033 [INFO][5290] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="24e481870027e26f76e7f978298c1ee774e94009de6f1cb58cda6ee685d34567" Namespace="calico-system" Pod="csi-node-driver-wllpb" WorkloadEndpoint="ip--172--31--31--110-k8s-csi--node--driver--wllpb-eth0" Apr 25 00:00:12.257307 containerd[1974]: 2026-04-25 00:00:12.142 [INFO][5322] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="24e481870027e26f76e7f978298c1ee774e94009de6f1cb58cda6ee685d34567" HandleID="k8s-pod-network.24e481870027e26f76e7f978298c1ee774e94009de6f1cb58cda6ee685d34567" Workload="ip--172--31--31--110-k8s-csi--node--driver--wllpb-eth0" Apr 25 00:00:12.257307 containerd[1974]: 2026-04-25 00:00:12.155 [INFO][5322] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="24e481870027e26f76e7f978298c1ee774e94009de6f1cb58cda6ee685d34567" HandleID="k8s-pod-network.24e481870027e26f76e7f978298c1ee774e94009de6f1cb58cda6ee685d34567" Workload="ip--172--31--31--110-k8s-csi--node--driver--wllpb-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003b94b0), Attrs:map[string]string{"namespace":"calico-system", "node":"ip-172-31-31-110", "pod":"csi-node-driver-wllpb", "timestamp":"2026-04-25 00:00:12.142897263 +0000 UTC"}, Hostname:"ip-172-31-31-110", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc0004c4000)} Apr 25 00:00:12.257307 containerd[1974]: 2026-04-25 00:00:12.155 [INFO][5322] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. 
Apr 25 00:00:12.257307 containerd[1974]: 2026-04-25 00:00:12.155 [INFO][5322] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 25 00:00:12.257307 containerd[1974]: 2026-04-25 00:00:12.155 [INFO][5322] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-31-110' Apr 25 00:00:12.257307 containerd[1974]: 2026-04-25 00:00:12.158 [INFO][5322] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.24e481870027e26f76e7f978298c1ee774e94009de6f1cb58cda6ee685d34567" host="ip-172-31-31-110" Apr 25 00:00:12.257307 containerd[1974]: 2026-04-25 00:00:12.164 [INFO][5322] ipam/ipam.go 409: Looking up existing affinities for host host="ip-172-31-31-110" Apr 25 00:00:12.257307 containerd[1974]: 2026-04-25 00:00:12.171 [INFO][5322] ipam/ipam.go 526: Trying affinity for 192.168.121.64/26 host="ip-172-31-31-110" Apr 25 00:00:12.257307 containerd[1974]: 2026-04-25 00:00:12.175 [INFO][5322] ipam/ipam.go 160: Attempting to load block cidr=192.168.121.64/26 host="ip-172-31-31-110" Apr 25 00:00:12.257307 containerd[1974]: 2026-04-25 00:00:12.182 [INFO][5322] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.121.64/26 host="ip-172-31-31-110" Apr 25 00:00:12.257307 containerd[1974]: 2026-04-25 00:00:12.183 [INFO][5322] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.121.64/26 handle="k8s-pod-network.24e481870027e26f76e7f978298c1ee774e94009de6f1cb58cda6ee685d34567" host="ip-172-31-31-110" Apr 25 00:00:12.257307 containerd[1974]: 2026-04-25 00:00:12.185 [INFO][5322] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.24e481870027e26f76e7f978298c1ee774e94009de6f1cb58cda6ee685d34567 Apr 25 00:00:12.257307 containerd[1974]: 2026-04-25 00:00:12.194 [INFO][5322] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.121.64/26 handle="k8s-pod-network.24e481870027e26f76e7f978298c1ee774e94009de6f1cb58cda6ee685d34567" host="ip-172-31-31-110" Apr 25 00:00:12.257307 
containerd[1974]: 2026-04-25 00:00:12.205 [INFO][5322] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.121.68/26] block=192.168.121.64/26 handle="k8s-pod-network.24e481870027e26f76e7f978298c1ee774e94009de6f1cb58cda6ee685d34567" host="ip-172-31-31-110" Apr 25 00:00:12.257307 containerd[1974]: 2026-04-25 00:00:12.205 [INFO][5322] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.121.68/26] handle="k8s-pod-network.24e481870027e26f76e7f978298c1ee774e94009de6f1cb58cda6ee685d34567" host="ip-172-31-31-110" Apr 25 00:00:12.257307 containerd[1974]: 2026-04-25 00:00:12.205 [INFO][5322] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 25 00:00:12.257307 containerd[1974]: 2026-04-25 00:00:12.205 [INFO][5322] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.121.68/26] IPv6=[] ContainerID="24e481870027e26f76e7f978298c1ee774e94009de6f1cb58cda6ee685d34567" HandleID="k8s-pod-network.24e481870027e26f76e7f978298c1ee774e94009de6f1cb58cda6ee685d34567" Workload="ip--172--31--31--110-k8s-csi--node--driver--wllpb-eth0" Apr 25 00:00:12.259563 containerd[1974]: 2026-04-25 00:00:12.209 [INFO][5290] cni-plugin/k8s.go 418: Populated endpoint ContainerID="24e481870027e26f76e7f978298c1ee774e94009de6f1cb58cda6ee685d34567" Namespace="calico-system" Pod="csi-node-driver-wllpb" WorkloadEndpoint="ip--172--31--31--110-k8s-csi--node--driver--wllpb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--31--110-k8s-csi--node--driver--wllpb-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"fafabee1-df27-491d-a48c-611faa0cd932", ResourceVersion:"954", Generation:0, CreationTimestamp:time.Date(2026, time.April, 24, 23, 59, 35, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"589b8b8d94", 
"k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-31-110", ContainerID:"", Pod:"csi-node-driver-wllpb", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.121.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali7235609975d", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 25 00:00:12.259563 containerd[1974]: 2026-04-25 00:00:12.210 [INFO][5290] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.121.68/32] ContainerID="24e481870027e26f76e7f978298c1ee774e94009de6f1cb58cda6ee685d34567" Namespace="calico-system" Pod="csi-node-driver-wllpb" WorkloadEndpoint="ip--172--31--31--110-k8s-csi--node--driver--wllpb-eth0" Apr 25 00:00:12.259563 containerd[1974]: 2026-04-25 00:00:12.210 [INFO][5290] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali7235609975d ContainerID="24e481870027e26f76e7f978298c1ee774e94009de6f1cb58cda6ee685d34567" Namespace="calico-system" Pod="csi-node-driver-wllpb" WorkloadEndpoint="ip--172--31--31--110-k8s-csi--node--driver--wllpb-eth0" Apr 25 00:00:12.259563 containerd[1974]: 2026-04-25 00:00:12.225 [INFO][5290] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="24e481870027e26f76e7f978298c1ee774e94009de6f1cb58cda6ee685d34567" Namespace="calico-system" Pod="csi-node-driver-wllpb" WorkloadEndpoint="ip--172--31--31--110-k8s-csi--node--driver--wllpb-eth0" Apr 25 00:00:12.259563 
containerd[1974]: 2026-04-25 00:00:12.229 [INFO][5290] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="24e481870027e26f76e7f978298c1ee774e94009de6f1cb58cda6ee685d34567" Namespace="calico-system" Pod="csi-node-driver-wllpb" WorkloadEndpoint="ip--172--31--31--110-k8s-csi--node--driver--wllpb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--31--110-k8s-csi--node--driver--wllpb-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"fafabee1-df27-491d-a48c-611faa0cd932", ResourceVersion:"954", Generation:0, CreationTimestamp:time.Date(2026, time.April, 24, 23, 59, 35, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"589b8b8d94", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-31-110", ContainerID:"24e481870027e26f76e7f978298c1ee774e94009de6f1cb58cda6ee685d34567", Pod:"csi-node-driver-wllpb", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.121.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali7235609975d", MAC:"4a:ec:b1:67:b3:db", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 25 00:00:12.259563 containerd[1974]: 2026-04-25 00:00:12.253 
[INFO][5290] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="24e481870027e26f76e7f978298c1ee774e94009de6f1cb58cda6ee685d34567" Namespace="calico-system" Pod="csi-node-driver-wllpb" WorkloadEndpoint="ip--172--31--31--110-k8s-csi--node--driver--wllpb-eth0" Apr 25 00:00:12.302409 containerd[1974]: time="2026-04-25T00:00:12.302195340Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Apr 25 00:00:12.302409 containerd[1974]: time="2026-04-25T00:00:12.302337140Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Apr 25 00:00:12.302700 containerd[1974]: time="2026-04-25T00:00:12.302397218Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 25 00:00:12.302700 containerd[1974]: time="2026-04-25T00:00:12.302540474Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 25 00:00:12.346452 systemd[1]: Started cri-containerd-24e481870027e26f76e7f978298c1ee774e94009de6f1cb58cda6ee685d34567.scope - libcontainer container 24e481870027e26f76e7f978298c1ee774e94009de6f1cb58cda6ee685d34567. 
Apr 25 00:00:12.359241 systemd-networkd[1892]: cali4fd2cafc68b: Link UP Apr 25 00:00:12.360294 systemd-networkd[1892]: cali4fd2cafc68b: Gained carrier Apr 25 00:00:12.391042 containerd[1974]: 2026-04-25 00:00:12.140 [INFO][5312] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--31--110-k8s-coredns--7d764666f9--zk545-eth0 coredns-7d764666f9- kube-system 651a4b24-dccb-49b4-b5f3-a4291d6e49f0 955 0 2026-04-24 23:59:23 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7d764666f9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ip-172-31-31-110 coredns-7d764666f9-zk545 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali4fd2cafc68b [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 } {liveness-probe TCP 8080 0 } {readiness-probe TCP 8181 0 }] [] }} ContainerID="6e4dfc183e3f3f058f2bb62e0188cd0e16dd0c2bf85bc6231b6eb750ebe9a32d" Namespace="kube-system" Pod="coredns-7d764666f9-zk545" WorkloadEndpoint="ip--172--31--31--110-k8s-coredns--7d764666f9--zk545-" Apr 25 00:00:12.391042 containerd[1974]: 2026-04-25 00:00:12.140 [INFO][5312] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="6e4dfc183e3f3f058f2bb62e0188cd0e16dd0c2bf85bc6231b6eb750ebe9a32d" Namespace="kube-system" Pod="coredns-7d764666f9-zk545" WorkloadEndpoint="ip--172--31--31--110-k8s-coredns--7d764666f9--zk545-eth0" Apr 25 00:00:12.391042 containerd[1974]: 2026-04-25 00:00:12.203 [INFO][5333] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="6e4dfc183e3f3f058f2bb62e0188cd0e16dd0c2bf85bc6231b6eb750ebe9a32d" HandleID="k8s-pod-network.6e4dfc183e3f3f058f2bb62e0188cd0e16dd0c2bf85bc6231b6eb750ebe9a32d" Workload="ip--172--31--31--110-k8s-coredns--7d764666f9--zk545-eth0" Apr 25 00:00:12.391042 containerd[1974]: 2026-04-25 00:00:12.235 [INFO][5333] ipam/ipam_plugin.go 301: Auto assigning IP 
ContainerID="6e4dfc183e3f3f058f2bb62e0188cd0e16dd0c2bf85bc6231b6eb750ebe9a32d" HandleID="k8s-pod-network.6e4dfc183e3f3f058f2bb62e0188cd0e16dd0c2bf85bc6231b6eb750ebe9a32d" Workload="ip--172--31--31--110-k8s-coredns--7d764666f9--zk545-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002fdaf0), Attrs:map[string]string{"namespace":"kube-system", "node":"ip-172-31-31-110", "pod":"coredns-7d764666f9-zk545", "timestamp":"2026-04-25 00:00:12.203471082 +0000 UTC"}, Hostname:"ip-172-31-31-110", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc0002eb1e0)} Apr 25 00:00:12.391042 containerd[1974]: 2026-04-25 00:00:12.236 [INFO][5333] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 25 00:00:12.391042 containerd[1974]: 2026-04-25 00:00:12.236 [INFO][5333] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Apr 25 00:00:12.391042 containerd[1974]: 2026-04-25 00:00:12.236 [INFO][5333] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-31-110' Apr 25 00:00:12.391042 containerd[1974]: 2026-04-25 00:00:12.260 [INFO][5333] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.6e4dfc183e3f3f058f2bb62e0188cd0e16dd0c2bf85bc6231b6eb750ebe9a32d" host="ip-172-31-31-110" Apr 25 00:00:12.391042 containerd[1974]: 2026-04-25 00:00:12.269 [INFO][5333] ipam/ipam.go 409: Looking up existing affinities for host host="ip-172-31-31-110" Apr 25 00:00:12.391042 containerd[1974]: 2026-04-25 00:00:12.281 [INFO][5333] ipam/ipam.go 526: Trying affinity for 192.168.121.64/26 host="ip-172-31-31-110" Apr 25 00:00:12.391042 containerd[1974]: 2026-04-25 00:00:12.292 [INFO][5333] ipam/ipam.go 160: Attempting to load block cidr=192.168.121.64/26 host="ip-172-31-31-110" Apr 25 00:00:12.391042 containerd[1974]: 2026-04-25 00:00:12.299 [INFO][5333] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.121.64/26 host="ip-172-31-31-110" Apr 25 00:00:12.391042 containerd[1974]: 2026-04-25 00:00:12.299 [INFO][5333] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.121.64/26 handle="k8s-pod-network.6e4dfc183e3f3f058f2bb62e0188cd0e16dd0c2bf85bc6231b6eb750ebe9a32d" host="ip-172-31-31-110" Apr 25 00:00:12.391042 containerd[1974]: 2026-04-25 00:00:12.303 [INFO][5333] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.6e4dfc183e3f3f058f2bb62e0188cd0e16dd0c2bf85bc6231b6eb750ebe9a32d Apr 25 00:00:12.391042 containerd[1974]: 2026-04-25 00:00:12.324 [INFO][5333] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.121.64/26 handle="k8s-pod-network.6e4dfc183e3f3f058f2bb62e0188cd0e16dd0c2bf85bc6231b6eb750ebe9a32d" host="ip-172-31-31-110" Apr 25 00:00:12.391042 containerd[1974]: 2026-04-25 00:00:12.343 [INFO][5333] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.121.69/26] block=192.168.121.64/26 
handle="k8s-pod-network.6e4dfc183e3f3f058f2bb62e0188cd0e16dd0c2bf85bc6231b6eb750ebe9a32d" host="ip-172-31-31-110" Apr 25 00:00:12.391042 containerd[1974]: 2026-04-25 00:00:12.343 [INFO][5333] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.121.69/26] handle="k8s-pod-network.6e4dfc183e3f3f058f2bb62e0188cd0e16dd0c2bf85bc6231b6eb750ebe9a32d" host="ip-172-31-31-110" Apr 25 00:00:12.391042 containerd[1974]: 2026-04-25 00:00:12.343 [INFO][5333] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 25 00:00:12.391042 containerd[1974]: 2026-04-25 00:00:12.343 [INFO][5333] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.121.69/26] IPv6=[] ContainerID="6e4dfc183e3f3f058f2bb62e0188cd0e16dd0c2bf85bc6231b6eb750ebe9a32d" HandleID="k8s-pod-network.6e4dfc183e3f3f058f2bb62e0188cd0e16dd0c2bf85bc6231b6eb750ebe9a32d" Workload="ip--172--31--31--110-k8s-coredns--7d764666f9--zk545-eth0" Apr 25 00:00:12.392139 containerd[1974]: 2026-04-25 00:00:12.352 [INFO][5312] cni-plugin/k8s.go 418: Populated endpoint ContainerID="6e4dfc183e3f3f058f2bb62e0188cd0e16dd0c2bf85bc6231b6eb750ebe9a32d" Namespace="kube-system" Pod="coredns-7d764666f9-zk545" WorkloadEndpoint="ip--172--31--31--110-k8s-coredns--7d764666f9--zk545-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--31--110-k8s-coredns--7d764666f9--zk545-eth0", GenerateName:"coredns-7d764666f9-", Namespace:"kube-system", SelfLink:"", UID:"651a4b24-dccb-49b4-b5f3-a4291d6e49f0", ResourceVersion:"955", Generation:0, CreationTimestamp:time.Date(2026, time.April, 24, 23, 59, 23, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7d764666f9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), 
OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-31-110", ContainerID:"", Pod:"coredns-7d764666f9-zk545", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.121.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali4fd2cafc68b", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 25 00:00:12.392139 containerd[1974]: 2026-04-25 00:00:12.353 [INFO][5312] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.121.69/32] ContainerID="6e4dfc183e3f3f058f2bb62e0188cd0e16dd0c2bf85bc6231b6eb750ebe9a32d" Namespace="kube-system" Pod="coredns-7d764666f9-zk545" WorkloadEndpoint="ip--172--31--31--110-k8s-coredns--7d764666f9--zk545-eth0" Apr 25 00:00:12.392139 containerd[1974]: 2026-04-25 00:00:12.354 [INFO][5312] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali4fd2cafc68b ContainerID="6e4dfc183e3f3f058f2bb62e0188cd0e16dd0c2bf85bc6231b6eb750ebe9a32d" Namespace="kube-system" Pod="coredns-7d764666f9-zk545" 
WorkloadEndpoint="ip--172--31--31--110-k8s-coredns--7d764666f9--zk545-eth0" Apr 25 00:00:12.392139 containerd[1974]: 2026-04-25 00:00:12.361 [INFO][5312] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="6e4dfc183e3f3f058f2bb62e0188cd0e16dd0c2bf85bc6231b6eb750ebe9a32d" Namespace="kube-system" Pod="coredns-7d764666f9-zk545" WorkloadEndpoint="ip--172--31--31--110-k8s-coredns--7d764666f9--zk545-eth0" Apr 25 00:00:12.392139 containerd[1974]: 2026-04-25 00:00:12.362 [INFO][5312] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="6e4dfc183e3f3f058f2bb62e0188cd0e16dd0c2bf85bc6231b6eb750ebe9a32d" Namespace="kube-system" Pod="coredns-7d764666f9-zk545" WorkloadEndpoint="ip--172--31--31--110-k8s-coredns--7d764666f9--zk545-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--31--110-k8s-coredns--7d764666f9--zk545-eth0", GenerateName:"coredns-7d764666f9-", Namespace:"kube-system", SelfLink:"", UID:"651a4b24-dccb-49b4-b5f3-a4291d6e49f0", ResourceVersion:"955", Generation:0, CreationTimestamp:time.Date(2026, time.April, 24, 23, 59, 23, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7d764666f9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-31-110", ContainerID:"6e4dfc183e3f3f058f2bb62e0188cd0e16dd0c2bf85bc6231b6eb750ebe9a32d", Pod:"coredns-7d764666f9-zk545", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.121.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", 
IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali4fd2cafc68b", MAC:"26:b7:99:62:9d:56", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 25 00:00:12.392139 containerd[1974]: 2026-04-25 00:00:12.384 [INFO][5312] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="6e4dfc183e3f3f058f2bb62e0188cd0e16dd0c2bf85bc6231b6eb750ebe9a32d" Namespace="kube-system" Pod="coredns-7d764666f9-zk545" WorkloadEndpoint="ip--172--31--31--110-k8s-coredns--7d764666f9--zk545-eth0" Apr 25 00:00:12.452220 containerd[1974]: time="2026-04-25T00:00:12.452098364Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Apr 25 00:00:12.452395 containerd[1974]: time="2026-04-25T00:00:12.452282041Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Apr 25 00:00:12.453565 containerd[1974]: time="2026-04-25T00:00:12.452363290Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 25 00:00:12.453565 containerd[1974]: time="2026-04-25T00:00:12.452759612Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 25 00:00:12.462741 containerd[1974]: time="2026-04-25T00:00:12.462664868Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-wllpb,Uid:fafabee1-df27-491d-a48c-611faa0cd932,Namespace:calico-system,Attempt:1,} returns sandbox id \"24e481870027e26f76e7f978298c1ee774e94009de6f1cb58cda6ee685d34567\"" Apr 25 00:00:12.488026 systemd[1]: Started cri-containerd-6e4dfc183e3f3f058f2bb62e0188cd0e16dd0c2bf85bc6231b6eb750ebe9a32d.scope - libcontainer container 6e4dfc183e3f3f058f2bb62e0188cd0e16dd0c2bf85bc6231b6eb750ebe9a32d. Apr 25 00:00:12.532407 containerd[1974]: time="2026-04-25T00:00:12.532364054Z" level=info msg="StopPodSandbox for \"3941356acab4b724b296dce95d38ff7956ae5900b2224867cf4d9d02ad8ef6d9\"" Apr 25 00:00:12.563256 containerd[1974]: time="2026-04-25T00:00:12.562245740Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7d764666f9-zk545,Uid:651a4b24-dccb-49b4-b5f3-a4291d6e49f0,Namespace:kube-system,Attempt:1,} returns sandbox id \"6e4dfc183e3f3f058f2bb62e0188cd0e16dd0c2bf85bc6231b6eb750ebe9a32d\"" Apr 25 00:00:12.580618 containerd[1974]: time="2026-04-25T00:00:12.579868454Z" level=info msg="CreateContainer within sandbox \"6e4dfc183e3f3f058f2bb62e0188cd0e16dd0c2bf85bc6231b6eb750ebe9a32d\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Apr 25 00:00:12.629574 containerd[1974]: time="2026-04-25T00:00:12.628709719Z" level=info msg="CreateContainer within sandbox \"6e4dfc183e3f3f058f2bb62e0188cd0e16dd0c2bf85bc6231b6eb750ebe9a32d\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"b582797d28b1eabefaad36b81890874f19b78bfc863304e37bd46b4b6e10c13f\"" Apr 25 00:00:12.632888 containerd[1974]: time="2026-04-25T00:00:12.632665814Z" level=info 
msg="StartContainer for \"b582797d28b1eabefaad36b81890874f19b78bfc863304e37bd46b4b6e10c13f\"" Apr 25 00:00:12.641175 systemd-networkd[1892]: cali0ccb425bc32: Gained IPv6LL Apr 25 00:00:12.702333 systemd[1]: Started cri-containerd-b582797d28b1eabefaad36b81890874f19b78bfc863304e37bd46b4b6e10c13f.scope - libcontainer container b582797d28b1eabefaad36b81890874f19b78bfc863304e37bd46b4b6e10c13f. Apr 25 00:00:12.757051 containerd[1974]: 2026-04-25 00:00:12.667 [INFO][5458] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="3941356acab4b724b296dce95d38ff7956ae5900b2224867cf4d9d02ad8ef6d9" Apr 25 00:00:12.757051 containerd[1974]: 2026-04-25 00:00:12.668 [INFO][5458] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="3941356acab4b724b296dce95d38ff7956ae5900b2224867cf4d9d02ad8ef6d9" iface="eth0" netns="/var/run/netns/cni-3b1b7159-113f-c5c5-9689-70740eec60de" Apr 25 00:00:12.757051 containerd[1974]: 2026-04-25 00:00:12.668 [INFO][5458] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="3941356acab4b724b296dce95d38ff7956ae5900b2224867cf4d9d02ad8ef6d9" iface="eth0" netns="/var/run/netns/cni-3b1b7159-113f-c5c5-9689-70740eec60de" Apr 25 00:00:12.757051 containerd[1974]: 2026-04-25 00:00:12.669 [INFO][5458] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="3941356acab4b724b296dce95d38ff7956ae5900b2224867cf4d9d02ad8ef6d9" iface="eth0" netns="/var/run/netns/cni-3b1b7159-113f-c5c5-9689-70740eec60de" Apr 25 00:00:12.757051 containerd[1974]: 2026-04-25 00:00:12.669 [INFO][5458] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="3941356acab4b724b296dce95d38ff7956ae5900b2224867cf4d9d02ad8ef6d9" Apr 25 00:00:12.757051 containerd[1974]: 2026-04-25 00:00:12.669 [INFO][5458] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="3941356acab4b724b296dce95d38ff7956ae5900b2224867cf4d9d02ad8ef6d9" Apr 25 00:00:12.757051 containerd[1974]: 2026-04-25 00:00:12.739 [INFO][5485] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="3941356acab4b724b296dce95d38ff7956ae5900b2224867cf4d9d02ad8ef6d9" HandleID="k8s-pod-network.3941356acab4b724b296dce95d38ff7956ae5900b2224867cf4d9d02ad8ef6d9" Workload="ip--172--31--31--110-k8s-calico--apiserver--848b4486d7--dvwgq-eth0" Apr 25 00:00:12.757051 containerd[1974]: 2026-04-25 00:00:12.740 [INFO][5485] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 25 00:00:12.757051 containerd[1974]: 2026-04-25 00:00:12.740 [INFO][5485] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 25 00:00:12.757051 containerd[1974]: 2026-04-25 00:00:12.749 [WARNING][5485] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="3941356acab4b724b296dce95d38ff7956ae5900b2224867cf4d9d02ad8ef6d9" HandleID="k8s-pod-network.3941356acab4b724b296dce95d38ff7956ae5900b2224867cf4d9d02ad8ef6d9" Workload="ip--172--31--31--110-k8s-calico--apiserver--848b4486d7--dvwgq-eth0" Apr 25 00:00:12.757051 containerd[1974]: 2026-04-25 00:00:12.749 [INFO][5485] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="3941356acab4b724b296dce95d38ff7956ae5900b2224867cf4d9d02ad8ef6d9" HandleID="k8s-pod-network.3941356acab4b724b296dce95d38ff7956ae5900b2224867cf4d9d02ad8ef6d9" Workload="ip--172--31--31--110-k8s-calico--apiserver--848b4486d7--dvwgq-eth0" Apr 25 00:00:12.757051 containerd[1974]: 2026-04-25 00:00:12.751 [INFO][5485] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 25 00:00:12.757051 containerd[1974]: 2026-04-25 00:00:12.754 [INFO][5458] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="3941356acab4b724b296dce95d38ff7956ae5900b2224867cf4d9d02ad8ef6d9" Apr 25 00:00:12.758358 containerd[1974]: time="2026-04-25T00:00:12.757122623Z" level=info msg="TearDown network for sandbox \"3941356acab4b724b296dce95d38ff7956ae5900b2224867cf4d9d02ad8ef6d9\" successfully" Apr 25 00:00:12.758358 containerd[1974]: time="2026-04-25T00:00:12.757175977Z" level=info msg="StopPodSandbox for \"3941356acab4b724b296dce95d38ff7956ae5900b2224867cf4d9d02ad8ef6d9\" returns successfully" Apr 25 00:00:12.764006 containerd[1974]: time="2026-04-25T00:00:12.763446451Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-848b4486d7-dvwgq,Uid:d9ba9417-7859-4ec0-8a74-659edbbec7c4,Namespace:calico-system,Attempt:1,}" Apr 25 00:00:12.777881 containerd[1974]: time="2026-04-25T00:00:12.777750786Z" level=info msg="StartContainer for \"b582797d28b1eabefaad36b81890874f19b78bfc863304e37bd46b4b6e10c13f\" returns successfully" Apr 25 00:00:12.824712 systemd[1]: run-netns-cni\x2d3b1b7159\x2d113f\x2dc5c5\x2d9689\x2d70740eec60de.mount: Deactivated successfully. 
Apr 25 00:00:12.948553 systemd-networkd[1892]: cali86641ffeb98: Link UP
Apr 25 00:00:12.950385 systemd-networkd[1892]: cali86641ffeb98: Gained carrier
Apr 25 00:00:12.978710 containerd[1974]: 2026-04-25 00:00:12.844 [INFO][5521] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--31--110-k8s-calico--apiserver--848b4486d7--dvwgq-eth0 calico-apiserver-848b4486d7- calico-system d9ba9417-7859-4ec0-8a74-659edbbec7c4 975 0 2026-04-24 23:59:34 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:848b4486d7 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ip-172-31-31-110 calico-apiserver-848b4486d7-dvwgq eth0 calico-apiserver [] [] [kns.calico-system ksa.calico-system.calico-apiserver] cali86641ffeb98 [] [] }} ContainerID="7c38a3870e15b765513f3951da03a0c155d292ac3aaf771702b124c3a24da68e" Namespace="calico-system" Pod="calico-apiserver-848b4486d7-dvwgq" WorkloadEndpoint="ip--172--31--31--110-k8s-calico--apiserver--848b4486d7--dvwgq-"
Apr 25 00:00:12.978710 containerd[1974]: 2026-04-25 00:00:12.844 [INFO][5521] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="7c38a3870e15b765513f3951da03a0c155d292ac3aaf771702b124c3a24da68e" Namespace="calico-system" Pod="calico-apiserver-848b4486d7-dvwgq" WorkloadEndpoint="ip--172--31--31--110-k8s-calico--apiserver--848b4486d7--dvwgq-eth0"
Apr 25 00:00:12.978710 containerd[1974]: 2026-04-25 00:00:12.893 [INFO][5532] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="7c38a3870e15b765513f3951da03a0c155d292ac3aaf771702b124c3a24da68e" HandleID="k8s-pod-network.7c38a3870e15b765513f3951da03a0c155d292ac3aaf771702b124c3a24da68e" Workload="ip--172--31--31--110-k8s-calico--apiserver--848b4486d7--dvwgq-eth0"
Apr 25 00:00:12.978710 containerd[1974]: 2026-04-25 00:00:12.903 [INFO][5532] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="7c38a3870e15b765513f3951da03a0c155d292ac3aaf771702b124c3a24da68e" HandleID="k8s-pod-network.7c38a3870e15b765513f3951da03a0c155d292ac3aaf771702b124c3a24da68e" Workload="ip--172--31--31--110-k8s-calico--apiserver--848b4486d7--dvwgq-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002fd860), Attrs:map[string]string{"namespace":"calico-system", "node":"ip-172-31-31-110", "pod":"calico-apiserver-848b4486d7-dvwgq", "timestamp":"2026-04-25 00:00:12.893058068 +0000 UTC"}, Hostname:"ip-172-31-31-110", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc0002dba20)}
Apr 25 00:00:12.978710 containerd[1974]: 2026-04-25 00:00:12.903 [INFO][5532] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock.
Apr 25 00:00:12.978710 containerd[1974]: 2026-04-25 00:00:12.903 [INFO][5532] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock.
Apr 25 00:00:12.978710 containerd[1974]: 2026-04-25 00:00:12.903 [INFO][5532] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-31-110'
Apr 25 00:00:12.978710 containerd[1974]: 2026-04-25 00:00:12.906 [INFO][5532] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.7c38a3870e15b765513f3951da03a0c155d292ac3aaf771702b124c3a24da68e" host="ip-172-31-31-110"
Apr 25 00:00:12.978710 containerd[1974]: 2026-04-25 00:00:12.912 [INFO][5532] ipam/ipam.go 409: Looking up existing affinities for host host="ip-172-31-31-110"
Apr 25 00:00:12.978710 containerd[1974]: 2026-04-25 00:00:12.917 [INFO][5532] ipam/ipam.go 526: Trying affinity for 192.168.121.64/26 host="ip-172-31-31-110"
Apr 25 00:00:12.978710 containerd[1974]: 2026-04-25 00:00:12.919 [INFO][5532] ipam/ipam.go 160: Attempting to load block cidr=192.168.121.64/26 host="ip-172-31-31-110"
Apr 25 00:00:12.978710 containerd[1974]: 2026-04-25 00:00:12.922 [INFO][5532] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.121.64/26 host="ip-172-31-31-110"
Apr 25 00:00:12.978710 containerd[1974]: 2026-04-25 00:00:12.922 [INFO][5532] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.121.64/26 handle="k8s-pod-network.7c38a3870e15b765513f3951da03a0c155d292ac3aaf771702b124c3a24da68e" host="ip-172-31-31-110"
Apr 25 00:00:12.978710 containerd[1974]: 2026-04-25 00:00:12.924 [INFO][5532] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.7c38a3870e15b765513f3951da03a0c155d292ac3aaf771702b124c3a24da68e
Apr 25 00:00:12.978710 containerd[1974]: 2026-04-25 00:00:12.929 [INFO][5532] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.121.64/26 handle="k8s-pod-network.7c38a3870e15b765513f3951da03a0c155d292ac3aaf771702b124c3a24da68e" host="ip-172-31-31-110"
Apr 25 00:00:12.978710 containerd[1974]: 2026-04-25 00:00:12.940 [INFO][5532] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.121.70/26] block=192.168.121.64/26 handle="k8s-pod-network.7c38a3870e15b765513f3951da03a0c155d292ac3aaf771702b124c3a24da68e" host="ip-172-31-31-110"
Apr 25 00:00:12.978710 containerd[1974]: 2026-04-25 00:00:12.941 [INFO][5532] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.121.70/26] handle="k8s-pod-network.7c38a3870e15b765513f3951da03a0c155d292ac3aaf771702b124c3a24da68e" host="ip-172-31-31-110"
Apr 25 00:00:12.978710 containerd[1974]: 2026-04-25 00:00:12.941 [INFO][5532] ipam/ipam_plugin.go 459: Released host-wide IPAM lock.
Apr 25 00:00:12.978710 containerd[1974]: 2026-04-25 00:00:12.941 [INFO][5532] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.121.70/26] IPv6=[] ContainerID="7c38a3870e15b765513f3951da03a0c155d292ac3aaf771702b124c3a24da68e" HandleID="k8s-pod-network.7c38a3870e15b765513f3951da03a0c155d292ac3aaf771702b124c3a24da68e" Workload="ip--172--31--31--110-k8s-calico--apiserver--848b4486d7--dvwgq-eth0"
Apr 25 00:00:12.981997 containerd[1974]: 2026-04-25 00:00:12.943 [INFO][5521] cni-plugin/k8s.go 418: Populated endpoint ContainerID="7c38a3870e15b765513f3951da03a0c155d292ac3aaf771702b124c3a24da68e" Namespace="calico-system" Pod="calico-apiserver-848b4486d7-dvwgq" WorkloadEndpoint="ip--172--31--31--110-k8s-calico--apiserver--848b4486d7--dvwgq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--31--110-k8s-calico--apiserver--848b4486d7--dvwgq-eth0", GenerateName:"calico-apiserver-848b4486d7-", Namespace:"calico-system", SelfLink:"", UID:"d9ba9417-7859-4ec0-8a74-659edbbec7c4", ResourceVersion:"975", Generation:0, CreationTimestamp:time.Date(2026, time.April, 24, 23, 59, 34, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"848b4486d7", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-31-110", ContainerID:"", Pod:"calico-apiserver-848b4486d7-dvwgq", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.121.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali86641ffeb98", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Apr 25 00:00:12.981997 containerd[1974]: 2026-04-25 00:00:12.943 [INFO][5521] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.121.70/32] ContainerID="7c38a3870e15b765513f3951da03a0c155d292ac3aaf771702b124c3a24da68e" Namespace="calico-system" Pod="calico-apiserver-848b4486d7-dvwgq" WorkloadEndpoint="ip--172--31--31--110-k8s-calico--apiserver--848b4486d7--dvwgq-eth0"
Apr 25 00:00:12.981997 containerd[1974]: 2026-04-25 00:00:12.943 [INFO][5521] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali86641ffeb98 ContainerID="7c38a3870e15b765513f3951da03a0c155d292ac3aaf771702b124c3a24da68e" Namespace="calico-system" Pod="calico-apiserver-848b4486d7-dvwgq" WorkloadEndpoint="ip--172--31--31--110-k8s-calico--apiserver--848b4486d7--dvwgq-eth0"
Apr 25 00:00:12.981997 containerd[1974]: 2026-04-25 00:00:12.951 [INFO][5521] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="7c38a3870e15b765513f3951da03a0c155d292ac3aaf771702b124c3a24da68e" Namespace="calico-system" Pod="calico-apiserver-848b4486d7-dvwgq" WorkloadEndpoint="ip--172--31--31--110-k8s-calico--apiserver--848b4486d7--dvwgq-eth0"
Apr 25 00:00:12.981997 containerd[1974]: 2026-04-25 00:00:12.953 [INFO][5521] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="7c38a3870e15b765513f3951da03a0c155d292ac3aaf771702b124c3a24da68e" Namespace="calico-system" Pod="calico-apiserver-848b4486d7-dvwgq" WorkloadEndpoint="ip--172--31--31--110-k8s-calico--apiserver--848b4486d7--dvwgq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--31--110-k8s-calico--apiserver--848b4486d7--dvwgq-eth0", GenerateName:"calico-apiserver-848b4486d7-", Namespace:"calico-system", SelfLink:"", UID:"d9ba9417-7859-4ec0-8a74-659edbbec7c4", ResourceVersion:"975", Generation:0, CreationTimestamp:time.Date(2026, time.April, 24, 23, 59, 34, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"848b4486d7", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-31-110", ContainerID:"7c38a3870e15b765513f3951da03a0c155d292ac3aaf771702b124c3a24da68e", Pod:"calico-apiserver-848b4486d7-dvwgq", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.121.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali86641ffeb98", MAC:"4a:50:25:f8:1e:e8", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Apr 25 00:00:12.981997 containerd[1974]: 2026-04-25 00:00:12.974 [INFO][5521] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="7c38a3870e15b765513f3951da03a0c155d292ac3aaf771702b124c3a24da68e" Namespace="calico-system" Pod="calico-apiserver-848b4486d7-dvwgq" WorkloadEndpoint="ip--172--31--31--110-k8s-calico--apiserver--848b4486d7--dvwgq-eth0"
Apr 25 00:00:13.020979 containerd[1974]: time="2026-04-25T00:00:13.020085305Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Apr 25 00:00:13.020979 containerd[1974]: time="2026-04-25T00:00:13.020176643Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Apr 25 00:00:13.020979 containerd[1974]: time="2026-04-25T00:00:13.020201387Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Apr 25 00:00:13.020979 containerd[1974]: time="2026-04-25T00:00:13.020315461Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Apr 25 00:00:13.070286 systemd[1]: Started cri-containerd-7c38a3870e15b765513f3951da03a0c155d292ac3aaf771702b124c3a24da68e.scope - libcontainer container 7c38a3870e15b765513f3951da03a0c155d292ac3aaf771702b124c3a24da68e.
Apr 25 00:00:13.136175 kubelet[3188]: I0425 00:00:13.134570 3188 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="kube-system/coredns-7d764666f9-zk545" podStartSLOduration=50.134546294 podStartE2EDuration="50.134546294s" podCreationTimestamp="2026-04-24 23:59:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-25 00:00:13.112017611 +0000 UTC m=+55.781146354" watchObservedRunningTime="2026-04-25 00:00:13.134546294 +0000 UTC m=+55.803675037"
Apr 25 00:00:13.155686 systemd-networkd[1892]: calib407ef0a420: Gained IPv6LL
Apr 25 00:00:13.225204 containerd[1974]: time="2026-04-25T00:00:13.223887219Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-848b4486d7-dvwgq,Uid:d9ba9417-7859-4ec0-8a74-659edbbec7c4,Namespace:calico-system,Attempt:1,} returns sandbox id \"7c38a3870e15b765513f3951da03a0c155d292ac3aaf771702b124c3a24da68e\""
Apr 25 00:00:13.409100 systemd-networkd[1892]: cali7235609975d: Gained IPv6LL
Apr 25 00:00:13.533531 containerd[1974]: time="2026-04-25T00:00:13.533473604Z" level=info msg="StopPodSandbox for \"6c23267bc67df8fde82032bbee6b76532d9447c49eff630a05a6bc648e99c3a2\""
Apr 25 00:00:13.713151 containerd[1974]: 2026-04-25 00:00:13.616 [INFO][5622] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="6c23267bc67df8fde82032bbee6b76532d9447c49eff630a05a6bc648e99c3a2"
Apr 25 00:00:13.713151 containerd[1974]: 2026-04-25 00:00:13.616 [INFO][5622] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="6c23267bc67df8fde82032bbee6b76532d9447c49eff630a05a6bc648e99c3a2" iface="eth0" netns="/var/run/netns/cni-4ef99837-2801-f4e5-6e04-34bd7391181b"
Apr 25 00:00:13.713151 containerd[1974]: 2026-04-25 00:00:13.617 [INFO][5622] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="6c23267bc67df8fde82032bbee6b76532d9447c49eff630a05a6bc648e99c3a2" iface="eth0" netns="/var/run/netns/cni-4ef99837-2801-f4e5-6e04-34bd7391181b"
Apr 25 00:00:13.713151 containerd[1974]: 2026-04-25 00:00:13.617 [INFO][5622] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="6c23267bc67df8fde82032bbee6b76532d9447c49eff630a05a6bc648e99c3a2" iface="eth0" netns="/var/run/netns/cni-4ef99837-2801-f4e5-6e04-34bd7391181b"
Apr 25 00:00:13.713151 containerd[1974]: 2026-04-25 00:00:13.617 [INFO][5622] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="6c23267bc67df8fde82032bbee6b76532d9447c49eff630a05a6bc648e99c3a2"
Apr 25 00:00:13.713151 containerd[1974]: 2026-04-25 00:00:13.617 [INFO][5622] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="6c23267bc67df8fde82032bbee6b76532d9447c49eff630a05a6bc648e99c3a2"
Apr 25 00:00:13.713151 containerd[1974]: 2026-04-25 00:00:13.687 [INFO][5630] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="6c23267bc67df8fde82032bbee6b76532d9447c49eff630a05a6bc648e99c3a2" HandleID="k8s-pod-network.6c23267bc67df8fde82032bbee6b76532d9447c49eff630a05a6bc648e99c3a2" Workload="ip--172--31--31--110-k8s-calico--kube--controllers--7564df694c--4d42l-eth0"
Apr 25 00:00:13.713151 containerd[1974]: 2026-04-25 00:00:13.688 [INFO][5630] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock.
Apr 25 00:00:13.713151 containerd[1974]: 2026-04-25 00:00:13.688 [INFO][5630] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock.
Apr 25 00:00:13.713151 containerd[1974]: 2026-04-25 00:00:13.698 [WARNING][5630] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. Ignoring ContainerID="6c23267bc67df8fde82032bbee6b76532d9447c49eff630a05a6bc648e99c3a2" HandleID="k8s-pod-network.6c23267bc67df8fde82032bbee6b76532d9447c49eff630a05a6bc648e99c3a2" Workload="ip--172--31--31--110-k8s-calico--kube--controllers--7564df694c--4d42l-eth0"
Apr 25 00:00:13.713151 containerd[1974]: 2026-04-25 00:00:13.698 [INFO][5630] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="6c23267bc67df8fde82032bbee6b76532d9447c49eff630a05a6bc648e99c3a2" HandleID="k8s-pod-network.6c23267bc67df8fde82032bbee6b76532d9447c49eff630a05a6bc648e99c3a2" Workload="ip--172--31--31--110-k8s-calico--kube--controllers--7564df694c--4d42l-eth0"
Apr 25 00:00:13.713151 containerd[1974]: 2026-04-25 00:00:13.702 [INFO][5630] ipam/ipam_plugin.go 459: Released host-wide IPAM lock.
Apr 25 00:00:13.713151 containerd[1974]: 2026-04-25 00:00:13.706 [INFO][5622] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="6c23267bc67df8fde82032bbee6b76532d9447c49eff630a05a6bc648e99c3a2"
Apr 25 00:00:13.718932 containerd[1974]: time="2026-04-25T00:00:13.716900399Z" level=info msg="TearDown network for sandbox \"6c23267bc67df8fde82032bbee6b76532d9447c49eff630a05a6bc648e99c3a2\" successfully"
Apr 25 00:00:13.718932 containerd[1974]: time="2026-04-25T00:00:13.716987820Z" level=info msg="StopPodSandbox for \"6c23267bc67df8fde82032bbee6b76532d9447c49eff630a05a6bc648e99c3a2\" returns successfully"
Apr 25 00:00:13.721501 systemd[1]: run-netns-cni\x2d4ef99837\x2d2801\x2df4e5\x2d6e04\x2d34bd7391181b.mount: Deactivated successfully.
Apr 25 00:00:13.725372 containerd[1974]: time="2026-04-25T00:00:13.725326359Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7564df694c-4d42l,Uid:e3d6ca38-0ac1-41b2-beef-170f1942102f,Namespace:calico-system,Attempt:1,}"
Apr 25 00:00:13.909377 systemd-networkd[1892]: cali7764f2db475: Link UP
Apr 25 00:00:13.911876 systemd-networkd[1892]: cali7764f2db475: Gained carrier
Apr 25 00:00:13.931843 containerd[1974]: 2026-04-25 00:00:13.829 [INFO][5644] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--31--110-k8s-calico--kube--controllers--7564df694c--4d42l-eth0 calico-kube-controllers-7564df694c- calico-system e3d6ca38-0ac1-41b2-beef-170f1942102f 1011 0 2026-04-24 23:59:35 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:7564df694c projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ip-172-31-31-110 calico-kube-controllers-7564df694c-4d42l eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali7764f2db475 [] [] }} ContainerID="e7a0549722c7a5ce69c5de57b025c65b982c9d22191f1d57e8cffdcffbc55734" Namespace="calico-system" Pod="calico-kube-controllers-7564df694c-4d42l" WorkloadEndpoint="ip--172--31--31--110-k8s-calico--kube--controllers--7564df694c--4d42l-"
Apr 25 00:00:13.931843 containerd[1974]: 2026-04-25 00:00:13.829 [INFO][5644] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="e7a0549722c7a5ce69c5de57b025c65b982c9d22191f1d57e8cffdcffbc55734" Namespace="calico-system" Pod="calico-kube-controllers-7564df694c-4d42l" WorkloadEndpoint="ip--172--31--31--110-k8s-calico--kube--controllers--7564df694c--4d42l-eth0"
Apr 25 00:00:13.931843 containerd[1974]: 2026-04-25 00:00:13.858 [INFO][5659] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="e7a0549722c7a5ce69c5de57b025c65b982c9d22191f1d57e8cffdcffbc55734" HandleID="k8s-pod-network.e7a0549722c7a5ce69c5de57b025c65b982c9d22191f1d57e8cffdcffbc55734" Workload="ip--172--31--31--110-k8s-calico--kube--controllers--7564df694c--4d42l-eth0"
Apr 25 00:00:13.931843 containerd[1974]: 2026-04-25 00:00:13.866 [INFO][5659] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="e7a0549722c7a5ce69c5de57b025c65b982c9d22191f1d57e8cffdcffbc55734" HandleID="k8s-pod-network.e7a0549722c7a5ce69c5de57b025c65b982c9d22191f1d57e8cffdcffbc55734" Workload="ip--172--31--31--110-k8s-calico--kube--controllers--7564df694c--4d42l-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002f8090), Attrs:map[string]string{"namespace":"calico-system", "node":"ip-172-31-31-110", "pod":"calico-kube-controllers-7564df694c-4d42l", "timestamp":"2026-04-25 00:00:13.858969243 +0000 UTC"}, Hostname:"ip-172-31-31-110", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc000640000)}
Apr 25 00:00:13.931843 containerd[1974]: 2026-04-25 00:00:13.867 [INFO][5659] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock.
Apr 25 00:00:13.931843 containerd[1974]: 2026-04-25 00:00:13.867 [INFO][5659] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock.
Apr 25 00:00:13.931843 containerd[1974]: 2026-04-25 00:00:13.867 [INFO][5659] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-31-110'
Apr 25 00:00:13.931843 containerd[1974]: 2026-04-25 00:00:13.870 [INFO][5659] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.e7a0549722c7a5ce69c5de57b025c65b982c9d22191f1d57e8cffdcffbc55734" host="ip-172-31-31-110"
Apr 25 00:00:13.931843 containerd[1974]: 2026-04-25 00:00:13.876 [INFO][5659] ipam/ipam.go 409: Looking up existing affinities for host host="ip-172-31-31-110"
Apr 25 00:00:13.931843 containerd[1974]: 2026-04-25 00:00:13.881 [INFO][5659] ipam/ipam.go 526: Trying affinity for 192.168.121.64/26 host="ip-172-31-31-110"
Apr 25 00:00:13.931843 containerd[1974]: 2026-04-25 00:00:13.883 [INFO][5659] ipam/ipam.go 160: Attempting to load block cidr=192.168.121.64/26 host="ip-172-31-31-110"
Apr 25 00:00:13.931843 containerd[1974]: 2026-04-25 00:00:13.886 [INFO][5659] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.121.64/26 host="ip-172-31-31-110"
Apr 25 00:00:13.931843 containerd[1974]: 2026-04-25 00:00:13.886 [INFO][5659] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.121.64/26 handle="k8s-pod-network.e7a0549722c7a5ce69c5de57b025c65b982c9d22191f1d57e8cffdcffbc55734" host="ip-172-31-31-110"
Apr 25 00:00:13.931843 containerd[1974]: 2026-04-25 00:00:13.888 [INFO][5659] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.e7a0549722c7a5ce69c5de57b025c65b982c9d22191f1d57e8cffdcffbc55734
Apr 25 00:00:13.931843 containerd[1974]: 2026-04-25 00:00:13.893 [INFO][5659] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.121.64/26 handle="k8s-pod-network.e7a0549722c7a5ce69c5de57b025c65b982c9d22191f1d57e8cffdcffbc55734" host="ip-172-31-31-110"
Apr 25 00:00:13.931843 containerd[1974]: 2026-04-25 00:00:13.902 [INFO][5659] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.121.71/26] block=192.168.121.64/26 handle="k8s-pod-network.e7a0549722c7a5ce69c5de57b025c65b982c9d22191f1d57e8cffdcffbc55734" host="ip-172-31-31-110"
Apr 25 00:00:13.931843 containerd[1974]: 2026-04-25 00:00:13.902 [INFO][5659] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.121.71/26] handle="k8s-pod-network.e7a0549722c7a5ce69c5de57b025c65b982c9d22191f1d57e8cffdcffbc55734" host="ip-172-31-31-110"
Apr 25 00:00:13.931843 containerd[1974]: 2026-04-25 00:00:13.902 [INFO][5659] ipam/ipam_plugin.go 459: Released host-wide IPAM lock.
Apr 25 00:00:13.931843 containerd[1974]: 2026-04-25 00:00:13.902 [INFO][5659] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.121.71/26] IPv6=[] ContainerID="e7a0549722c7a5ce69c5de57b025c65b982c9d22191f1d57e8cffdcffbc55734" HandleID="k8s-pod-network.e7a0549722c7a5ce69c5de57b025c65b982c9d22191f1d57e8cffdcffbc55734" Workload="ip--172--31--31--110-k8s-calico--kube--controllers--7564df694c--4d42l-eth0"
Apr 25 00:00:13.934408 containerd[1974]: 2026-04-25 00:00:13.905 [INFO][5644] cni-plugin/k8s.go 418: Populated endpoint ContainerID="e7a0549722c7a5ce69c5de57b025c65b982c9d22191f1d57e8cffdcffbc55734" Namespace="calico-system" Pod="calico-kube-controllers-7564df694c-4d42l" WorkloadEndpoint="ip--172--31--31--110-k8s-calico--kube--controllers--7564df694c--4d42l-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--31--110-k8s-calico--kube--controllers--7564df694c--4d42l-eth0", GenerateName:"calico-kube-controllers-7564df694c-", Namespace:"calico-system", SelfLink:"", UID:"e3d6ca38-0ac1-41b2-beef-170f1942102f", ResourceVersion:"1011", Generation:0, CreationTimestamp:time.Date(2026, time.April, 24, 23, 59, 35, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"7564df694c", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-31-110", ContainerID:"", Pod:"calico-kube-controllers-7564df694c-4d42l", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.121.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali7764f2db475", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Apr 25 00:00:13.934408 containerd[1974]: 2026-04-25 00:00:13.905 [INFO][5644] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.121.71/32] ContainerID="e7a0549722c7a5ce69c5de57b025c65b982c9d22191f1d57e8cffdcffbc55734" Namespace="calico-system" Pod="calico-kube-controllers-7564df694c-4d42l" WorkloadEndpoint="ip--172--31--31--110-k8s-calico--kube--controllers--7564df694c--4d42l-eth0"
Apr 25 00:00:13.934408 containerd[1974]: 2026-04-25 00:00:13.905 [INFO][5644] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali7764f2db475 ContainerID="e7a0549722c7a5ce69c5de57b025c65b982c9d22191f1d57e8cffdcffbc55734" Namespace="calico-system" Pod="calico-kube-controllers-7564df694c-4d42l" WorkloadEndpoint="ip--172--31--31--110-k8s-calico--kube--controllers--7564df694c--4d42l-eth0"
Apr 25 00:00:13.934408 containerd[1974]: 2026-04-25 00:00:13.910 [INFO][5644] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="e7a0549722c7a5ce69c5de57b025c65b982c9d22191f1d57e8cffdcffbc55734" Namespace="calico-system" Pod="calico-kube-controllers-7564df694c-4d42l" WorkloadEndpoint="ip--172--31--31--110-k8s-calico--kube--controllers--7564df694c--4d42l-eth0"
Apr 25 00:00:13.934408 containerd[1974]: 2026-04-25 00:00:13.910 [INFO][5644] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="e7a0549722c7a5ce69c5de57b025c65b982c9d22191f1d57e8cffdcffbc55734" Namespace="calico-system" Pod="calico-kube-controllers-7564df694c-4d42l" WorkloadEndpoint="ip--172--31--31--110-k8s-calico--kube--controllers--7564df694c--4d42l-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--31--110-k8s-calico--kube--controllers--7564df694c--4d42l-eth0", GenerateName:"calico-kube-controllers-7564df694c-", Namespace:"calico-system", SelfLink:"", UID:"e3d6ca38-0ac1-41b2-beef-170f1942102f", ResourceVersion:"1011", Generation:0, CreationTimestamp:time.Date(2026, time.April, 24, 23, 59, 35, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"7564df694c", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-31-110", ContainerID:"e7a0549722c7a5ce69c5de57b025c65b982c9d22191f1d57e8cffdcffbc55734", Pod:"calico-kube-controllers-7564df694c-4d42l", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.121.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali7764f2db475", MAC:"ae:b3:56:8d:8c:6b", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Apr 25 00:00:13.934408 containerd[1974]: 2026-04-25 00:00:13.927 [INFO][5644] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="e7a0549722c7a5ce69c5de57b025c65b982c9d22191f1d57e8cffdcffbc55734" Namespace="calico-system" Pod="calico-kube-controllers-7564df694c-4d42l" WorkloadEndpoint="ip--172--31--31--110-k8s-calico--kube--controllers--7564df694c--4d42l-eth0"
Apr 25 00:00:13.968315 systemd[1]: Started sshd@7-172.31.31.110:22-4.175.71.9:38506.service - OpenSSH per-connection server daemon (4.175.71.9:38506).
Apr 25 00:00:13.985021 systemd-networkd[1892]: cali4fd2cafc68b: Gained IPv6LL
Apr 25 00:00:13.990480 containerd[1974]: time="2026-04-25T00:00:13.990344276Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Apr 25 00:00:13.990855 containerd[1974]: time="2026-04-25T00:00:13.990749398Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Apr 25 00:00:13.991128 containerd[1974]: time="2026-04-25T00:00:13.991074206Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Apr 25 00:00:13.991531 containerd[1974]: time="2026-04-25T00:00:13.991492879Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Apr 25 00:00:14.054014 systemd[1]: Started cri-containerd-e7a0549722c7a5ce69c5de57b025c65b982c9d22191f1d57e8cffdcffbc55734.scope - libcontainer container e7a0549722c7a5ce69c5de57b025c65b982c9d22191f1d57e8cffdcffbc55734.
Apr 25 00:00:14.126594 containerd[1974]: time="2026-04-25T00:00:14.126551446Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7564df694c-4d42l,Uid:e3d6ca38-0ac1-41b2-beef-170f1942102f,Namespace:calico-system,Attempt:1,} returns sandbox id \"e7a0549722c7a5ce69c5de57b025c65b982c9d22191f1d57e8cffdcffbc55734\""
Apr 25 00:00:14.532729 containerd[1974]: time="2026-04-25T00:00:14.532397650Z" level=info msg="StopPodSandbox for \"47634a8599af1eac2e1de64fbc38d40803a22e32f2699057edc0a4d8b0f8fe2b\""
Apr 25 00:00:14.628195 containerd[1974]: 2026-04-25 00:00:14.587 [INFO][5737] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="47634a8599af1eac2e1de64fbc38d40803a22e32f2699057edc0a4d8b0f8fe2b"
Apr 25 00:00:14.628195 containerd[1974]: 2026-04-25 00:00:14.587 [INFO][5737] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="47634a8599af1eac2e1de64fbc38d40803a22e32f2699057edc0a4d8b0f8fe2b" iface="eth0" netns="/var/run/netns/cni-cd946d82-f8a1-8c16-ea9b-130b2fb759ab"
Apr 25 00:00:14.628195 containerd[1974]: 2026-04-25 00:00:14.588 [INFO][5737] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="47634a8599af1eac2e1de64fbc38d40803a22e32f2699057edc0a4d8b0f8fe2b" iface="eth0" netns="/var/run/netns/cni-cd946d82-f8a1-8c16-ea9b-130b2fb759ab"
Apr 25 00:00:14.628195 containerd[1974]: 2026-04-25 00:00:14.588 [INFO][5737] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="47634a8599af1eac2e1de64fbc38d40803a22e32f2699057edc0a4d8b0f8fe2b" iface="eth0" netns="/var/run/netns/cni-cd946d82-f8a1-8c16-ea9b-130b2fb759ab"
Apr 25 00:00:14.628195 containerd[1974]: 2026-04-25 00:00:14.588 [INFO][5737] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="47634a8599af1eac2e1de64fbc38d40803a22e32f2699057edc0a4d8b0f8fe2b"
Apr 25 00:00:14.628195 containerd[1974]: 2026-04-25 00:00:14.588 [INFO][5737] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="47634a8599af1eac2e1de64fbc38d40803a22e32f2699057edc0a4d8b0f8fe2b"
Apr 25 00:00:14.628195 containerd[1974]: 2026-04-25 00:00:14.615 [INFO][5744] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="47634a8599af1eac2e1de64fbc38d40803a22e32f2699057edc0a4d8b0f8fe2b" HandleID="k8s-pod-network.47634a8599af1eac2e1de64fbc38d40803a22e32f2699057edc0a4d8b0f8fe2b" Workload="ip--172--31--31--110-k8s-calico--apiserver--848b4486d7--brf5p-eth0"
Apr 25 00:00:14.628195 containerd[1974]: 2026-04-25 00:00:14.615 [INFO][5744] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock.
Apr 25 00:00:14.628195 containerd[1974]: 2026-04-25 00:00:14.615 [INFO][5744] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock.
Apr 25 00:00:14.628195 containerd[1974]: 2026-04-25 00:00:14.622 [WARNING][5744] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. Ignoring ContainerID="47634a8599af1eac2e1de64fbc38d40803a22e32f2699057edc0a4d8b0f8fe2b" HandleID="k8s-pod-network.47634a8599af1eac2e1de64fbc38d40803a22e32f2699057edc0a4d8b0f8fe2b" Workload="ip--172--31--31--110-k8s-calico--apiserver--848b4486d7--brf5p-eth0"
Apr 25 00:00:14.628195 containerd[1974]: 2026-04-25 00:00:14.622 [INFO][5744] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="47634a8599af1eac2e1de64fbc38d40803a22e32f2699057edc0a4d8b0f8fe2b" HandleID="k8s-pod-network.47634a8599af1eac2e1de64fbc38d40803a22e32f2699057edc0a4d8b0f8fe2b" Workload="ip--172--31--31--110-k8s-calico--apiserver--848b4486d7--brf5p-eth0"
Apr 25 00:00:14.628195 containerd[1974]: 2026-04-25 00:00:14.624 [INFO][5744] ipam/ipam_plugin.go 459: Released host-wide IPAM lock.
Apr 25 00:00:14.628195 containerd[1974]: 2026-04-25 00:00:14.626 [INFO][5737] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="47634a8599af1eac2e1de64fbc38d40803a22e32f2699057edc0a4d8b0f8fe2b"
Apr 25 00:00:14.632534 containerd[1974]: time="2026-04-25T00:00:14.628599979Z" level=info msg="TearDown network for sandbox \"47634a8599af1eac2e1de64fbc38d40803a22e32f2699057edc0a4d8b0f8fe2b\" successfully"
Apr 25 00:00:14.632534 containerd[1974]: time="2026-04-25T00:00:14.628633089Z" level=info msg="StopPodSandbox for \"47634a8599af1eac2e1de64fbc38d40803a22e32f2699057edc0a4d8b0f8fe2b\" returns successfully"
Apr 25 00:00:14.634244 containerd[1974]: time="2026-04-25T00:00:14.633819094Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-848b4486d7-brf5p,Uid:f1ad7ed5-b378-4647-9ebf-2546b6a4e4a2,Namespace:calico-system,Attempt:1,}"
Apr 25 00:00:14.635381 systemd[1]: run-netns-cni\x2dcd946d82\x2df8a1\x2d8c16\x2dea9b\x2d130b2fb759ab.mount: Deactivated successfully.
Apr 25 00:00:14.789262 systemd-networkd[1892]: cali5334212dd79: Link UP
Apr 25 00:00:14.791652 systemd-networkd[1892]: cali5334212dd79: Gained carrier
Apr 25 00:00:14.829328 containerd[1974]: 2026-04-25 00:00:14.699 [INFO][5752] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--31--110-k8s-calico--apiserver--848b4486d7--brf5p-eth0 calico-apiserver-848b4486d7- calico-system f1ad7ed5-b378-4647-9ebf-2546b6a4e4a2 1041 0 2026-04-24 23:59:34 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:848b4486d7 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ip-172-31-31-110 calico-apiserver-848b4486d7-brf5p eth0 calico-apiserver [] [] [kns.calico-system ksa.calico-system.calico-apiserver] cali5334212dd79 [] [] }} ContainerID="6b59521839c1ad39593c2a08eef3b3a1bdad7fd1335bcf7dda718eabea375f68" Namespace="calico-system" Pod="calico-apiserver-848b4486d7-brf5p" WorkloadEndpoint="ip--172--31--31--110-k8s-calico--apiserver--848b4486d7--brf5p-"
Apr 25 00:00:14.829328 containerd[1974]: 2026-04-25 00:00:14.699 [INFO][5752] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="6b59521839c1ad39593c2a08eef3b3a1bdad7fd1335bcf7dda718eabea375f68" Namespace="calico-system" Pod="calico-apiserver-848b4486d7-brf5p" WorkloadEndpoint="ip--172--31--31--110-k8s-calico--apiserver--848b4486d7--brf5p-eth0"
Apr 25 00:00:14.829328 containerd[1974]: 2026-04-25 00:00:14.732 [INFO][5763] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="6b59521839c1ad39593c2a08eef3b3a1bdad7fd1335bcf7dda718eabea375f68" HandleID="k8s-pod-network.6b59521839c1ad39593c2a08eef3b3a1bdad7fd1335bcf7dda718eabea375f68" Workload="ip--172--31--31--110-k8s-calico--apiserver--848b4486d7--brf5p-eth0"
Apr 25 00:00:14.829328 containerd[1974]: 2026-04-25 00:00:14.743 [INFO][5763] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="6b59521839c1ad39593c2a08eef3b3a1bdad7fd1335bcf7dda718eabea375f68" HandleID="k8s-pod-network.6b59521839c1ad39593c2a08eef3b3a1bdad7fd1335bcf7dda718eabea375f68" Workload="ip--172--31--31--110-k8s-calico--apiserver--848b4486d7--brf5p-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002ef870), Attrs:map[string]string{"namespace":"calico-system", "node":"ip-172-31-31-110", "pod":"calico-apiserver-848b4486d7-brf5p", "timestamp":"2026-04-25 00:00:14.73263843 +0000 UTC"}, Hostname:"ip-172-31-31-110", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc0002e7340)}
Apr 25 00:00:14.829328 containerd[1974]: 2026-04-25 00:00:14.743 [INFO][5763] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock.
Apr 25 00:00:14.829328 containerd[1974]: 2026-04-25 00:00:14.743 [INFO][5763] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock.
Apr 25 00:00:14.829328 containerd[1974]: 2026-04-25 00:00:14.743 [INFO][5763] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-31-110'
Apr 25 00:00:14.829328 containerd[1974]: 2026-04-25 00:00:14.746 [INFO][5763] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.6b59521839c1ad39593c2a08eef3b3a1bdad7fd1335bcf7dda718eabea375f68" host="ip-172-31-31-110"
Apr 25 00:00:14.829328 containerd[1974]: 2026-04-25 00:00:14.751 [INFO][5763] ipam/ipam.go 409: Looking up existing affinities for host host="ip-172-31-31-110"
Apr 25 00:00:14.829328 containerd[1974]: 2026-04-25 00:00:14.757 [INFO][5763] ipam/ipam.go 526: Trying affinity for 192.168.121.64/26 host="ip-172-31-31-110"
Apr 25 00:00:14.829328 containerd[1974]: 2026-04-25 00:00:14.760 [INFO][5763] ipam/ipam.go 160: Attempting to load block cidr=192.168.121.64/26 host="ip-172-31-31-110"
Apr 25 00:00:14.829328 containerd[1974]: 2026-04-25 00:00:14.762 [INFO][5763] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.121.64/26 host="ip-172-31-31-110"
Apr 25 00:00:14.829328 containerd[1974]: 2026-04-25 00:00:14.762 [INFO][5763] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.121.64/26 handle="k8s-pod-network.6b59521839c1ad39593c2a08eef3b3a1bdad7fd1335bcf7dda718eabea375f68" host="ip-172-31-31-110"
Apr 25 00:00:14.829328 containerd[1974]: 2026-04-25 00:00:14.764 [INFO][5763] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.6b59521839c1ad39593c2a08eef3b3a1bdad7fd1335bcf7dda718eabea375f68
Apr 25 00:00:14.829328 containerd[1974]: 2026-04-25 00:00:14.772 [INFO][5763] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.121.64/26 handle="k8s-pod-network.6b59521839c1ad39593c2a08eef3b3a1bdad7fd1335bcf7dda718eabea375f68" host="ip-172-31-31-110"
Apr 25 00:00:14.829328 containerd[1974]: 2026-04-25 00:00:14.782 [INFO][5763] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.121.72/26] block=192.168.121.64/26 handle="k8s-pod-network.6b59521839c1ad39593c2a08eef3b3a1bdad7fd1335bcf7dda718eabea375f68" host="ip-172-31-31-110"
Apr 25 00:00:14.829328 containerd[1974]: 2026-04-25 00:00:14.783 [INFO][5763] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.121.72/26] handle="k8s-pod-network.6b59521839c1ad39593c2a08eef3b3a1bdad7fd1335bcf7dda718eabea375f68" host="ip-172-31-31-110"
Apr 25 00:00:14.829328 containerd[1974]: 2026-04-25 00:00:14.783 [INFO][5763] ipam/ipam_plugin.go 459: Released host-wide IPAM lock.
Apr 25 00:00:14.829328 containerd[1974]: 2026-04-25 00:00:14.783 [INFO][5763] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.121.72/26] IPv6=[] ContainerID="6b59521839c1ad39593c2a08eef3b3a1bdad7fd1335bcf7dda718eabea375f68" HandleID="k8s-pod-network.6b59521839c1ad39593c2a08eef3b3a1bdad7fd1335bcf7dda718eabea375f68" Workload="ip--172--31--31--110-k8s-calico--apiserver--848b4486d7--brf5p-eth0"
Apr 25 00:00:14.831727 containerd[1974]: 2026-04-25 00:00:14.785 [INFO][5752] cni-plugin/k8s.go 418: Populated endpoint ContainerID="6b59521839c1ad39593c2a08eef3b3a1bdad7fd1335bcf7dda718eabea375f68" Namespace="calico-system" Pod="calico-apiserver-848b4486d7-brf5p" WorkloadEndpoint="ip--172--31--31--110-k8s-calico--apiserver--848b4486d7--brf5p-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--31--110-k8s-calico--apiserver--848b4486d7--brf5p-eth0", GenerateName:"calico-apiserver-848b4486d7-", Namespace:"calico-system", SelfLink:"", UID:"f1ad7ed5-b378-4647-9ebf-2546b6a4e4a2", ResourceVersion:"1041", Generation:0, CreationTimestamp:time.Date(2026, time.April, 24, 23, 59, 34, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"848b4486d7", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-31-110", ContainerID:"", Pod:"calico-apiserver-848b4486d7-brf5p", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.121.72/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali5334212dd79", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Apr 25 00:00:14.831727 containerd[1974]: 2026-04-25 00:00:14.786 [INFO][5752] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.121.72/32] ContainerID="6b59521839c1ad39593c2a08eef3b3a1bdad7fd1335bcf7dda718eabea375f68" Namespace="calico-system" Pod="calico-apiserver-848b4486d7-brf5p" WorkloadEndpoint="ip--172--31--31--110-k8s-calico--apiserver--848b4486d7--brf5p-eth0"
Apr 25 00:00:14.831727 containerd[1974]: 2026-04-25 00:00:14.786 [INFO][5752] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali5334212dd79 ContainerID="6b59521839c1ad39593c2a08eef3b3a1bdad7fd1335bcf7dda718eabea375f68" Namespace="calico-system" Pod="calico-apiserver-848b4486d7-brf5p" WorkloadEndpoint="ip--172--31--31--110-k8s-calico--apiserver--848b4486d7--brf5p-eth0"
Apr 25 00:00:14.831727 containerd[1974]: 2026-04-25 00:00:14.793 [INFO][5752] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="6b59521839c1ad39593c2a08eef3b3a1bdad7fd1335bcf7dda718eabea375f68" Namespace="calico-system" Pod="calico-apiserver-848b4486d7-brf5p" WorkloadEndpoint="ip--172--31--31--110-k8s-calico--apiserver--848b4486d7--brf5p-eth0"
Apr 25 00:00:14.831727 containerd[1974]: 2026-04-25 00:00:14.800 [INFO][5752] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="6b59521839c1ad39593c2a08eef3b3a1bdad7fd1335bcf7dda718eabea375f68" Namespace="calico-system" Pod="calico-apiserver-848b4486d7-brf5p" WorkloadEndpoint="ip--172--31--31--110-k8s-calico--apiserver--848b4486d7--brf5p-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--31--110-k8s-calico--apiserver--848b4486d7--brf5p-eth0", GenerateName:"calico-apiserver-848b4486d7-", Namespace:"calico-system", SelfLink:"", UID:"f1ad7ed5-b378-4647-9ebf-2546b6a4e4a2", ResourceVersion:"1041", Generation:0, CreationTimestamp:time.Date(2026, time.April, 24, 23, 59, 34, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"848b4486d7", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-31-110", ContainerID:"6b59521839c1ad39593c2a08eef3b3a1bdad7fd1335bcf7dda718eabea375f68", Pod:"calico-apiserver-848b4486d7-brf5p", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.121.72/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali5334212dd79", MAC:"8e:e3:08:6b:dd:ee", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Apr 25 00:00:14.831727 containerd[1974]: 2026-04-25 00:00:14.823 [INFO][5752] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="6b59521839c1ad39593c2a08eef3b3a1bdad7fd1335bcf7dda718eabea375f68" Namespace="calico-system" Pod="calico-apiserver-848b4486d7-brf5p" WorkloadEndpoint="ip--172--31--31--110-k8s-calico--apiserver--848b4486d7--brf5p-eth0"
Apr 25 00:00:14.875809 containerd[1974]: time="2026-04-25T00:00:14.875089185Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Apr 25 00:00:14.876655 containerd[1974]: time="2026-04-25T00:00:14.876297265Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Apr 25 00:00:14.876655 containerd[1974]: time="2026-04-25T00:00:14.876388934Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Apr 25 00:00:14.876655 containerd[1974]: time="2026-04-25T00:00:14.876515460Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Apr 25 00:00:14.912346 systemd[1]: Started cri-containerd-6b59521839c1ad39593c2a08eef3b3a1bdad7fd1335bcf7dda718eabea375f68.scope - libcontainer container 6b59521839c1ad39593c2a08eef3b3a1bdad7fd1335bcf7dda718eabea375f68.
Apr 25 00:00:14.944985 systemd-networkd[1892]: cali86641ffeb98: Gained IPv6LL
Apr 25 00:00:15.011106 containerd[1974]: time="2026-04-25T00:00:15.010371929Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-848b4486d7-brf5p,Uid:f1ad7ed5-b378-4647-9ebf-2546b6a4e4a2,Namespace:calico-system,Attempt:1,} returns sandbox id \"6b59521839c1ad39593c2a08eef3b3a1bdad7fd1335bcf7dda718eabea375f68\""
Apr 25 00:00:15.010486 systemd-networkd[1892]: cali7764f2db475: Gained IPv6LL
Apr 25 00:00:15.087421 sshd[5682]: Accepted publickey for core from 4.175.71.9 port 38506 ssh2: RSA SHA256:5HhJ2X4iOQfF5HWKIEVpWTPXYo3rjlnxoO1NrD+aEDg
Apr 25 00:00:15.091638 sshd[5682]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Apr 25 00:00:15.097674 systemd-logind[1954]: New session 8 of user core.
Apr 25 00:00:15.101990 systemd[1]: Started session-8.scope - Session 8 of User core.
Apr 25 00:00:16.032984 systemd-networkd[1892]: cali5334212dd79: Gained IPv6LL
Apr 25 00:00:16.520554 sshd[5682]: pam_unix(sshd:session): session closed for user core
Apr 25 00:00:16.524548 systemd[1]: sshd@7-172.31.31.110:22-4.175.71.9:38506.service: Deactivated successfully.
Apr 25 00:00:16.527542 systemd[1]: session-8.scope: Deactivated successfully.
Apr 25 00:00:16.529460 systemd-logind[1954]: Session 8 logged out. Waiting for processes to exit.
Apr 25 00:00:16.531777 systemd-logind[1954]: Removed session 8.
Apr 25 00:00:17.525949 containerd[1974]: time="2026-04-25T00:00:17.525549582Z" level=info msg="StopPodSandbox for \"a57e8fbabf45599afcad53283ca073fec4143ba9c8fd2c1081470b3c251d3dbe\""
Apr 25 00:00:17.667384 containerd[1974]: 2026-04-25 00:00:17.598 [WARNING][5875] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="a57e8fbabf45599afcad53283ca073fec4143ba9c8fd2c1081470b3c251d3dbe" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--31--110-k8s-goldmane--9f7667bb8--vg777-eth0", GenerateName:"goldmane-9f7667bb8-", Namespace:"calico-system", SelfLink:"", UID:"b17085d6-14a5-4b5e-81f9-314340899d1e", ResourceVersion:"943", Generation:0, CreationTimestamp:time.Date(2026, time.April, 24, 23, 59, 34, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"9f7667bb8", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-31-110", ContainerID:"8e6bb12e5b853269250e39c8c0eabd6aff9fbce20a201dd5ec424e48129a5c76", Pod:"goldmane-9f7667bb8-vg777", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.121.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali0ccb425bc32", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Apr 25 00:00:17.667384 containerd[1974]: 2026-04-25 00:00:17.598 [INFO][5875] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="a57e8fbabf45599afcad53283ca073fec4143ba9c8fd2c1081470b3c251d3dbe"
Apr 25 00:00:17.667384 containerd[1974]: 2026-04-25 00:00:17.598 [INFO][5875] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="a57e8fbabf45599afcad53283ca073fec4143ba9c8fd2c1081470b3c251d3dbe" iface="eth0" netns=""
Apr 25 00:00:17.667384 containerd[1974]: 2026-04-25 00:00:17.598 [INFO][5875] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="a57e8fbabf45599afcad53283ca073fec4143ba9c8fd2c1081470b3c251d3dbe"
Apr 25 00:00:17.667384 containerd[1974]: 2026-04-25 00:00:17.598 [INFO][5875] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="a57e8fbabf45599afcad53283ca073fec4143ba9c8fd2c1081470b3c251d3dbe"
Apr 25 00:00:17.667384 containerd[1974]: 2026-04-25 00:00:17.654 [INFO][5884] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="a57e8fbabf45599afcad53283ca073fec4143ba9c8fd2c1081470b3c251d3dbe" HandleID="k8s-pod-network.a57e8fbabf45599afcad53283ca073fec4143ba9c8fd2c1081470b3c251d3dbe" Workload="ip--172--31--31--110-k8s-goldmane--9f7667bb8--vg777-eth0"
Apr 25 00:00:17.667384 containerd[1974]: 2026-04-25 00:00:17.654 [INFO][5884] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock.
Apr 25 00:00:17.667384 containerd[1974]: 2026-04-25 00:00:17.654 [INFO][5884] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock.
Apr 25 00:00:17.667384 containerd[1974]: 2026-04-25 00:00:17.661 [WARNING][5884] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. Ignoring ContainerID="a57e8fbabf45599afcad53283ca073fec4143ba9c8fd2c1081470b3c251d3dbe" HandleID="k8s-pod-network.a57e8fbabf45599afcad53283ca073fec4143ba9c8fd2c1081470b3c251d3dbe" Workload="ip--172--31--31--110-k8s-goldmane--9f7667bb8--vg777-eth0"
Apr 25 00:00:17.667384 containerd[1974]: 2026-04-25 00:00:17.661 [INFO][5884] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="a57e8fbabf45599afcad53283ca073fec4143ba9c8fd2c1081470b3c251d3dbe" HandleID="k8s-pod-network.a57e8fbabf45599afcad53283ca073fec4143ba9c8fd2c1081470b3c251d3dbe" Workload="ip--172--31--31--110-k8s-goldmane--9f7667bb8--vg777-eth0"
Apr 25 00:00:17.667384 containerd[1974]: 2026-04-25 00:00:17.663 [INFO][5884] ipam/ipam_plugin.go 459: Released host-wide IPAM lock.
Apr 25 00:00:17.667384 containerd[1974]: 2026-04-25 00:00:17.665 [INFO][5875] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="a57e8fbabf45599afcad53283ca073fec4143ba9c8fd2c1081470b3c251d3dbe"
Apr 25 00:00:17.667384 containerd[1974]: time="2026-04-25T00:00:17.667258167Z" level=info msg="TearDown network for sandbox \"a57e8fbabf45599afcad53283ca073fec4143ba9c8fd2c1081470b3c251d3dbe\" successfully"
Apr 25 00:00:17.667384 containerd[1974]: time="2026-04-25T00:00:17.667301178Z" level=info msg="StopPodSandbox for \"a57e8fbabf45599afcad53283ca073fec4143ba9c8fd2c1081470b3c251d3dbe\" returns successfully"
Apr 25 00:00:17.697348 containerd[1974]: time="2026-04-25T00:00:17.697283283Z" level=info msg="RemovePodSandbox for \"a57e8fbabf45599afcad53283ca073fec4143ba9c8fd2c1081470b3c251d3dbe\""
Apr 25 00:00:17.697348 containerd[1974]: time="2026-04-25T00:00:17.697336123Z" level=info msg="Forcibly stopping sandbox \"a57e8fbabf45599afcad53283ca073fec4143ba9c8fd2c1081470b3c251d3dbe\""
Apr 25 00:00:17.785918 containerd[1974]: 2026-04-25 00:00:17.742 [WARNING][5898] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="a57e8fbabf45599afcad53283ca073fec4143ba9c8fd2c1081470b3c251d3dbe" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--31--110-k8s-goldmane--9f7667bb8--vg777-eth0", GenerateName:"goldmane-9f7667bb8-", Namespace:"calico-system", SelfLink:"", UID:"b17085d6-14a5-4b5e-81f9-314340899d1e", ResourceVersion:"943", Generation:0, CreationTimestamp:time.Date(2026, time.April, 24, 23, 59, 34, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"9f7667bb8", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-31-110", ContainerID:"8e6bb12e5b853269250e39c8c0eabd6aff9fbce20a201dd5ec424e48129a5c76", Pod:"goldmane-9f7667bb8-vg777", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.121.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali0ccb425bc32", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Apr 25 00:00:17.785918 containerd[1974]: 2026-04-25 00:00:17.742 [INFO][5898] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="a57e8fbabf45599afcad53283ca073fec4143ba9c8fd2c1081470b3c251d3dbe"
Apr 25 00:00:17.785918 containerd[1974]: 2026-04-25 00:00:17.742 [INFO][5898] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="a57e8fbabf45599afcad53283ca073fec4143ba9c8fd2c1081470b3c251d3dbe" iface="eth0" netns=""
Apr 25 00:00:17.785918 containerd[1974]: 2026-04-25 00:00:17.742 [INFO][5898] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="a57e8fbabf45599afcad53283ca073fec4143ba9c8fd2c1081470b3c251d3dbe"
Apr 25 00:00:17.785918 containerd[1974]: 2026-04-25 00:00:17.742 [INFO][5898] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="a57e8fbabf45599afcad53283ca073fec4143ba9c8fd2c1081470b3c251d3dbe"
Apr 25 00:00:17.785918 containerd[1974]: 2026-04-25 00:00:17.771 [INFO][5905] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="a57e8fbabf45599afcad53283ca073fec4143ba9c8fd2c1081470b3c251d3dbe" HandleID="k8s-pod-network.a57e8fbabf45599afcad53283ca073fec4143ba9c8fd2c1081470b3c251d3dbe" Workload="ip--172--31--31--110-k8s-goldmane--9f7667bb8--vg777-eth0"
Apr 25 00:00:17.785918 containerd[1974]: 2026-04-25 00:00:17.771 [INFO][5905] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock.
Apr 25 00:00:17.785918 containerd[1974]: 2026-04-25 00:00:17.771 [INFO][5905] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock.
Apr 25 00:00:17.785918 containerd[1974]: 2026-04-25 00:00:17.779 [WARNING][5905] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. Ignoring ContainerID="a57e8fbabf45599afcad53283ca073fec4143ba9c8fd2c1081470b3c251d3dbe" HandleID="k8s-pod-network.a57e8fbabf45599afcad53283ca073fec4143ba9c8fd2c1081470b3c251d3dbe" Workload="ip--172--31--31--110-k8s-goldmane--9f7667bb8--vg777-eth0"
Apr 25 00:00:17.785918 containerd[1974]: 2026-04-25 00:00:17.779 [INFO][5905] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="a57e8fbabf45599afcad53283ca073fec4143ba9c8fd2c1081470b3c251d3dbe" HandleID="k8s-pod-network.a57e8fbabf45599afcad53283ca073fec4143ba9c8fd2c1081470b3c251d3dbe" Workload="ip--172--31--31--110-k8s-goldmane--9f7667bb8--vg777-eth0"
Apr 25 00:00:17.785918 containerd[1974]: 2026-04-25 00:00:17.781 [INFO][5905] ipam/ipam_plugin.go 459: Released host-wide IPAM lock.
Apr 25 00:00:17.785918 containerd[1974]: 2026-04-25 00:00:17.783 [INFO][5898] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="a57e8fbabf45599afcad53283ca073fec4143ba9c8fd2c1081470b3c251d3dbe"
Apr 25 00:00:17.785918 containerd[1974]: time="2026-04-25T00:00:17.785680662Z" level=info msg="TearDown network for sandbox \"a57e8fbabf45599afcad53283ca073fec4143ba9c8fd2c1081470b3c251d3dbe\" successfully"
Apr 25 00:00:17.809649 containerd[1974]: time="2026-04-25T00:00:17.809581247Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"a57e8fbabf45599afcad53283ca073fec4143ba9c8fd2c1081470b3c251d3dbe\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Apr 25 00:00:17.809843 containerd[1974]: time="2026-04-25T00:00:17.809683384Z" level=info msg="RemovePodSandbox \"a57e8fbabf45599afcad53283ca073fec4143ba9c8fd2c1081470b3c251d3dbe\" returns successfully"
Apr 25 00:00:17.810393 containerd[1974]: time="2026-04-25T00:00:17.810352494Z" level=info msg="StopPodSandbox for \"47634a8599af1eac2e1de64fbc38d40803a22e32f2699057edc0a4d8b0f8fe2b\""
Apr 25 00:00:17.926583 containerd[1974]: 2026-04-25 00:00:17.856 [WARNING][5920] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="47634a8599af1eac2e1de64fbc38d40803a22e32f2699057edc0a4d8b0f8fe2b" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--31--110-k8s-calico--apiserver--848b4486d7--brf5p-eth0", GenerateName:"calico-apiserver-848b4486d7-", Namespace:"calico-system", SelfLink:"", UID:"f1ad7ed5-b378-4647-9ebf-2546b6a4e4a2", ResourceVersion:"1044", Generation:0, CreationTimestamp:time.Date(2026, time.April, 24, 23, 59, 34, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"848b4486d7", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-31-110", ContainerID:"6b59521839c1ad39593c2a08eef3b3a1bdad7fd1335bcf7dda718eabea375f68", Pod:"calico-apiserver-848b4486d7-brf5p", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.121.72/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali5334212dd79", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Apr 25 00:00:17.926583 containerd[1974]: 2026-04-25 00:00:17.856 [INFO][5920] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="47634a8599af1eac2e1de64fbc38d40803a22e32f2699057edc0a4d8b0f8fe2b"
Apr 25 00:00:17.926583 containerd[1974]: 2026-04-25 00:00:17.856 [INFO][5920] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="47634a8599af1eac2e1de64fbc38d40803a22e32f2699057edc0a4d8b0f8fe2b" iface="eth0" netns=""
Apr 25 00:00:17.926583 containerd[1974]: 2026-04-25 00:00:17.856 [INFO][5920] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="47634a8599af1eac2e1de64fbc38d40803a22e32f2699057edc0a4d8b0f8fe2b"
Apr 25 00:00:17.926583 containerd[1974]: 2026-04-25 00:00:17.856 [INFO][5920] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="47634a8599af1eac2e1de64fbc38d40803a22e32f2699057edc0a4d8b0f8fe2b"
Apr 25 00:00:17.926583 containerd[1974]: 2026-04-25 00:00:17.913 [INFO][5928] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="47634a8599af1eac2e1de64fbc38d40803a22e32f2699057edc0a4d8b0f8fe2b" HandleID="k8s-pod-network.47634a8599af1eac2e1de64fbc38d40803a22e32f2699057edc0a4d8b0f8fe2b" Workload="ip--172--31--31--110-k8s-calico--apiserver--848b4486d7--brf5p-eth0"
Apr 25 00:00:17.926583 containerd[1974]: 2026-04-25 00:00:17.913 [INFO][5928] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock.
Apr 25 00:00:17.926583 containerd[1974]: 2026-04-25 00:00:17.913 [INFO][5928] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock.
Apr 25 00:00:17.926583 containerd[1974]: 2026-04-25 00:00:17.920 [WARNING][5928] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. Ignoring ContainerID="47634a8599af1eac2e1de64fbc38d40803a22e32f2699057edc0a4d8b0f8fe2b" HandleID="k8s-pod-network.47634a8599af1eac2e1de64fbc38d40803a22e32f2699057edc0a4d8b0f8fe2b" Workload="ip--172--31--31--110-k8s-calico--apiserver--848b4486d7--brf5p-eth0"
Apr 25 00:00:17.926583 containerd[1974]: 2026-04-25 00:00:17.920 [INFO][5928] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="47634a8599af1eac2e1de64fbc38d40803a22e32f2699057edc0a4d8b0f8fe2b" HandleID="k8s-pod-network.47634a8599af1eac2e1de64fbc38d40803a22e32f2699057edc0a4d8b0f8fe2b" Workload="ip--172--31--31--110-k8s-calico--apiserver--848b4486d7--brf5p-eth0"
Apr 25 00:00:17.926583 containerd[1974]: 2026-04-25 00:00:17.922 [INFO][5928] ipam/ipam_plugin.go 459: Released host-wide IPAM lock.
Apr 25 00:00:17.926583 containerd[1974]: 2026-04-25 00:00:17.924 [INFO][5920] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="47634a8599af1eac2e1de64fbc38d40803a22e32f2699057edc0a4d8b0f8fe2b"
Apr 25 00:00:17.927945 containerd[1974]: time="2026-04-25T00:00:17.926633873Z" level=info msg="TearDown network for sandbox \"47634a8599af1eac2e1de64fbc38d40803a22e32f2699057edc0a4d8b0f8fe2b\" successfully"
Apr 25 00:00:17.927945 containerd[1974]: time="2026-04-25T00:00:17.926663700Z" level=info msg="StopPodSandbox for \"47634a8599af1eac2e1de64fbc38d40803a22e32f2699057edc0a4d8b0f8fe2b\" returns successfully"
Apr 25 00:00:17.927945 containerd[1974]: time="2026-04-25T00:00:17.927263137Z" level=info msg="RemovePodSandbox for \"47634a8599af1eac2e1de64fbc38d40803a22e32f2699057edc0a4d8b0f8fe2b\""
Apr 25 00:00:17.927945 containerd[1974]: time="2026-04-25T00:00:17.927296506Z" level=info msg="Forcibly stopping sandbox \"47634a8599af1eac2e1de64fbc38d40803a22e32f2699057edc0a4d8b0f8fe2b\""
Apr 25 00:00:18.009185 containerd[1974]: 2026-04-25 00:00:17.967 [WARNING][5946] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="47634a8599af1eac2e1de64fbc38d40803a22e32f2699057edc0a4d8b0f8fe2b" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--31--110-k8s-calico--apiserver--848b4486d7--brf5p-eth0", GenerateName:"calico-apiserver-848b4486d7-", Namespace:"calico-system", SelfLink:"", UID:"f1ad7ed5-b378-4647-9ebf-2546b6a4e4a2", ResourceVersion:"1044", Generation:0, CreationTimestamp:time.Date(2026, time.April, 24, 23, 59, 34, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"848b4486d7", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-31-110", ContainerID:"6b59521839c1ad39593c2a08eef3b3a1bdad7fd1335bcf7dda718eabea375f68", Pod:"calico-apiserver-848b4486d7-brf5p", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.121.72/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali5334212dd79", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Apr 25 00:00:18.009185 containerd[1974]: 2026-04-25 00:00:17.968 [INFO][5946] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="47634a8599af1eac2e1de64fbc38d40803a22e32f2699057edc0a4d8b0f8fe2b"
Apr 25 00:00:18.009185 containerd[1974]: 2026-04-25 00:00:17.968 [INFO][5946] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns
name, ignoring. ContainerID="47634a8599af1eac2e1de64fbc38d40803a22e32f2699057edc0a4d8b0f8fe2b" iface="eth0" netns="" Apr 25 00:00:18.009185 containerd[1974]: 2026-04-25 00:00:17.968 [INFO][5946] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="47634a8599af1eac2e1de64fbc38d40803a22e32f2699057edc0a4d8b0f8fe2b" Apr 25 00:00:18.009185 containerd[1974]: 2026-04-25 00:00:17.968 [INFO][5946] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="47634a8599af1eac2e1de64fbc38d40803a22e32f2699057edc0a4d8b0f8fe2b" Apr 25 00:00:18.009185 containerd[1974]: 2026-04-25 00:00:17.996 [INFO][5953] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="47634a8599af1eac2e1de64fbc38d40803a22e32f2699057edc0a4d8b0f8fe2b" HandleID="k8s-pod-network.47634a8599af1eac2e1de64fbc38d40803a22e32f2699057edc0a4d8b0f8fe2b" Workload="ip--172--31--31--110-k8s-calico--apiserver--848b4486d7--brf5p-eth0" Apr 25 00:00:18.009185 containerd[1974]: 2026-04-25 00:00:17.996 [INFO][5953] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 25 00:00:18.009185 containerd[1974]: 2026-04-25 00:00:17.996 [INFO][5953] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 25 00:00:18.009185 containerd[1974]: 2026-04-25 00:00:18.003 [WARNING][5953] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="47634a8599af1eac2e1de64fbc38d40803a22e32f2699057edc0a4d8b0f8fe2b" HandleID="k8s-pod-network.47634a8599af1eac2e1de64fbc38d40803a22e32f2699057edc0a4d8b0f8fe2b" Workload="ip--172--31--31--110-k8s-calico--apiserver--848b4486d7--brf5p-eth0" Apr 25 00:00:18.009185 containerd[1974]: 2026-04-25 00:00:18.003 [INFO][5953] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="47634a8599af1eac2e1de64fbc38d40803a22e32f2699057edc0a4d8b0f8fe2b" HandleID="k8s-pod-network.47634a8599af1eac2e1de64fbc38d40803a22e32f2699057edc0a4d8b0f8fe2b" Workload="ip--172--31--31--110-k8s-calico--apiserver--848b4486d7--brf5p-eth0" Apr 25 00:00:18.009185 containerd[1974]: 2026-04-25 00:00:18.004 [INFO][5953] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 25 00:00:18.009185 containerd[1974]: 2026-04-25 00:00:18.006 [INFO][5946] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="47634a8599af1eac2e1de64fbc38d40803a22e32f2699057edc0a4d8b0f8fe2b" Apr 25 00:00:18.010605 containerd[1974]: time="2026-04-25T00:00:18.009228771Z" level=info msg="TearDown network for sandbox \"47634a8599af1eac2e1de64fbc38d40803a22e32f2699057edc0a4d8b0f8fe2b\" successfully" Apr 25 00:00:18.015558 containerd[1974]: time="2026-04-25T00:00:18.015502535Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"47634a8599af1eac2e1de64fbc38d40803a22e32f2699057edc0a4d8b0f8fe2b\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Apr 25 00:00:18.015710 containerd[1974]: time="2026-04-25T00:00:18.015579426Z" level=info msg="RemovePodSandbox \"47634a8599af1eac2e1de64fbc38d40803a22e32f2699057edc0a4d8b0f8fe2b\" returns successfully" Apr 25 00:00:18.016200 containerd[1974]: time="2026-04-25T00:00:18.016168877Z" level=info msg="StopPodSandbox for \"6c23267bc67df8fde82032bbee6b76532d9447c49eff630a05a6bc648e99c3a2\"" Apr 25 00:00:18.106178 containerd[1974]: 2026-04-25 00:00:18.058 [WARNING][5967] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="6c23267bc67df8fde82032bbee6b76532d9447c49eff630a05a6bc648e99c3a2" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--31--110-k8s-calico--kube--controllers--7564df694c--4d42l-eth0", GenerateName:"calico-kube-controllers-7564df694c-", Namespace:"calico-system", SelfLink:"", UID:"e3d6ca38-0ac1-41b2-beef-170f1942102f", ResourceVersion:"1034", Generation:0, CreationTimestamp:time.Date(2026, time.April, 24, 23, 59, 35, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"7564df694c", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-31-110", ContainerID:"e7a0549722c7a5ce69c5de57b025c65b982c9d22191f1d57e8cffdcffbc55734", Pod:"calico-kube-controllers-7564df694c-4d42l", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.121.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", 
IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali7764f2db475", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 25 00:00:18.106178 containerd[1974]: 2026-04-25 00:00:18.058 [INFO][5967] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="6c23267bc67df8fde82032bbee6b76532d9447c49eff630a05a6bc648e99c3a2" Apr 25 00:00:18.106178 containerd[1974]: 2026-04-25 00:00:18.058 [INFO][5967] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="6c23267bc67df8fde82032bbee6b76532d9447c49eff630a05a6bc648e99c3a2" iface="eth0" netns="" Apr 25 00:00:18.106178 containerd[1974]: 2026-04-25 00:00:18.058 [INFO][5967] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="6c23267bc67df8fde82032bbee6b76532d9447c49eff630a05a6bc648e99c3a2" Apr 25 00:00:18.106178 containerd[1974]: 2026-04-25 00:00:18.058 [INFO][5967] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="6c23267bc67df8fde82032bbee6b76532d9447c49eff630a05a6bc648e99c3a2" Apr 25 00:00:18.106178 containerd[1974]: 2026-04-25 00:00:18.087 [INFO][5974] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="6c23267bc67df8fde82032bbee6b76532d9447c49eff630a05a6bc648e99c3a2" HandleID="k8s-pod-network.6c23267bc67df8fde82032bbee6b76532d9447c49eff630a05a6bc648e99c3a2" Workload="ip--172--31--31--110-k8s-calico--kube--controllers--7564df694c--4d42l-eth0" Apr 25 00:00:18.106178 containerd[1974]: 2026-04-25 00:00:18.087 [INFO][5974] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 25 00:00:18.106178 containerd[1974]: 2026-04-25 00:00:18.087 [INFO][5974] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 25 00:00:18.106178 containerd[1974]: 2026-04-25 00:00:18.097 [WARNING][5974] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="6c23267bc67df8fde82032bbee6b76532d9447c49eff630a05a6bc648e99c3a2" HandleID="k8s-pod-network.6c23267bc67df8fde82032bbee6b76532d9447c49eff630a05a6bc648e99c3a2" Workload="ip--172--31--31--110-k8s-calico--kube--controllers--7564df694c--4d42l-eth0" Apr 25 00:00:18.106178 containerd[1974]: 2026-04-25 00:00:18.097 [INFO][5974] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="6c23267bc67df8fde82032bbee6b76532d9447c49eff630a05a6bc648e99c3a2" HandleID="k8s-pod-network.6c23267bc67df8fde82032bbee6b76532d9447c49eff630a05a6bc648e99c3a2" Workload="ip--172--31--31--110-k8s-calico--kube--controllers--7564df694c--4d42l-eth0" Apr 25 00:00:18.106178 containerd[1974]: 2026-04-25 00:00:18.100 [INFO][5974] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 25 00:00:18.106178 containerd[1974]: 2026-04-25 00:00:18.102 [INFO][5967] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="6c23267bc67df8fde82032bbee6b76532d9447c49eff630a05a6bc648e99c3a2" Apr 25 00:00:18.106178 containerd[1974]: time="2026-04-25T00:00:18.106052924Z" level=info msg="TearDown network for sandbox \"6c23267bc67df8fde82032bbee6b76532d9447c49eff630a05a6bc648e99c3a2\" successfully" Apr 25 00:00:18.106178 containerd[1974]: time="2026-04-25T00:00:18.106096197Z" level=info msg="StopPodSandbox for \"6c23267bc67df8fde82032bbee6b76532d9447c49eff630a05a6bc648e99c3a2\" returns successfully" Apr 25 00:00:18.107135 containerd[1974]: time="2026-04-25T00:00:18.106955267Z" level=info msg="RemovePodSandbox for \"6c23267bc67df8fde82032bbee6b76532d9447c49eff630a05a6bc648e99c3a2\"" Apr 25 00:00:18.107874 containerd[1974]: time="2026-04-25T00:00:18.107386077Z" level=info msg="Forcibly stopping sandbox \"6c23267bc67df8fde82032bbee6b76532d9447c49eff630a05a6bc648e99c3a2\"" Apr 25 00:00:18.194895 containerd[1974]: 2026-04-25 00:00:18.153 [WARNING][5989] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="6c23267bc67df8fde82032bbee6b76532d9447c49eff630a05a6bc648e99c3a2" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--31--110-k8s-calico--kube--controllers--7564df694c--4d42l-eth0", GenerateName:"calico-kube-controllers-7564df694c-", Namespace:"calico-system", SelfLink:"", UID:"e3d6ca38-0ac1-41b2-beef-170f1942102f", ResourceVersion:"1034", Generation:0, CreationTimestamp:time.Date(2026, time.April, 24, 23, 59, 35, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"7564df694c", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-31-110", ContainerID:"e7a0549722c7a5ce69c5de57b025c65b982c9d22191f1d57e8cffdcffbc55734", Pod:"calico-kube-controllers-7564df694c-4d42l", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.121.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali7764f2db475", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 25 00:00:18.194895 containerd[1974]: 2026-04-25 00:00:18.153 [INFO][5989] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="6c23267bc67df8fde82032bbee6b76532d9447c49eff630a05a6bc648e99c3a2" Apr 25 00:00:18.194895 containerd[1974]: 2026-04-25 00:00:18.153 [INFO][5989] cni-plugin/dataplane_linux.go 555: 
CleanUpNamespace called with no netns name, ignoring. ContainerID="6c23267bc67df8fde82032bbee6b76532d9447c49eff630a05a6bc648e99c3a2" iface="eth0" netns="" Apr 25 00:00:18.194895 containerd[1974]: 2026-04-25 00:00:18.153 [INFO][5989] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="6c23267bc67df8fde82032bbee6b76532d9447c49eff630a05a6bc648e99c3a2" Apr 25 00:00:18.194895 containerd[1974]: 2026-04-25 00:00:18.153 [INFO][5989] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="6c23267bc67df8fde82032bbee6b76532d9447c49eff630a05a6bc648e99c3a2" Apr 25 00:00:18.194895 containerd[1974]: 2026-04-25 00:00:18.180 [INFO][5997] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="6c23267bc67df8fde82032bbee6b76532d9447c49eff630a05a6bc648e99c3a2" HandleID="k8s-pod-network.6c23267bc67df8fde82032bbee6b76532d9447c49eff630a05a6bc648e99c3a2" Workload="ip--172--31--31--110-k8s-calico--kube--controllers--7564df694c--4d42l-eth0" Apr 25 00:00:18.194895 containerd[1974]: 2026-04-25 00:00:18.180 [INFO][5997] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 25 00:00:18.194895 containerd[1974]: 2026-04-25 00:00:18.180 [INFO][5997] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 25 00:00:18.194895 containerd[1974]: 2026-04-25 00:00:18.188 [WARNING][5997] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="6c23267bc67df8fde82032bbee6b76532d9447c49eff630a05a6bc648e99c3a2" HandleID="k8s-pod-network.6c23267bc67df8fde82032bbee6b76532d9447c49eff630a05a6bc648e99c3a2" Workload="ip--172--31--31--110-k8s-calico--kube--controllers--7564df694c--4d42l-eth0" Apr 25 00:00:18.194895 containerd[1974]: 2026-04-25 00:00:18.188 [INFO][5997] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="6c23267bc67df8fde82032bbee6b76532d9447c49eff630a05a6bc648e99c3a2" HandleID="k8s-pod-network.6c23267bc67df8fde82032bbee6b76532d9447c49eff630a05a6bc648e99c3a2" Workload="ip--172--31--31--110-k8s-calico--kube--controllers--7564df694c--4d42l-eth0" Apr 25 00:00:18.194895 containerd[1974]: 2026-04-25 00:00:18.190 [INFO][5997] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 25 00:00:18.194895 containerd[1974]: 2026-04-25 00:00:18.192 [INFO][5989] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="6c23267bc67df8fde82032bbee6b76532d9447c49eff630a05a6bc648e99c3a2" Apr 25 00:00:18.194895 containerd[1974]: time="2026-04-25T00:00:18.194646864Z" level=info msg="TearDown network for sandbox \"6c23267bc67df8fde82032bbee6b76532d9447c49eff630a05a6bc648e99c3a2\" successfully" Apr 25 00:00:18.208595 containerd[1974]: time="2026-04-25T00:00:18.208474556Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"6c23267bc67df8fde82032bbee6b76532d9447c49eff630a05a6bc648e99c3a2\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Apr 25 00:00:18.217101 containerd[1974]: time="2026-04-25T00:00:18.217048995Z" level=info msg="RemovePodSandbox \"6c23267bc67df8fde82032bbee6b76532d9447c49eff630a05a6bc648e99c3a2\" returns successfully" Apr 25 00:00:18.217677 containerd[1974]: time="2026-04-25T00:00:18.217649104Z" level=info msg="StopPodSandbox for \"bebeb009b41842948801598b7bcb38ad25d4878ba127c98bfdec429caa35f703\"" Apr 25 00:00:18.306667 containerd[1974]: 2026-04-25 00:00:18.258 [WARNING][6011] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="bebeb009b41842948801598b7bcb38ad25d4878ba127c98bfdec429caa35f703" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--31--110-k8s-coredns--7d764666f9--q5wxg-eth0", GenerateName:"coredns-7d764666f9-", Namespace:"kube-system", SelfLink:"", UID:"8259ae9f-09c1-45c5-a26a-9fb19e805b35", ResourceVersion:"984", Generation:0, CreationTimestamp:time.Date(2026, time.April, 24, 23, 59, 23, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7d764666f9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-31-110", ContainerID:"595666868a10545027a0ef04ca20123530e33581d543e85ef4f4821b2478f889", Pod:"coredns-7d764666f9-q5wxg", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.121.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calib407ef0a420", MAC:"", 
Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 25 00:00:18.306667 containerd[1974]: 2026-04-25 00:00:18.258 [INFO][6011] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="bebeb009b41842948801598b7bcb38ad25d4878ba127c98bfdec429caa35f703" Apr 25 00:00:18.306667 containerd[1974]: 2026-04-25 00:00:18.258 [INFO][6011] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="bebeb009b41842948801598b7bcb38ad25d4878ba127c98bfdec429caa35f703" iface="eth0" netns="" Apr 25 00:00:18.306667 containerd[1974]: 2026-04-25 00:00:18.258 [INFO][6011] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="bebeb009b41842948801598b7bcb38ad25d4878ba127c98bfdec429caa35f703" Apr 25 00:00:18.306667 containerd[1974]: 2026-04-25 00:00:18.258 [INFO][6011] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="bebeb009b41842948801598b7bcb38ad25d4878ba127c98bfdec429caa35f703" Apr 25 00:00:18.306667 containerd[1974]: 2026-04-25 00:00:18.293 [INFO][6018] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="bebeb009b41842948801598b7bcb38ad25d4878ba127c98bfdec429caa35f703" HandleID="k8s-pod-network.bebeb009b41842948801598b7bcb38ad25d4878ba127c98bfdec429caa35f703" Workload="ip--172--31--31--110-k8s-coredns--7d764666f9--q5wxg-eth0" Apr 25 00:00:18.306667 containerd[1974]: 2026-04-25 00:00:18.293 [INFO][6018] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 25 00:00:18.306667 containerd[1974]: 2026-04-25 00:00:18.294 [INFO][6018] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 25 00:00:18.306667 containerd[1974]: 2026-04-25 00:00:18.300 [WARNING][6018] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="bebeb009b41842948801598b7bcb38ad25d4878ba127c98bfdec429caa35f703" HandleID="k8s-pod-network.bebeb009b41842948801598b7bcb38ad25d4878ba127c98bfdec429caa35f703" Workload="ip--172--31--31--110-k8s-coredns--7d764666f9--q5wxg-eth0" Apr 25 00:00:18.306667 containerd[1974]: 2026-04-25 00:00:18.300 [INFO][6018] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="bebeb009b41842948801598b7bcb38ad25d4878ba127c98bfdec429caa35f703" HandleID="k8s-pod-network.bebeb009b41842948801598b7bcb38ad25d4878ba127c98bfdec429caa35f703" Workload="ip--172--31--31--110-k8s-coredns--7d764666f9--q5wxg-eth0" Apr 25 00:00:18.306667 containerd[1974]: 2026-04-25 00:00:18.302 [INFO][6018] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 25 00:00:18.306667 containerd[1974]: 2026-04-25 00:00:18.304 [INFO][6011] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="bebeb009b41842948801598b7bcb38ad25d4878ba127c98bfdec429caa35f703" Apr 25 00:00:18.308052 containerd[1974]: time="2026-04-25T00:00:18.306714018Z" level=info msg="TearDown network for sandbox \"bebeb009b41842948801598b7bcb38ad25d4878ba127c98bfdec429caa35f703\" successfully" Apr 25 00:00:18.308052 containerd[1974]: time="2026-04-25T00:00:18.306742987Z" level=info msg="StopPodSandbox for \"bebeb009b41842948801598b7bcb38ad25d4878ba127c98bfdec429caa35f703\" returns successfully" Apr 25 00:00:18.308052 containerd[1974]: time="2026-04-25T00:00:18.307378147Z" level=info msg="RemovePodSandbox for \"bebeb009b41842948801598b7bcb38ad25d4878ba127c98bfdec429caa35f703\"" Apr 25 00:00:18.308052 containerd[1974]: time="2026-04-25T00:00:18.307422380Z" level=info msg="Forcibly stopping sandbox \"bebeb009b41842948801598b7bcb38ad25d4878ba127c98bfdec429caa35f703\"" Apr 25 00:00:18.364762 ntpd[1945]: Listen normally on 11 cali0ccb425bc32 [fe80::ecee:eeff:feee:eeee%8]:123 Apr 25 00:00:18.365607 ntpd[1945]: 25 Apr 00:00:18 ntpd[1945]: Listen normally on 11 cali0ccb425bc32 [fe80::ecee:eeff:feee:eeee%8]:123 
Apr 25 00:00:18.365607 ntpd[1945]: 25 Apr 00:00:18 ntpd[1945]: Listen normally on 12 calib407ef0a420 [fe80::ecee:eeff:feee:eeee%9]:123 Apr 25 00:00:18.365607 ntpd[1945]: 25 Apr 00:00:18 ntpd[1945]: Listen normally on 13 cali7235609975d [fe80::ecee:eeff:feee:eeee%10]:123 Apr 25 00:00:18.365607 ntpd[1945]: 25 Apr 00:00:18 ntpd[1945]: Listen normally on 14 cali4fd2cafc68b [fe80::ecee:eeff:feee:eeee%11]:123 Apr 25 00:00:18.365607 ntpd[1945]: 25 Apr 00:00:18 ntpd[1945]: Listen normally on 15 cali86641ffeb98 [fe80::ecee:eeff:feee:eeee%12]:123 Apr 25 00:00:18.365607 ntpd[1945]: 25 Apr 00:00:18 ntpd[1945]: Listen normally on 16 cali7764f2db475 [fe80::ecee:eeff:feee:eeee%13]:123 Apr 25 00:00:18.365607 ntpd[1945]: 25 Apr 00:00:18 ntpd[1945]: Listen normally on 17 cali5334212dd79 [fe80::ecee:eeff:feee:eeee%14]:123 Apr 25 00:00:18.364876 ntpd[1945]: Listen normally on 12 calib407ef0a420 [fe80::ecee:eeff:feee:eeee%9]:123 Apr 25 00:00:18.364921 ntpd[1945]: Listen normally on 13 cali7235609975d [fe80::ecee:eeff:feee:eeee%10]:123 Apr 25 00:00:18.364964 ntpd[1945]: Listen normally on 14 cali4fd2cafc68b [fe80::ecee:eeff:feee:eeee%11]:123 Apr 25 00:00:18.365003 ntpd[1945]: Listen normally on 15 cali86641ffeb98 [fe80::ecee:eeff:feee:eeee%12]:123 Apr 25 00:00:18.365043 ntpd[1945]: Listen normally on 16 cali7764f2db475 [fe80::ecee:eeff:feee:eeee%13]:123 Apr 25 00:00:18.365081 ntpd[1945]: Listen normally on 17 cali5334212dd79 [fe80::ecee:eeff:feee:eeee%14]:123 Apr 25 00:00:18.393084 containerd[1974]: 2026-04-25 00:00:18.349 [WARNING][6032] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="bebeb009b41842948801598b7bcb38ad25d4878ba127c98bfdec429caa35f703" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--31--110-k8s-coredns--7d764666f9--q5wxg-eth0", GenerateName:"coredns-7d764666f9-", Namespace:"kube-system", SelfLink:"", UID:"8259ae9f-09c1-45c5-a26a-9fb19e805b35", ResourceVersion:"984", Generation:0, CreationTimestamp:time.Date(2026, time.April, 24, 23, 59, 23, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7d764666f9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-31-110", ContainerID:"595666868a10545027a0ef04ca20123530e33581d543e85ef4f4821b2478f889", Pod:"coredns-7d764666f9-q5wxg", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.121.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calib407ef0a420", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, 
HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 25 00:00:18.393084 containerd[1974]: 2026-04-25 00:00:18.349 [INFO][6032] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="bebeb009b41842948801598b7bcb38ad25d4878ba127c98bfdec429caa35f703" Apr 25 00:00:18.393084 containerd[1974]: 2026-04-25 00:00:18.349 [INFO][6032] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="bebeb009b41842948801598b7bcb38ad25d4878ba127c98bfdec429caa35f703" iface="eth0" netns="" Apr 25 00:00:18.393084 containerd[1974]: 2026-04-25 00:00:18.349 [INFO][6032] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="bebeb009b41842948801598b7bcb38ad25d4878ba127c98bfdec429caa35f703" Apr 25 00:00:18.393084 containerd[1974]: 2026-04-25 00:00:18.349 [INFO][6032] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="bebeb009b41842948801598b7bcb38ad25d4878ba127c98bfdec429caa35f703" Apr 25 00:00:18.393084 containerd[1974]: 2026-04-25 00:00:18.377 [INFO][6040] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="bebeb009b41842948801598b7bcb38ad25d4878ba127c98bfdec429caa35f703" HandleID="k8s-pod-network.bebeb009b41842948801598b7bcb38ad25d4878ba127c98bfdec429caa35f703" Workload="ip--172--31--31--110-k8s-coredns--7d764666f9--q5wxg-eth0" Apr 25 00:00:18.393084 containerd[1974]: 2026-04-25 00:00:18.378 [INFO][6040] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 25 00:00:18.393084 containerd[1974]: 2026-04-25 00:00:18.378 [INFO][6040] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 25 00:00:18.393084 containerd[1974]: 2026-04-25 00:00:18.386 [WARNING][6040] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="bebeb009b41842948801598b7bcb38ad25d4878ba127c98bfdec429caa35f703" HandleID="k8s-pod-network.bebeb009b41842948801598b7bcb38ad25d4878ba127c98bfdec429caa35f703" Workload="ip--172--31--31--110-k8s-coredns--7d764666f9--q5wxg-eth0" Apr 25 00:00:18.393084 containerd[1974]: 2026-04-25 00:00:18.387 [INFO][6040] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="bebeb009b41842948801598b7bcb38ad25d4878ba127c98bfdec429caa35f703" HandleID="k8s-pod-network.bebeb009b41842948801598b7bcb38ad25d4878ba127c98bfdec429caa35f703" Workload="ip--172--31--31--110-k8s-coredns--7d764666f9--q5wxg-eth0" Apr 25 00:00:18.393084 containerd[1974]: 2026-04-25 00:00:18.389 [INFO][6040] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 25 00:00:18.393084 containerd[1974]: 2026-04-25 00:00:18.391 [INFO][6032] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="bebeb009b41842948801598b7bcb38ad25d4878ba127c98bfdec429caa35f703" Apr 25 00:00:18.393832 containerd[1974]: time="2026-04-25T00:00:18.393151949Z" level=info msg="TearDown network for sandbox \"bebeb009b41842948801598b7bcb38ad25d4878ba127c98bfdec429caa35f703\" successfully" Apr 25 00:00:18.399406 containerd[1974]: time="2026-04-25T00:00:18.399210838Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"bebeb009b41842948801598b7bcb38ad25d4878ba127c98bfdec429caa35f703\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Apr 25 00:00:18.399406 containerd[1974]: time="2026-04-25T00:00:18.399306201Z" level=info msg="RemovePodSandbox \"bebeb009b41842948801598b7bcb38ad25d4878ba127c98bfdec429caa35f703\" returns successfully" Apr 25 00:00:18.400174 containerd[1974]: time="2026-04-25T00:00:18.399844404Z" level=info msg="StopPodSandbox for \"a74229bca0141ebbfacbfc3444d797adc52ee7068e24e94e1b8ede0b05b79cb9\"" Apr 25 00:00:18.490808 containerd[1974]: 2026-04-25 00:00:18.447 [WARNING][6055] cni-plugin/k8s.go 610: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="a74229bca0141ebbfacbfc3444d797adc52ee7068e24e94e1b8ede0b05b79cb9" WorkloadEndpoint="ip--172--31--31--110-k8s-whisker--6c4c456bdb--82zbl-eth0" Apr 25 00:00:18.490808 containerd[1974]: 2026-04-25 00:00:18.447 [INFO][6055] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="a74229bca0141ebbfacbfc3444d797adc52ee7068e24e94e1b8ede0b05b79cb9" Apr 25 00:00:18.490808 containerd[1974]: 2026-04-25 00:00:18.447 [INFO][6055] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="a74229bca0141ebbfacbfc3444d797adc52ee7068e24e94e1b8ede0b05b79cb9" iface="eth0" netns="" Apr 25 00:00:18.490808 containerd[1974]: 2026-04-25 00:00:18.447 [INFO][6055] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="a74229bca0141ebbfacbfc3444d797adc52ee7068e24e94e1b8ede0b05b79cb9" Apr 25 00:00:18.490808 containerd[1974]: 2026-04-25 00:00:18.447 [INFO][6055] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="a74229bca0141ebbfacbfc3444d797adc52ee7068e24e94e1b8ede0b05b79cb9" Apr 25 00:00:18.490808 containerd[1974]: 2026-04-25 00:00:18.477 [INFO][6063] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="a74229bca0141ebbfacbfc3444d797adc52ee7068e24e94e1b8ede0b05b79cb9" HandleID="k8s-pod-network.a74229bca0141ebbfacbfc3444d797adc52ee7068e24e94e1b8ede0b05b79cb9" Workload="ip--172--31--31--110-k8s-whisker--6c4c456bdb--82zbl-eth0" Apr 25 00:00:18.490808 containerd[1974]: 2026-04-25 00:00:18.477 [INFO][6063] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 25 00:00:18.490808 containerd[1974]: 2026-04-25 00:00:18.477 [INFO][6063] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 25 00:00:18.490808 containerd[1974]: 2026-04-25 00:00:18.484 [WARNING][6063] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="a74229bca0141ebbfacbfc3444d797adc52ee7068e24e94e1b8ede0b05b79cb9" HandleID="k8s-pod-network.a74229bca0141ebbfacbfc3444d797adc52ee7068e24e94e1b8ede0b05b79cb9" Workload="ip--172--31--31--110-k8s-whisker--6c4c456bdb--82zbl-eth0" Apr 25 00:00:18.490808 containerd[1974]: 2026-04-25 00:00:18.484 [INFO][6063] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="a74229bca0141ebbfacbfc3444d797adc52ee7068e24e94e1b8ede0b05b79cb9" HandleID="k8s-pod-network.a74229bca0141ebbfacbfc3444d797adc52ee7068e24e94e1b8ede0b05b79cb9" Workload="ip--172--31--31--110-k8s-whisker--6c4c456bdb--82zbl-eth0" Apr 25 00:00:18.490808 containerd[1974]: 2026-04-25 00:00:18.486 [INFO][6063] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 25 00:00:18.490808 containerd[1974]: 2026-04-25 00:00:18.488 [INFO][6055] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="a74229bca0141ebbfacbfc3444d797adc52ee7068e24e94e1b8ede0b05b79cb9" Apr 25 00:00:18.491682 containerd[1974]: time="2026-04-25T00:00:18.490864732Z" level=info msg="TearDown network for sandbox \"a74229bca0141ebbfacbfc3444d797adc52ee7068e24e94e1b8ede0b05b79cb9\" successfully" Apr 25 00:00:18.491682 containerd[1974]: time="2026-04-25T00:00:18.490902468Z" level=info msg="StopPodSandbox for \"a74229bca0141ebbfacbfc3444d797adc52ee7068e24e94e1b8ede0b05b79cb9\" returns successfully" Apr 25 00:00:18.491682 containerd[1974]: time="2026-04-25T00:00:18.491666284Z" level=info msg="RemovePodSandbox for \"a74229bca0141ebbfacbfc3444d797adc52ee7068e24e94e1b8ede0b05b79cb9\"" Apr 25 00:00:18.491866 containerd[1974]: time="2026-04-25T00:00:18.491702280Z" level=info msg="Forcibly stopping sandbox \"a74229bca0141ebbfacbfc3444d797adc52ee7068e24e94e1b8ede0b05b79cb9\"" Apr 25 00:00:18.586141 containerd[1974]: 2026-04-25 00:00:18.533 [WARNING][6077] cni-plugin/k8s.go 610: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up 
ContainerID="a74229bca0141ebbfacbfc3444d797adc52ee7068e24e94e1b8ede0b05b79cb9" WorkloadEndpoint="ip--172--31--31--110-k8s-whisker--6c4c456bdb--82zbl-eth0" Apr 25 00:00:18.586141 containerd[1974]: 2026-04-25 00:00:18.533 [INFO][6077] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="a74229bca0141ebbfacbfc3444d797adc52ee7068e24e94e1b8ede0b05b79cb9" Apr 25 00:00:18.586141 containerd[1974]: 2026-04-25 00:00:18.533 [INFO][6077] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="a74229bca0141ebbfacbfc3444d797adc52ee7068e24e94e1b8ede0b05b79cb9" iface="eth0" netns="" Apr 25 00:00:18.586141 containerd[1974]: 2026-04-25 00:00:18.533 [INFO][6077] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="a74229bca0141ebbfacbfc3444d797adc52ee7068e24e94e1b8ede0b05b79cb9" Apr 25 00:00:18.586141 containerd[1974]: 2026-04-25 00:00:18.533 [INFO][6077] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="a74229bca0141ebbfacbfc3444d797adc52ee7068e24e94e1b8ede0b05b79cb9" Apr 25 00:00:18.586141 containerd[1974]: 2026-04-25 00:00:18.568 [INFO][6084] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="a74229bca0141ebbfacbfc3444d797adc52ee7068e24e94e1b8ede0b05b79cb9" HandleID="k8s-pod-network.a74229bca0141ebbfacbfc3444d797adc52ee7068e24e94e1b8ede0b05b79cb9" Workload="ip--172--31--31--110-k8s-whisker--6c4c456bdb--82zbl-eth0" Apr 25 00:00:18.586141 containerd[1974]: 2026-04-25 00:00:18.568 [INFO][6084] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 25 00:00:18.586141 containerd[1974]: 2026-04-25 00:00:18.569 [INFO][6084] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 25 00:00:18.586141 containerd[1974]: 2026-04-25 00:00:18.578 [WARNING][6084] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="a74229bca0141ebbfacbfc3444d797adc52ee7068e24e94e1b8ede0b05b79cb9" HandleID="k8s-pod-network.a74229bca0141ebbfacbfc3444d797adc52ee7068e24e94e1b8ede0b05b79cb9" Workload="ip--172--31--31--110-k8s-whisker--6c4c456bdb--82zbl-eth0" Apr 25 00:00:18.586141 containerd[1974]: 2026-04-25 00:00:18.579 [INFO][6084] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="a74229bca0141ebbfacbfc3444d797adc52ee7068e24e94e1b8ede0b05b79cb9" HandleID="k8s-pod-network.a74229bca0141ebbfacbfc3444d797adc52ee7068e24e94e1b8ede0b05b79cb9" Workload="ip--172--31--31--110-k8s-whisker--6c4c456bdb--82zbl-eth0" Apr 25 00:00:18.586141 containerd[1974]: 2026-04-25 00:00:18.581 [INFO][6084] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 25 00:00:18.586141 containerd[1974]: 2026-04-25 00:00:18.584 [INFO][6077] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="a74229bca0141ebbfacbfc3444d797adc52ee7068e24e94e1b8ede0b05b79cb9" Apr 25 00:00:18.586141 containerd[1974]: time="2026-04-25T00:00:18.586016903Z" level=info msg="TearDown network for sandbox \"a74229bca0141ebbfacbfc3444d797adc52ee7068e24e94e1b8ede0b05b79cb9\" successfully" Apr 25 00:00:18.592483 containerd[1974]: time="2026-04-25T00:00:18.592428263Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"a74229bca0141ebbfacbfc3444d797adc52ee7068e24e94e1b8ede0b05b79cb9\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Apr 25 00:00:18.592635 containerd[1974]: time="2026-04-25T00:00:18.592513793Z" level=info msg="RemovePodSandbox \"a74229bca0141ebbfacbfc3444d797adc52ee7068e24e94e1b8ede0b05b79cb9\" returns successfully" Apr 25 00:00:18.593252 containerd[1974]: time="2026-04-25T00:00:18.593218600Z" level=info msg="StopPodSandbox for \"3941356acab4b724b296dce95d38ff7956ae5900b2224867cf4d9d02ad8ef6d9\"" Apr 25 00:00:18.673285 containerd[1974]: 2026-04-25 00:00:18.632 [WARNING][6098] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="3941356acab4b724b296dce95d38ff7956ae5900b2224867cf4d9d02ad8ef6d9" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--31--110-k8s-calico--apiserver--848b4486d7--dvwgq-eth0", GenerateName:"calico-apiserver-848b4486d7-", Namespace:"calico-system", SelfLink:"", UID:"d9ba9417-7859-4ec0-8a74-659edbbec7c4", ResourceVersion:"980", Generation:0, CreationTimestamp:time.Date(2026, time.April, 24, 23, 59, 34, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"848b4486d7", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-31-110", ContainerID:"7c38a3870e15b765513f3951da03a0c155d292ac3aaf771702b124c3a24da68e", Pod:"calico-apiserver-848b4486d7-dvwgq", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.121.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", 
Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali86641ffeb98", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 25 00:00:18.673285 containerd[1974]: 2026-04-25 00:00:18.632 [INFO][6098] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="3941356acab4b724b296dce95d38ff7956ae5900b2224867cf4d9d02ad8ef6d9" Apr 25 00:00:18.673285 containerd[1974]: 2026-04-25 00:00:18.632 [INFO][6098] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="3941356acab4b724b296dce95d38ff7956ae5900b2224867cf4d9d02ad8ef6d9" iface="eth0" netns="" Apr 25 00:00:18.673285 containerd[1974]: 2026-04-25 00:00:18.632 [INFO][6098] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="3941356acab4b724b296dce95d38ff7956ae5900b2224867cf4d9d02ad8ef6d9" Apr 25 00:00:18.673285 containerd[1974]: 2026-04-25 00:00:18.632 [INFO][6098] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="3941356acab4b724b296dce95d38ff7956ae5900b2224867cf4d9d02ad8ef6d9" Apr 25 00:00:18.673285 containerd[1974]: 2026-04-25 00:00:18.658 [INFO][6105] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="3941356acab4b724b296dce95d38ff7956ae5900b2224867cf4d9d02ad8ef6d9" HandleID="k8s-pod-network.3941356acab4b724b296dce95d38ff7956ae5900b2224867cf4d9d02ad8ef6d9" Workload="ip--172--31--31--110-k8s-calico--apiserver--848b4486d7--dvwgq-eth0" Apr 25 00:00:18.673285 containerd[1974]: 2026-04-25 00:00:18.659 [INFO][6105] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 25 00:00:18.673285 containerd[1974]: 2026-04-25 00:00:18.659 [INFO][6105] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 25 00:00:18.673285 containerd[1974]: 2026-04-25 00:00:18.665 [WARNING][6105] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="3941356acab4b724b296dce95d38ff7956ae5900b2224867cf4d9d02ad8ef6d9" HandleID="k8s-pod-network.3941356acab4b724b296dce95d38ff7956ae5900b2224867cf4d9d02ad8ef6d9" Workload="ip--172--31--31--110-k8s-calico--apiserver--848b4486d7--dvwgq-eth0" Apr 25 00:00:18.673285 containerd[1974]: 2026-04-25 00:00:18.666 [INFO][6105] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="3941356acab4b724b296dce95d38ff7956ae5900b2224867cf4d9d02ad8ef6d9" HandleID="k8s-pod-network.3941356acab4b724b296dce95d38ff7956ae5900b2224867cf4d9d02ad8ef6d9" Workload="ip--172--31--31--110-k8s-calico--apiserver--848b4486d7--dvwgq-eth0" Apr 25 00:00:18.673285 containerd[1974]: 2026-04-25 00:00:18.668 [INFO][6105] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 25 00:00:18.673285 containerd[1974]: 2026-04-25 00:00:18.670 [INFO][6098] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="3941356acab4b724b296dce95d38ff7956ae5900b2224867cf4d9d02ad8ef6d9" Apr 25 00:00:18.674044 containerd[1974]: time="2026-04-25T00:00:18.673332553Z" level=info msg="TearDown network for sandbox \"3941356acab4b724b296dce95d38ff7956ae5900b2224867cf4d9d02ad8ef6d9\" successfully" Apr 25 00:00:18.674044 containerd[1974]: time="2026-04-25T00:00:18.673362971Z" level=info msg="StopPodSandbox for \"3941356acab4b724b296dce95d38ff7956ae5900b2224867cf4d9d02ad8ef6d9\" returns successfully" Apr 25 00:00:18.674207 containerd[1974]: time="2026-04-25T00:00:18.674177335Z" level=info msg="RemovePodSandbox for \"3941356acab4b724b296dce95d38ff7956ae5900b2224867cf4d9d02ad8ef6d9\"" Apr 25 00:00:18.674262 containerd[1974]: time="2026-04-25T00:00:18.674229028Z" level=info msg="Forcibly stopping sandbox \"3941356acab4b724b296dce95d38ff7956ae5900b2224867cf4d9d02ad8ef6d9\"" Apr 25 00:00:18.755703 containerd[1974]: 2026-04-25 00:00:18.718 [WARNING][6119] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="3941356acab4b724b296dce95d38ff7956ae5900b2224867cf4d9d02ad8ef6d9" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--31--110-k8s-calico--apiserver--848b4486d7--dvwgq-eth0", GenerateName:"calico-apiserver-848b4486d7-", Namespace:"calico-system", SelfLink:"", UID:"d9ba9417-7859-4ec0-8a74-659edbbec7c4", ResourceVersion:"980", Generation:0, CreationTimestamp:time.Date(2026, time.April, 24, 23, 59, 34, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"848b4486d7", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-31-110", ContainerID:"7c38a3870e15b765513f3951da03a0c155d292ac3aaf771702b124c3a24da68e", Pod:"calico-apiserver-848b4486d7-dvwgq", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.121.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali86641ffeb98", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 25 00:00:18.755703 containerd[1974]: 2026-04-25 00:00:18.718 [INFO][6119] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="3941356acab4b724b296dce95d38ff7956ae5900b2224867cf4d9d02ad8ef6d9" Apr 25 00:00:18.755703 containerd[1974]: 2026-04-25 00:00:18.718 [INFO][6119] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns 
name, ignoring. ContainerID="3941356acab4b724b296dce95d38ff7956ae5900b2224867cf4d9d02ad8ef6d9" iface="eth0" netns="" Apr 25 00:00:18.755703 containerd[1974]: 2026-04-25 00:00:18.718 [INFO][6119] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="3941356acab4b724b296dce95d38ff7956ae5900b2224867cf4d9d02ad8ef6d9" Apr 25 00:00:18.755703 containerd[1974]: 2026-04-25 00:00:18.718 [INFO][6119] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="3941356acab4b724b296dce95d38ff7956ae5900b2224867cf4d9d02ad8ef6d9" Apr 25 00:00:18.755703 containerd[1974]: 2026-04-25 00:00:18.742 [INFO][6126] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="3941356acab4b724b296dce95d38ff7956ae5900b2224867cf4d9d02ad8ef6d9" HandleID="k8s-pod-network.3941356acab4b724b296dce95d38ff7956ae5900b2224867cf4d9d02ad8ef6d9" Workload="ip--172--31--31--110-k8s-calico--apiserver--848b4486d7--dvwgq-eth0" Apr 25 00:00:18.755703 containerd[1974]: 2026-04-25 00:00:18.742 [INFO][6126] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 25 00:00:18.755703 containerd[1974]: 2026-04-25 00:00:18.743 [INFO][6126] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 25 00:00:18.755703 containerd[1974]: 2026-04-25 00:00:18.749 [WARNING][6126] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="3941356acab4b724b296dce95d38ff7956ae5900b2224867cf4d9d02ad8ef6d9" HandleID="k8s-pod-network.3941356acab4b724b296dce95d38ff7956ae5900b2224867cf4d9d02ad8ef6d9" Workload="ip--172--31--31--110-k8s-calico--apiserver--848b4486d7--dvwgq-eth0" Apr 25 00:00:18.755703 containerd[1974]: 2026-04-25 00:00:18.749 [INFO][6126] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="3941356acab4b724b296dce95d38ff7956ae5900b2224867cf4d9d02ad8ef6d9" HandleID="k8s-pod-network.3941356acab4b724b296dce95d38ff7956ae5900b2224867cf4d9d02ad8ef6d9" Workload="ip--172--31--31--110-k8s-calico--apiserver--848b4486d7--dvwgq-eth0" Apr 25 00:00:18.755703 containerd[1974]: 2026-04-25 00:00:18.751 [INFO][6126] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 25 00:00:18.755703 containerd[1974]: 2026-04-25 00:00:18.753 [INFO][6119] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="3941356acab4b724b296dce95d38ff7956ae5900b2224867cf4d9d02ad8ef6d9" Apr 25 00:00:18.756620 containerd[1974]: time="2026-04-25T00:00:18.755745221Z" level=info msg="TearDown network for sandbox \"3941356acab4b724b296dce95d38ff7956ae5900b2224867cf4d9d02ad8ef6d9\" successfully" Apr 25 00:00:18.763275 containerd[1974]: time="2026-04-25T00:00:18.763226470Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"3941356acab4b724b296dce95d38ff7956ae5900b2224867cf4d9d02ad8ef6d9\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Apr 25 00:00:18.763425 containerd[1974]: time="2026-04-25T00:00:18.763301712Z" level=info msg="RemovePodSandbox \"3941356acab4b724b296dce95d38ff7956ae5900b2224867cf4d9d02ad8ef6d9\" returns successfully" Apr 25 00:00:18.763903 containerd[1974]: time="2026-04-25T00:00:18.763872398Z" level=info msg="StopPodSandbox for \"2ce3b1122d54a7daa4a347d9e605f0fec8802b4c393fbb7e2fa28d979eca17ef\"" Apr 25 00:00:18.854846 containerd[1974]: 2026-04-25 00:00:18.813 [WARNING][6140] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="2ce3b1122d54a7daa4a347d9e605f0fec8802b4c393fbb7e2fa28d979eca17ef" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--31--110-k8s-csi--node--driver--wllpb-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"fafabee1-df27-491d-a48c-611faa0cd932", ResourceVersion:"967", Generation:0, CreationTimestamp:time.Date(2026, time.April, 24, 23, 59, 35, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"589b8b8d94", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-31-110", ContainerID:"24e481870027e26f76e7f978298c1ee774e94009de6f1cb58cda6ee685d34567", Pod:"csi-node-driver-wllpb", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.121.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", 
Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali7235609975d", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 25 00:00:18.854846 containerd[1974]: 2026-04-25 00:00:18.813 [INFO][6140] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="2ce3b1122d54a7daa4a347d9e605f0fec8802b4c393fbb7e2fa28d979eca17ef" Apr 25 00:00:18.854846 containerd[1974]: 2026-04-25 00:00:18.813 [INFO][6140] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="2ce3b1122d54a7daa4a347d9e605f0fec8802b4c393fbb7e2fa28d979eca17ef" iface="eth0" netns="" Apr 25 00:00:18.854846 containerd[1974]: 2026-04-25 00:00:18.813 [INFO][6140] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="2ce3b1122d54a7daa4a347d9e605f0fec8802b4c393fbb7e2fa28d979eca17ef" Apr 25 00:00:18.854846 containerd[1974]: 2026-04-25 00:00:18.813 [INFO][6140] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="2ce3b1122d54a7daa4a347d9e605f0fec8802b4c393fbb7e2fa28d979eca17ef" Apr 25 00:00:18.854846 containerd[1974]: 2026-04-25 00:00:18.839 [INFO][6147] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="2ce3b1122d54a7daa4a347d9e605f0fec8802b4c393fbb7e2fa28d979eca17ef" HandleID="k8s-pod-network.2ce3b1122d54a7daa4a347d9e605f0fec8802b4c393fbb7e2fa28d979eca17ef" Workload="ip--172--31--31--110-k8s-csi--node--driver--wllpb-eth0" Apr 25 00:00:18.854846 containerd[1974]: 2026-04-25 00:00:18.839 [INFO][6147] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 25 00:00:18.854846 containerd[1974]: 2026-04-25 00:00:18.839 [INFO][6147] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 25 00:00:18.854846 containerd[1974]: 2026-04-25 00:00:18.848 [WARNING][6147] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="2ce3b1122d54a7daa4a347d9e605f0fec8802b4c393fbb7e2fa28d979eca17ef" HandleID="k8s-pod-network.2ce3b1122d54a7daa4a347d9e605f0fec8802b4c393fbb7e2fa28d979eca17ef" Workload="ip--172--31--31--110-k8s-csi--node--driver--wllpb-eth0" Apr 25 00:00:18.854846 containerd[1974]: 2026-04-25 00:00:18.848 [INFO][6147] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="2ce3b1122d54a7daa4a347d9e605f0fec8802b4c393fbb7e2fa28d979eca17ef" HandleID="k8s-pod-network.2ce3b1122d54a7daa4a347d9e605f0fec8802b4c393fbb7e2fa28d979eca17ef" Workload="ip--172--31--31--110-k8s-csi--node--driver--wllpb-eth0" Apr 25 00:00:18.854846 containerd[1974]: 2026-04-25 00:00:18.850 [INFO][6147] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 25 00:00:18.854846 containerd[1974]: 2026-04-25 00:00:18.853 [INFO][6140] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="2ce3b1122d54a7daa4a347d9e605f0fec8802b4c393fbb7e2fa28d979eca17ef" Apr 25 00:00:18.855594 containerd[1974]: time="2026-04-25T00:00:18.855556686Z" level=info msg="TearDown network for sandbox \"2ce3b1122d54a7daa4a347d9e605f0fec8802b4c393fbb7e2fa28d979eca17ef\" successfully" Apr 25 00:00:18.855594 containerd[1974]: time="2026-04-25T00:00:18.855591165Z" level=info msg="StopPodSandbox for \"2ce3b1122d54a7daa4a347d9e605f0fec8802b4c393fbb7e2fa28d979eca17ef\" returns successfully" Apr 25 00:00:18.856187 containerd[1974]: time="2026-04-25T00:00:18.856152769Z" level=info msg="RemovePodSandbox for \"2ce3b1122d54a7daa4a347d9e605f0fec8802b4c393fbb7e2fa28d979eca17ef\"" Apr 25 00:00:18.856295 containerd[1974]: time="2026-04-25T00:00:18.856192161Z" level=info msg="Forcibly stopping sandbox \"2ce3b1122d54a7daa4a347d9e605f0fec8802b4c393fbb7e2fa28d979eca17ef\"" Apr 25 00:00:18.939832 containerd[1974]: 2026-04-25 00:00:18.898 [WARNING][6161] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="2ce3b1122d54a7daa4a347d9e605f0fec8802b4c393fbb7e2fa28d979eca17ef" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--31--110-k8s-csi--node--driver--wllpb-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"fafabee1-df27-491d-a48c-611faa0cd932", ResourceVersion:"967", Generation:0, CreationTimestamp:time.Date(2026, time.April, 24, 23, 59, 35, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"589b8b8d94", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-31-110", ContainerID:"24e481870027e26f76e7f978298c1ee774e94009de6f1cb58cda6ee685d34567", Pod:"csi-node-driver-wllpb", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.121.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali7235609975d", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 25 00:00:18.939832 containerd[1974]: 2026-04-25 00:00:18.899 [INFO][6161] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="2ce3b1122d54a7daa4a347d9e605f0fec8802b4c393fbb7e2fa28d979eca17ef" Apr 25 00:00:18.939832 containerd[1974]: 2026-04-25 00:00:18.899 [INFO][6161] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no 
netns name, ignoring. ContainerID="2ce3b1122d54a7daa4a347d9e605f0fec8802b4c393fbb7e2fa28d979eca17ef" iface="eth0" netns="" Apr 25 00:00:18.939832 containerd[1974]: 2026-04-25 00:00:18.899 [INFO][6161] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="2ce3b1122d54a7daa4a347d9e605f0fec8802b4c393fbb7e2fa28d979eca17ef" Apr 25 00:00:18.939832 containerd[1974]: 2026-04-25 00:00:18.899 [INFO][6161] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="2ce3b1122d54a7daa4a347d9e605f0fec8802b4c393fbb7e2fa28d979eca17ef" Apr 25 00:00:18.939832 containerd[1974]: 2026-04-25 00:00:18.925 [INFO][6169] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="2ce3b1122d54a7daa4a347d9e605f0fec8802b4c393fbb7e2fa28d979eca17ef" HandleID="k8s-pod-network.2ce3b1122d54a7daa4a347d9e605f0fec8802b4c393fbb7e2fa28d979eca17ef" Workload="ip--172--31--31--110-k8s-csi--node--driver--wllpb-eth0" Apr 25 00:00:18.939832 containerd[1974]: 2026-04-25 00:00:18.925 [INFO][6169] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 25 00:00:18.939832 containerd[1974]: 2026-04-25 00:00:18.925 [INFO][6169] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 25 00:00:18.939832 containerd[1974]: 2026-04-25 00:00:18.933 [WARNING][6169] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="2ce3b1122d54a7daa4a347d9e605f0fec8802b4c393fbb7e2fa28d979eca17ef" HandleID="k8s-pod-network.2ce3b1122d54a7daa4a347d9e605f0fec8802b4c393fbb7e2fa28d979eca17ef" Workload="ip--172--31--31--110-k8s-csi--node--driver--wllpb-eth0" Apr 25 00:00:18.939832 containerd[1974]: 2026-04-25 00:00:18.933 [INFO][6169] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="2ce3b1122d54a7daa4a347d9e605f0fec8802b4c393fbb7e2fa28d979eca17ef" HandleID="k8s-pod-network.2ce3b1122d54a7daa4a347d9e605f0fec8802b4c393fbb7e2fa28d979eca17ef" Workload="ip--172--31--31--110-k8s-csi--node--driver--wllpb-eth0" Apr 25 00:00:18.939832 containerd[1974]: 2026-04-25 00:00:18.935 [INFO][6169] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 25 00:00:18.939832 containerd[1974]: 2026-04-25 00:00:18.937 [INFO][6161] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="2ce3b1122d54a7daa4a347d9e605f0fec8802b4c393fbb7e2fa28d979eca17ef" Apr 25 00:00:18.939832 containerd[1974]: time="2026-04-25T00:00:18.939650912Z" level=info msg="TearDown network for sandbox \"2ce3b1122d54a7daa4a347d9e605f0fec8802b4c393fbb7e2fa28d979eca17ef\" successfully" Apr 25 00:00:18.946232 containerd[1974]: time="2026-04-25T00:00:18.946181352Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"2ce3b1122d54a7daa4a347d9e605f0fec8802b4c393fbb7e2fa28d979eca17ef\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Apr 25 00:00:18.946353 containerd[1974]: time="2026-04-25T00:00:18.946275332Z" level=info msg="RemovePodSandbox \"2ce3b1122d54a7daa4a347d9e605f0fec8802b4c393fbb7e2fa28d979eca17ef\" returns successfully" Apr 25 00:00:18.946822 containerd[1974]: time="2026-04-25T00:00:18.946769414Z" level=info msg="StopPodSandbox for \"2ee4d775ce5c79150773d1ce80c8df3b1c5c39e88a99f47d9ced5460c65af0d9\"" Apr 25 00:00:19.027282 containerd[1974]: 2026-04-25 00:00:18.989 [WARNING][6183] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="2ee4d775ce5c79150773d1ce80c8df3b1c5c39e88a99f47d9ced5460c65af0d9" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--31--110-k8s-coredns--7d764666f9--zk545-eth0", GenerateName:"coredns-7d764666f9-", Namespace:"kube-system", SelfLink:"", UID:"651a4b24-dccb-49b4-b5f3-a4291d6e49f0", ResourceVersion:"988", Generation:0, CreationTimestamp:time.Date(2026, time.April, 24, 23, 59, 23, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7d764666f9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-31-110", ContainerID:"6e4dfc183e3f3f058f2bb62e0188cd0e16dd0c2bf85bc6231b6eb750ebe9a32d", Pod:"coredns-7d764666f9-zk545", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.121.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali4fd2cafc68b", MAC:"", 
Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 25 00:00:19.027282 containerd[1974]: 2026-04-25 00:00:18.989 [INFO][6183] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="2ee4d775ce5c79150773d1ce80c8df3b1c5c39e88a99f47d9ced5460c65af0d9" Apr 25 00:00:19.027282 containerd[1974]: 2026-04-25 00:00:18.989 [INFO][6183] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="2ee4d775ce5c79150773d1ce80c8df3b1c5c39e88a99f47d9ced5460c65af0d9" iface="eth0" netns="" Apr 25 00:00:19.027282 containerd[1974]: 2026-04-25 00:00:18.989 [INFO][6183] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="2ee4d775ce5c79150773d1ce80c8df3b1c5c39e88a99f47d9ced5460c65af0d9" Apr 25 00:00:19.027282 containerd[1974]: 2026-04-25 00:00:18.989 [INFO][6183] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="2ee4d775ce5c79150773d1ce80c8df3b1c5c39e88a99f47d9ced5460c65af0d9" Apr 25 00:00:19.027282 containerd[1974]: 2026-04-25 00:00:19.014 [INFO][6191] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="2ee4d775ce5c79150773d1ce80c8df3b1c5c39e88a99f47d9ced5460c65af0d9" HandleID="k8s-pod-network.2ee4d775ce5c79150773d1ce80c8df3b1c5c39e88a99f47d9ced5460c65af0d9" Workload="ip--172--31--31--110-k8s-coredns--7d764666f9--zk545-eth0" Apr 25 00:00:19.027282 containerd[1974]: 2026-04-25 00:00:19.014 [INFO][6191] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 25 00:00:19.027282 containerd[1974]: 2026-04-25 00:00:19.014 [INFO][6191] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 25 00:00:19.027282 containerd[1974]: 2026-04-25 00:00:19.021 [WARNING][6191] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="2ee4d775ce5c79150773d1ce80c8df3b1c5c39e88a99f47d9ced5460c65af0d9" HandleID="k8s-pod-network.2ee4d775ce5c79150773d1ce80c8df3b1c5c39e88a99f47d9ced5460c65af0d9" Workload="ip--172--31--31--110-k8s-coredns--7d764666f9--zk545-eth0" Apr 25 00:00:19.027282 containerd[1974]: 2026-04-25 00:00:19.021 [INFO][6191] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="2ee4d775ce5c79150773d1ce80c8df3b1c5c39e88a99f47d9ced5460c65af0d9" HandleID="k8s-pod-network.2ee4d775ce5c79150773d1ce80c8df3b1c5c39e88a99f47d9ced5460c65af0d9" Workload="ip--172--31--31--110-k8s-coredns--7d764666f9--zk545-eth0" Apr 25 00:00:19.027282 containerd[1974]: 2026-04-25 00:00:19.023 [INFO][6191] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 25 00:00:19.027282 containerd[1974]: 2026-04-25 00:00:19.025 [INFO][6183] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="2ee4d775ce5c79150773d1ce80c8df3b1c5c39e88a99f47d9ced5460c65af0d9" Apr 25 00:00:19.028138 containerd[1974]: time="2026-04-25T00:00:19.027327960Z" level=info msg="TearDown network for sandbox \"2ee4d775ce5c79150773d1ce80c8df3b1c5c39e88a99f47d9ced5460c65af0d9\" successfully" Apr 25 00:00:19.028138 containerd[1974]: time="2026-04-25T00:00:19.027357764Z" level=info msg="StopPodSandbox for \"2ee4d775ce5c79150773d1ce80c8df3b1c5c39e88a99f47d9ced5460c65af0d9\" returns successfully" Apr 25 00:00:19.028572 containerd[1974]: time="2026-04-25T00:00:19.028536482Z" level=info msg="RemovePodSandbox for \"2ee4d775ce5c79150773d1ce80c8df3b1c5c39e88a99f47d9ced5460c65af0d9\"" Apr 25 00:00:19.028672 containerd[1974]: time="2026-04-25T00:00:19.028572166Z" level=info msg="Forcibly stopping sandbox \"2ee4d775ce5c79150773d1ce80c8df3b1c5c39e88a99f47d9ced5460c65af0d9\"" Apr 25 00:00:19.107756 containerd[1974]: 2026-04-25 00:00:19.070 [WARNING][6205] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="2ee4d775ce5c79150773d1ce80c8df3b1c5c39e88a99f47d9ced5460c65af0d9" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--31--110-k8s-coredns--7d764666f9--zk545-eth0", GenerateName:"coredns-7d764666f9-", Namespace:"kube-system", SelfLink:"", UID:"651a4b24-dccb-49b4-b5f3-a4291d6e49f0", ResourceVersion:"988", Generation:0, CreationTimestamp:time.Date(2026, time.April, 24, 23, 59, 23, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7d764666f9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-31-110", ContainerID:"6e4dfc183e3f3f058f2bb62e0188cd0e16dd0c2bf85bc6231b6eb750ebe9a32d", Pod:"coredns-7d764666f9-zk545", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.121.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali4fd2cafc68b", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, 
HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 25 00:00:19.107756 containerd[1974]: 2026-04-25 00:00:19.070 [INFO][6205] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="2ee4d775ce5c79150773d1ce80c8df3b1c5c39e88a99f47d9ced5460c65af0d9" Apr 25 00:00:19.107756 containerd[1974]: 2026-04-25 00:00:19.070 [INFO][6205] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="2ee4d775ce5c79150773d1ce80c8df3b1c5c39e88a99f47d9ced5460c65af0d9" iface="eth0" netns="" Apr 25 00:00:19.107756 containerd[1974]: 2026-04-25 00:00:19.070 [INFO][6205] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="2ee4d775ce5c79150773d1ce80c8df3b1c5c39e88a99f47d9ced5460c65af0d9" Apr 25 00:00:19.107756 containerd[1974]: 2026-04-25 00:00:19.070 [INFO][6205] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="2ee4d775ce5c79150773d1ce80c8df3b1c5c39e88a99f47d9ced5460c65af0d9" Apr 25 00:00:19.107756 containerd[1974]: 2026-04-25 00:00:19.095 [INFO][6212] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="2ee4d775ce5c79150773d1ce80c8df3b1c5c39e88a99f47d9ced5460c65af0d9" HandleID="k8s-pod-network.2ee4d775ce5c79150773d1ce80c8df3b1c5c39e88a99f47d9ced5460c65af0d9" Workload="ip--172--31--31--110-k8s-coredns--7d764666f9--zk545-eth0" Apr 25 00:00:19.107756 containerd[1974]: 2026-04-25 00:00:19.095 [INFO][6212] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 25 00:00:19.107756 containerd[1974]: 2026-04-25 00:00:19.095 [INFO][6212] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 25 00:00:19.107756 containerd[1974]: 2026-04-25 00:00:19.101 [WARNING][6212] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="2ee4d775ce5c79150773d1ce80c8df3b1c5c39e88a99f47d9ced5460c65af0d9" HandleID="k8s-pod-network.2ee4d775ce5c79150773d1ce80c8df3b1c5c39e88a99f47d9ced5460c65af0d9" Workload="ip--172--31--31--110-k8s-coredns--7d764666f9--zk545-eth0" Apr 25 00:00:19.107756 containerd[1974]: 2026-04-25 00:00:19.101 [INFO][6212] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="2ee4d775ce5c79150773d1ce80c8df3b1c5c39e88a99f47d9ced5460c65af0d9" HandleID="k8s-pod-network.2ee4d775ce5c79150773d1ce80c8df3b1c5c39e88a99f47d9ced5460c65af0d9" Workload="ip--172--31--31--110-k8s-coredns--7d764666f9--zk545-eth0" Apr 25 00:00:19.107756 containerd[1974]: 2026-04-25 00:00:19.103 [INFO][6212] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 25 00:00:19.107756 containerd[1974]: 2026-04-25 00:00:19.105 [INFO][6205] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="2ee4d775ce5c79150773d1ce80c8df3b1c5c39e88a99f47d9ced5460c65af0d9" Apr 25 00:00:19.108465 containerd[1974]: time="2026-04-25T00:00:19.107821283Z" level=info msg="TearDown network for sandbox \"2ee4d775ce5c79150773d1ce80c8df3b1c5c39e88a99f47d9ced5460c65af0d9\" successfully" Apr 25 00:00:19.115445 containerd[1974]: time="2026-04-25T00:00:19.114172762Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"2ee4d775ce5c79150773d1ce80c8df3b1c5c39e88a99f47d9ced5460c65af0d9\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Apr 25 00:00:19.115445 containerd[1974]: time="2026-04-25T00:00:19.114240015Z" level=info msg="RemovePodSandbox \"2ee4d775ce5c79150773d1ce80c8df3b1c5c39e88a99f47d9ced5460c65af0d9\" returns successfully" Apr 25 00:00:21.701165 systemd[1]: Started sshd@8-172.31.31.110:22-4.175.71.9:34348.service - OpenSSH per-connection server daemon (4.175.71.9:34348). 
Apr 25 00:00:22.759609 sshd[6225]: Accepted publickey for core from 4.175.71.9 port 34348 ssh2: RSA SHA256:5HhJ2X4iOQfF5HWKIEVpWTPXYo3rjlnxoO1NrD+aEDg Apr 25 00:00:22.763618 sshd[6225]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 25 00:00:22.768856 systemd-logind[1954]: New session 9 of user core. Apr 25 00:00:22.775354 systemd[1]: Started session-9.scope - Session 9 of User core. Apr 25 00:00:23.604533 sshd[6225]: pam_unix(sshd:session): session closed for user core Apr 25 00:00:23.610464 systemd[1]: sshd@8-172.31.31.110:22-4.175.71.9:34348.service: Deactivated successfully. Apr 25 00:00:23.610833 systemd-logind[1954]: Session 9 logged out. Waiting for processes to exit. Apr 25 00:00:23.613529 systemd[1]: session-9.scope: Deactivated successfully. Apr 25 00:00:23.614714 systemd-logind[1954]: Removed session 9. Apr 25 00:00:28.788765 systemd[1]: Started sshd@9-172.31.31.110:22-4.175.71.9:35624.service - OpenSSH per-connection server daemon (4.175.71.9:35624). Apr 25 00:00:29.803747 sshd[6259]: Accepted publickey for core from 4.175.71.9 port 35624 ssh2: RSA SHA256:5HhJ2X4iOQfF5HWKIEVpWTPXYo3rjlnxoO1NrD+aEDg Apr 25 00:00:29.805392 sshd[6259]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 25 00:00:29.810830 systemd-logind[1954]: New session 10 of user core. Apr 25 00:00:29.819145 systemd[1]: Started session-10.scope - Session 10 of User core. Apr 25 00:00:30.589511 sshd[6259]: pam_unix(sshd:session): session closed for user core Apr 25 00:00:30.593439 systemd-logind[1954]: Session 10 logged out. Waiting for processes to exit. Apr 25 00:00:30.594492 systemd[1]: sshd@9-172.31.31.110:22-4.175.71.9:35624.service: Deactivated successfully. Apr 25 00:00:30.597347 systemd[1]: session-10.scope: Deactivated successfully. Apr 25 00:00:30.599045 systemd-logind[1954]: Removed session 10. 
Apr 25 00:00:35.766160 systemd[1]: Started sshd@10-172.31.31.110:22-4.175.71.9:58042.service - OpenSSH per-connection server daemon (4.175.71.9:58042). Apr 25 00:00:36.804716 sshd[6300]: Accepted publickey for core from 4.175.71.9 port 58042 ssh2: RSA SHA256:5HhJ2X4iOQfF5HWKIEVpWTPXYo3rjlnxoO1NrD+aEDg Apr 25 00:00:36.807961 sshd[6300]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 25 00:00:36.814106 systemd-logind[1954]: New session 11 of user core. Apr 25 00:00:36.817002 systemd[1]: Started session-11.scope - Session 11 of User core. Apr 25 00:00:37.641749 sshd[6300]: pam_unix(sshd:session): session closed for user core Apr 25 00:00:37.646082 systemd-logind[1954]: Session 11 logged out. Waiting for processes to exit. Apr 25 00:00:37.647303 systemd[1]: sshd@10-172.31.31.110:22-4.175.71.9:58042.service: Deactivated successfully. Apr 25 00:00:37.649555 systemd[1]: session-11.scope: Deactivated successfully. Apr 25 00:00:37.650708 systemd-logind[1954]: Removed session 11. Apr 25 00:00:42.093473 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1294499110.mount: Deactivated successfully. Apr 25 00:00:42.830069 systemd[1]: Started sshd@11-172.31.31.110:22-4.175.71.9:58044.service - OpenSSH per-connection server daemon (4.175.71.9:58044). Apr 25 00:00:43.925022 sshd[6322]: Accepted publickey for core from 4.175.71.9 port 58044 ssh2: RSA SHA256:5HhJ2X4iOQfF5HWKIEVpWTPXYo3rjlnxoO1NrD+aEDg Apr 25 00:00:43.930104 sshd[6322]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 25 00:00:43.935660 systemd-logind[1954]: New session 12 of user core. Apr 25 00:00:43.943183 systemd[1]: Started session-12.scope - Session 12 of User core. Apr 25 00:00:44.921566 sshd[6322]: pam_unix(sshd:session): session closed for user core Apr 25 00:00:44.926559 systemd-logind[1954]: Session 12 logged out. Waiting for processes to exit. 
Apr 25 00:00:44.927479 systemd[1]: sshd@11-172.31.31.110:22-4.175.71.9:58044.service: Deactivated successfully. Apr 25 00:00:44.929757 systemd[1]: session-12.scope: Deactivated successfully. Apr 25 00:00:44.932949 systemd-logind[1954]: Removed session 12. Apr 25 00:00:44.949586 containerd[1974]: time="2026-04-25T00:00:44.949190357Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.31.4: active requests=0, bytes read=55623386" Apr 25 00:00:44.959760 containerd[1974]: time="2026-04-25T00:00:44.959707080Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 25 00:00:44.971592 containerd[1974]: time="2026-04-25T00:00:44.971517964Z" level=info msg="ImageCreate event name:\"sha256:714983e5e920bbe810fab04d9f06bd16ef4e552b0d2deffd7ab2b2c4a001acbb\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 25 00:00:44.975294 containerd[1974]: time="2026-04-25T00:00:44.975246789Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:44395ca5ebfe88f21ed51acfbec5fc0f31d2762966e2007a0a2eb9b30e35fc4d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 25 00:00:44.977048 containerd[1974]: time="2026-04-25T00:00:44.976851775Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.31.4\" with image id \"sha256:714983e5e920bbe810fab04d9f06bd16ef4e552b0d2deffd7ab2b2c4a001acbb\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:44395ca5ebfe88f21ed51acfbec5fc0f31d2762966e2007a0a2eb9b30e35fc4d\", size \"55623232\" in 33.333663921s" Apr 25 00:00:44.977048 containerd[1974]: time="2026-04-25T00:00:44.976898702Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.31.4\" returns image reference \"sha256:714983e5e920bbe810fab04d9f06bd16ef4e552b0d2deffd7ab2b2c4a001acbb\"" Apr 25 00:00:45.010758 containerd[1974]: time="2026-04-25T00:00:45.010713894Z" 
level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\"" Apr 25 00:00:45.280943 containerd[1974]: time="2026-04-25T00:00:45.280240000Z" level=info msg="CreateContainer within sandbox \"8e6bb12e5b853269250e39c8c0eabd6aff9fbce20a201dd5ec424e48129a5c76\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Apr 25 00:00:45.355938 containerd[1974]: time="2026-04-25T00:00:45.355890010Z" level=info msg="CreateContainer within sandbox \"8e6bb12e5b853269250e39c8c0eabd6aff9fbce20a201dd5ec424e48129a5c76\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"91f0c413e986226204f99ea7f70f7c5f98860d8f3bab8c1c4efb1724d80cf348\"" Apr 25 00:00:45.364608 containerd[1974]: time="2026-04-25T00:00:45.364353368Z" level=info msg="StartContainer for \"91f0c413e986226204f99ea7f70f7c5f98860d8f3bab8c1c4efb1724d80cf348\"" Apr 25 00:00:45.526024 systemd[1]: Started cri-containerd-91f0c413e986226204f99ea7f70f7c5f98860d8f3bab8c1c4efb1724d80cf348.scope - libcontainer container 91f0c413e986226204f99ea7f70f7c5f98860d8f3bab8c1c4efb1724d80cf348. 
Apr 25 00:00:45.653462 containerd[1974]: time="2026-04-25T00:00:45.652673152Z" level=info msg="StartContainer for \"91f0c413e986226204f99ea7f70f7c5f98860d8f3bab8c1c4efb1724d80cf348\" returns successfully" Apr 25 00:00:46.412154 kubelet[3188]: I0425 00:00:46.365696 3188 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="calico-system/goldmane-9f7667bb8-vg777" podStartSLOduration=38.963034327 podStartE2EDuration="1m12.338122796s" podCreationTimestamp="2026-04-24 23:59:34 +0000 UTC" firstStartedPulling="2026-04-25 00:00:11.640811881 +0000 UTC m=+54.309940602" lastFinishedPulling="2026-04-25 00:00:45.01590033 +0000 UTC m=+87.685029071" observedRunningTime="2026-04-25 00:00:46.308005959 +0000 UTC m=+88.977134702" watchObservedRunningTime="2026-04-25 00:00:46.338122796 +0000 UTC m=+89.007251537" Apr 25 00:00:50.089589 systemd[1]: Started sshd@12-172.31.31.110:22-4.175.71.9:34322.service - OpenSSH per-connection server daemon (4.175.71.9:34322). Apr 25 00:00:51.138412 sshd[6416]: Accepted publickey for core from 4.175.71.9 port 34322 ssh2: RSA SHA256:5HhJ2X4iOQfF5HWKIEVpWTPXYo3rjlnxoO1NrD+aEDg Apr 25 00:00:51.142138 sshd[6416]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 25 00:00:51.148559 systemd-logind[1954]: New session 13 of user core. Apr 25 00:00:51.155183 systemd[1]: Started session-13.scope - Session 13 of User core. Apr 25 00:00:52.019964 sshd[6416]: pam_unix(sshd:session): session closed for user core Apr 25 00:00:52.024354 systemd[1]: sshd@12-172.31.31.110:22-4.175.71.9:34322.service: Deactivated successfully. Apr 25 00:00:52.026720 systemd[1]: session-13.scope: Deactivated successfully. Apr 25 00:00:52.027873 systemd-logind[1954]: Session 13 logged out. Waiting for processes to exit. Apr 25 00:00:52.029774 systemd-logind[1954]: Removed session 13. Apr 25 00:00:53.298608 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount702570174.mount: Deactivated successfully. 
Apr 25 00:00:53.321662 containerd[1974]: time="2026-04-25T00:00:53.321603892Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 25 00:00:53.340824 containerd[1974]: time="2026-04-25T00:00:53.340746266Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.31.4: active requests=0, bytes read=17609475" Apr 25 00:00:53.342559 containerd[1974]: time="2026-04-25T00:00:53.342412579Z" level=info msg="ImageCreate event name:\"sha256:0749e3da0398e8402eb119f09acf145e5dd9759adb6eb3802ad6dc1b9bbedf1c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 25 00:00:53.345318 containerd[1974]: time="2026-04-25T00:00:53.345259021Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:d252061aa298c4b17cf092517b5126af97cf95e0f56b21281b95a5f8702f15fc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 25 00:00:53.346553 containerd[1974]: time="2026-04-25T00:00:53.346161777Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\" with image id \"sha256:0749e3da0398e8402eb119f09acf145e5dd9759adb6eb3802ad6dc1b9bbedf1c\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:d252061aa298c4b17cf092517b5126af97cf95e0f56b21281b95a5f8702f15fc\", size \"17609305\" in 8.335406142s" Apr 25 00:00:53.346553 containerd[1974]: time="2026-04-25T00:00:53.346205585Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\" returns image reference \"sha256:0749e3da0398e8402eb119f09acf145e5dd9759adb6eb3802ad6dc1b9bbedf1c\"" Apr 25 00:00:53.355046 containerd[1974]: time="2026-04-25T00:00:53.354998790Z" level=info msg="CreateContainer within sandbox \"c68d3d68f333f5c0a0cb1c01289d2419b26d31e6dddf157a2a55c79369353d4d\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Apr 25 00:00:53.362684 
containerd[1974]: time="2026-04-25T00:00:53.362638257Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.31.4\"" Apr 25 00:00:53.418054 containerd[1974]: time="2026-04-25T00:00:53.418002998Z" level=info msg="CreateContainer within sandbox \"c68d3d68f333f5c0a0cb1c01289d2419b26d31e6dddf157a2a55c79369353d4d\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"a3c587428dde45292f479582cba7eadfed87ac5b811ab952b4a4f1285b1e017f\"" Apr 25 00:00:53.425181 containerd[1974]: time="2026-04-25T00:00:53.425141499Z" level=info msg="StartContainer for \"a3c587428dde45292f479582cba7eadfed87ac5b811ab952b4a4f1285b1e017f\"" Apr 25 00:00:53.466191 systemd[1]: Started cri-containerd-a3c587428dde45292f479582cba7eadfed87ac5b811ab952b4a4f1285b1e017f.scope - libcontainer container a3c587428dde45292f479582cba7eadfed87ac5b811ab952b4a4f1285b1e017f. Apr 25 00:00:53.531276 containerd[1974]: time="2026-04-25T00:00:53.531143339Z" level=info msg="StartContainer for \"a3c587428dde45292f479582cba7eadfed87ac5b811ab952b4a4f1285b1e017f\" returns successfully" Apr 25 00:00:54.271329 kubelet[3188]: I0425 00:00:54.271248 3188 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="calico-system/whisker-86bbf7df67-bhhtx" podStartSLOduration=4.995845906 podStartE2EDuration="53.270627259s" podCreationTimestamp="2026-04-25 00:00:01 +0000 UTC" firstStartedPulling="2026-04-25 00:00:05.072879929 +0000 UTC m=+47.742008660" lastFinishedPulling="2026-04-25 00:00:53.347661204 +0000 UTC m=+96.016790013" observedRunningTime="2026-04-25 00:00:54.270121491 +0000 UTC m=+96.939250232" watchObservedRunningTime="2026-04-25 00:00:54.270627259 +0000 UTC m=+96.939756003" Apr 25 00:00:55.269465 containerd[1974]: time="2026-04-25T00:00:55.269413469Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 25 00:00:55.270810 containerd[1974]: time="2026-04-25T00:00:55.270745480Z" level=info 
msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.31.4: active requests=0, bytes read=8792502" Apr 25 00:00:55.272772 containerd[1974]: time="2026-04-25T00:00:55.272055449Z" level=info msg="ImageCreate event name:\"sha256:4c8cd7d0b10a4df64a5bd90e9845e9d1edbe0e37c2ebfc171bb28698e07abf72\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 25 00:00:55.274998 containerd[1974]: time="2026-04-25T00:00:55.274928696Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:ab57dd6f8423ef7b3ff382bf4ca5ace6063bdca77d441d852c75ec58847dd280\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 25 00:00:55.275930 containerd[1974]: time="2026-04-25T00:00:55.275734884Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.31.4\" with image id \"sha256:4c8cd7d0b10a4df64a5bd90e9845e9d1edbe0e37c2ebfc171bb28698e07abf72\", repo tag \"ghcr.io/flatcar/calico/csi:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:ab57dd6f8423ef7b3ff382bf4ca5ace6063bdca77d441d852c75ec58847dd280\", size \"10348547\" in 1.913054298s" Apr 25 00:00:55.275930 containerd[1974]: time="2026-04-25T00:00:55.275780512Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.31.4\" returns image reference \"sha256:4c8cd7d0b10a4df64a5bd90e9845e9d1edbe0e37c2ebfc171bb28698e07abf72\"" Apr 25 00:00:55.278560 containerd[1974]: time="2026-04-25T00:00:55.277866940Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\"" Apr 25 00:00:55.283902 containerd[1974]: time="2026-04-25T00:00:55.283718561Z" level=info msg="CreateContainer within sandbox \"24e481870027e26f76e7f978298c1ee774e94009de6f1cb58cda6ee685d34567\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Apr 25 00:00:55.415468 containerd[1974]: time="2026-04-25T00:00:55.415416975Z" level=info msg="CreateContainer within sandbox \"24e481870027e26f76e7f978298c1ee774e94009de6f1cb58cda6ee685d34567\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id 
\"33c094dd14bcb1180148bfa8d4d3ee325c898704f76b9d5ac59fedb2279523e2\"" Apr 25 00:00:55.416871 containerd[1974]: time="2026-04-25T00:00:55.416377448Z" level=info msg="StartContainer for \"33c094dd14bcb1180148bfa8d4d3ee325c898704f76b9d5ac59fedb2279523e2\"" Apr 25 00:00:55.466021 systemd[1]: Started cri-containerd-33c094dd14bcb1180148bfa8d4d3ee325c898704f76b9d5ac59fedb2279523e2.scope - libcontainer container 33c094dd14bcb1180148bfa8d4d3ee325c898704f76b9d5ac59fedb2279523e2. Apr 25 00:00:55.508764 containerd[1974]: time="2026-04-25T00:00:55.508702243Z" level=info msg="StartContainer for \"33c094dd14bcb1180148bfa8d4d3ee325c898704f76b9d5ac59fedb2279523e2\" returns successfully" Apr 25 00:00:57.200266 systemd[1]: Started sshd@13-172.31.31.110:22-4.175.71.9:57708.service - OpenSSH per-connection server daemon (4.175.71.9:57708). Apr 25 00:00:58.273430 sshd[6521]: Accepted publickey for core from 4.175.71.9 port 57708 ssh2: RSA SHA256:5HhJ2X4iOQfF5HWKIEVpWTPXYo3rjlnxoO1NrD+aEDg Apr 25 00:00:58.279305 sshd[6521]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 25 00:00:58.286587 systemd-logind[1954]: New session 14 of user core. Apr 25 00:00:58.291996 systemd[1]: Started session-14.scope - Session 14 of User core. 
Apr 25 00:00:58.953935 containerd[1974]: time="2026-04-25T00:00:58.953886445Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 25 00:00:58.956331 containerd[1974]: time="2026-04-25T00:00:58.956255215Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.31.4: active requests=0, bytes read=48415780" Apr 25 00:00:58.958079 containerd[1974]: time="2026-04-25T00:00:58.958001886Z" level=info msg="ImageCreate event name:\"sha256:f7ff80340b9b4973ceda29859065985831588b2898f2b4009f742b5789010898\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 25 00:00:58.962223 containerd[1974]: time="2026-04-25T00:00:58.962174795Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:d212af1da3dd52a633bc9e36653a7d901d95a570f8d51d1968a837dcf6879730\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 25 00:00:58.963523 containerd[1974]: time="2026-04-25T00:00:58.963477173Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" with image id \"sha256:f7ff80340b9b4973ceda29859065985831588b2898f2b4009f742b5789010898\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:d212af1da3dd52a633bc9e36653a7d901d95a570f8d51d1968a837dcf6879730\", size \"49971841\" in 3.685575727s" Apr 25 00:00:58.963523 containerd[1974]: time="2026-04-25T00:00:58.963520819Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" returns image reference \"sha256:f7ff80340b9b4973ceda29859065985831588b2898f2b4009f742b5789010898\"" Apr 25 00:00:58.967312 containerd[1974]: time="2026-04-25T00:00:58.967270019Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\"" Apr 25 00:00:59.042501 containerd[1974]: time="2026-04-25T00:00:59.039723228Z" level=info msg="CreateContainer within sandbox 
\"7c38a3870e15b765513f3951da03a0c155d292ac3aaf771702b124c3a24da68e\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Apr 25 00:00:59.073694 containerd[1974]: time="2026-04-25T00:00:59.073552917Z" level=info msg="CreateContainer within sandbox \"7c38a3870e15b765513f3951da03a0c155d292ac3aaf771702b124c3a24da68e\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"5b257a3cb5e4de848e4838f1212f56d159e56eee10b034ff1f2ac89f2430b017\"" Apr 25 00:00:59.079505 containerd[1974]: time="2026-04-25T00:00:59.078113574Z" level=info msg="StartContainer for \"5b257a3cb5e4de848e4838f1212f56d159e56eee10b034ff1f2ac89f2430b017\"" Apr 25 00:00:59.083511 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3830949243.mount: Deactivated successfully. Apr 25 00:00:59.227323 systemd[1]: Started cri-containerd-5b257a3cb5e4de848e4838f1212f56d159e56eee10b034ff1f2ac89f2430b017.scope - libcontainer container 5b257a3cb5e4de848e4838f1212f56d159e56eee10b034ff1f2ac89f2430b017. Apr 25 00:00:59.302607 containerd[1974]: time="2026-04-25T00:00:59.302560275Z" level=info msg="StartContainer for \"5b257a3cb5e4de848e4838f1212f56d159e56eee10b034ff1f2ac89f2430b017\" returns successfully" Apr 25 00:00:59.928409 sshd[6521]: pam_unix(sshd:session): session closed for user core Apr 25 00:00:59.933921 systemd[1]: sshd@13-172.31.31.110:22-4.175.71.9:57708.service: Deactivated successfully. Apr 25 00:00:59.937778 systemd[1]: session-14.scope: Deactivated successfully. Apr 25 00:00:59.941278 systemd-logind[1954]: Session 14 logged out. Waiting for processes to exit. Apr 25 00:00:59.943977 systemd-logind[1954]: Removed session 14. 
Apr 25 00:01:00.339475 kubelet[3188]: I0425 00:01:00.327879 3188 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="calico-system/calico-apiserver-848b4486d7-dvwgq" podStartSLOduration=40.590893868 podStartE2EDuration="1m26.327859693s" podCreationTimestamp="2026-04-24 23:59:34 +0000 UTC" firstStartedPulling="2026-04-25 00:00:13.22840916 +0000 UTC m=+55.897537892" lastFinishedPulling="2026-04-25 00:00:58.965374976 +0000 UTC m=+101.634503717" observedRunningTime="2026-04-25 00:01:00.319854642 +0000 UTC m=+102.988983384" watchObservedRunningTime="2026-04-25 00:01:00.327859693 +0000 UTC m=+102.996988433" Apr 25 00:01:05.131636 systemd[1]: Started sshd@14-172.31.31.110:22-4.175.71.9:57716.service - OpenSSH per-connection server daemon (4.175.71.9:57716). Apr 25 00:01:06.275823 sshd[6624]: Accepted publickey for core from 4.175.71.9 port 57716 ssh2: RSA SHA256:5HhJ2X4iOQfF5HWKIEVpWTPXYo3rjlnxoO1NrD+aEDg Apr 25 00:01:06.279988 sshd[6624]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 25 00:01:06.291387 systemd-logind[1954]: New session 15 of user core. Apr 25 00:01:06.297017 systemd[1]: Started session-15.scope - Session 15 of User core. 
Apr 25 00:01:07.862356 containerd[1974]: time="2026-04-25T00:01:07.862293084Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 25 00:01:07.865562 containerd[1974]: time="2026-04-25T00:01:07.864852656Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.31.4: active requests=0, bytes read=52406348" Apr 25 00:01:07.881889 containerd[1974]: time="2026-04-25T00:01:07.881823158Z" level=info msg="ImageCreate event name:\"sha256:ff033cc89dab51090bfa1b04e155a5ce1e3b1f324f74b7b2be0dd6f0b6b10e89\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 25 00:01:07.884906 containerd[1974]: time="2026-04-25T00:01:07.884413262Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:99b8bb50141ca55b4b6ddfcf2f2fbde838265508ab2ac96ed08e72cd39800713\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 25 00:01:07.885590 containerd[1974]: time="2026-04-25T00:01:07.885277985Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\" with image id \"sha256:ff033cc89dab51090bfa1b04e155a5ce1e3b1f324f74b7b2be0dd6f0b6b10e89\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:99b8bb50141ca55b4b6ddfcf2f2fbde838265508ab2ac96ed08e72cd39800713\", size \"53962361\" in 8.917968476s" Apr 25 00:01:07.885590 containerd[1974]: time="2026-04-25T00:01:07.885320841Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\" returns image reference \"sha256:ff033cc89dab51090bfa1b04e155a5ce1e3b1f324f74b7b2be0dd6f0b6b10e89\"" Apr 25 00:01:07.888517 sshd[6624]: pam_unix(sshd:session): session closed for user core Apr 25 00:01:07.895050 systemd[1]: sshd@14-172.31.31.110:22-4.175.71.9:57716.service: Deactivated successfully. Apr 25 00:01:07.897757 systemd[1]: session-15.scope: Deactivated successfully. 
Apr 25 00:01:07.907911 systemd-logind[1954]: Session 15 logged out. Waiting for processes to exit. Apr 25 00:01:07.910297 systemd-logind[1954]: Removed session 15. Apr 25 00:01:07.968061 containerd[1974]: time="2026-04-25T00:01:07.968012835Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\"" Apr 25 00:01:08.175829 containerd[1974]: time="2026-04-25T00:01:08.175768655Z" level=info msg="CreateContainer within sandbox \"e7a0549722c7a5ce69c5de57b025c65b982c9d22191f1d57e8cffdcffbc55734\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Apr 25 00:01:08.208836 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1744255859.mount: Deactivated successfully. Apr 25 00:01:08.219999 containerd[1974]: time="2026-04-25T00:01:08.219513817Z" level=info msg="CreateContainer within sandbox \"e7a0549722c7a5ce69c5de57b025c65b982c9d22191f1d57e8cffdcffbc55734\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"85bfff6249d0d550ad5558aa27b0cd381fda8653f6c0ceebfbb57bf5879ac8f9\"" Apr 25 00:01:08.221610 containerd[1974]: time="2026-04-25T00:01:08.220515029Z" level=info msg="StartContainer for \"85bfff6249d0d550ad5558aa27b0cd381fda8653f6c0ceebfbb57bf5879ac8f9\"" Apr 25 00:01:08.428643 containerd[1974]: time="2026-04-25T00:01:08.426555940Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 25 00:01:08.430925 containerd[1974]: time="2026-04-25T00:01:08.429941803Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.31.4: active requests=0, bytes read=77" Apr 25 00:01:08.434155 containerd[1974]: time="2026-04-25T00:01:08.434100508Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" with image id \"sha256:f7ff80340b9b4973ceda29859065985831588b2898f2b4009f742b5789010898\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.31.4\", repo digest 
\"ghcr.io/flatcar/calico/apiserver@sha256:d212af1da3dd52a633bc9e36653a7d901d95a570f8d51d1968a837dcf6879730\", size \"49971841\" in 466.046936ms" Apr 25 00:01:08.434280 containerd[1974]: time="2026-04-25T00:01:08.434155988Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" returns image reference \"sha256:f7ff80340b9b4973ceda29859065985831588b2898f2b4009f742b5789010898\"" Apr 25 00:01:08.439093 containerd[1974]: time="2026-04-25T00:01:08.439050868Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\"" Apr 25 00:01:08.447311 containerd[1974]: time="2026-04-25T00:01:08.447209549Z" level=info msg="CreateContainer within sandbox \"6b59521839c1ad39593c2a08eef3b3a1bdad7fd1335bcf7dda718eabea375f68\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Apr 25 00:01:08.499633 containerd[1974]: time="2026-04-25T00:01:08.499053368Z" level=info msg="CreateContainer within sandbox \"6b59521839c1ad39593c2a08eef3b3a1bdad7fd1335bcf7dda718eabea375f68\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"980f137b362fa76c469e7e9aad6deafefa93558e99b0f7061542ec13cc81c857\"" Apr 25 00:01:08.505116 containerd[1974]: time="2026-04-25T00:01:08.505079790Z" level=info msg="StartContainer for \"980f137b362fa76c469e7e9aad6deafefa93558e99b0f7061542ec13cc81c857\"" Apr 25 00:01:08.508051 systemd[1]: Started cri-containerd-85bfff6249d0d550ad5558aa27b0cd381fda8653f6c0ceebfbb57bf5879ac8f9.scope - libcontainer container 85bfff6249d0d550ad5558aa27b0cd381fda8653f6c0ceebfbb57bf5879ac8f9. Apr 25 00:01:08.582406 systemd[1]: Started cri-containerd-980f137b362fa76c469e7e9aad6deafefa93558e99b0f7061542ec13cc81c857.scope - libcontainer container 980f137b362fa76c469e7e9aad6deafefa93558e99b0f7061542ec13cc81c857. 
Apr 25 00:01:08.626953 containerd[1974]: time="2026-04-25T00:01:08.626708549Z" level=info msg="StartContainer for \"85bfff6249d0d550ad5558aa27b0cd381fda8653f6c0ceebfbb57bf5879ac8f9\" returns successfully" Apr 25 00:01:08.672746 containerd[1974]: time="2026-04-25T00:01:08.672683372Z" level=info msg="StartContainer for \"980f137b362fa76c469e7e9aad6deafefa93558e99b0f7061542ec13cc81c857\" returns successfully" Apr 25 00:01:09.571736 systemd[1]: run-containerd-runc-k8s.io-85bfff6249d0d550ad5558aa27b0cd381fda8653f6c0ceebfbb57bf5879ac8f9-runc.IOs68F.mount: Deactivated successfully. Apr 25 00:01:09.706062 kubelet[3188]: I0425 00:01:09.705961 3188 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-7564df694c-4d42l" podStartSLOduration=40.859070411 podStartE2EDuration="1m34.659329442s" podCreationTimestamp="2026-04-24 23:59:35 +0000 UTC" firstStartedPulling="2026-04-25 00:00:14.128691378 +0000 UTC m=+56.797820096" lastFinishedPulling="2026-04-25 00:01:07.928950396 +0000 UTC m=+110.598079127" observedRunningTime="2026-04-25 00:01:09.655953324 +0000 UTC m=+112.325082066" watchObservedRunningTime="2026-04-25 00:01:09.659329442 +0000 UTC m=+112.328458185" Apr 25 00:01:09.746156 kubelet[3188]: I0425 00:01:09.746086 3188 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="calico-system/calico-apiserver-848b4486d7-brf5p" podStartSLOduration=42.323888199 podStartE2EDuration="1m35.746069396s" podCreationTimestamp="2026-04-24 23:59:34 +0000 UTC" firstStartedPulling="2026-04-25 00:00:15.012932346 +0000 UTC m=+57.682061065" lastFinishedPulling="2026-04-25 00:01:08.435113544 +0000 UTC m=+111.104242262" observedRunningTime="2026-04-25 00:01:09.707802468 +0000 UTC m=+112.376931213" watchObservedRunningTime="2026-04-25 00:01:09.746069396 +0000 UTC m=+112.415198141" Apr 25 00:01:13.064425 systemd[1]: Started sshd@15-172.31.31.110:22-4.175.71.9:56212.service - OpenSSH per-connection server daemon 
(4.175.71.9:56212). Apr 25 00:01:14.159423 sshd[6753]: Accepted publickey for core from 4.175.71.9 port 56212 ssh2: RSA SHA256:5HhJ2X4iOQfF5HWKIEVpWTPXYo3rjlnxoO1NrD+aEDg Apr 25 00:01:14.164027 sshd[6753]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 25 00:01:14.169854 systemd-logind[1954]: New session 16 of user core. Apr 25 00:01:14.178062 systemd[1]: Started session-16.scope - Session 16 of User core. Apr 25 00:01:15.817081 sshd[6753]: pam_unix(sshd:session): session closed for user core Apr 25 00:01:15.824322 systemd-logind[1954]: Session 16 logged out. Waiting for processes to exit. Apr 25 00:01:15.824947 systemd[1]: sshd@15-172.31.31.110:22-4.175.71.9:56212.service: Deactivated successfully. Apr 25 00:01:15.829160 systemd[1]: session-16.scope: Deactivated successfully. Apr 25 00:01:15.832376 systemd-logind[1954]: Removed session 16. Apr 25 00:01:16.013023 systemd[1]: Started sshd@16-172.31.31.110:22-4.175.71.9:43392.service - OpenSSH per-connection server daemon (4.175.71.9:43392). 
Apr 25 00:01:16.174247 containerd[1974]: time="2026-04-25T00:01:16.174194474Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 25 00:01:16.175649 containerd[1974]: time="2026-04-25T00:01:16.175502450Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4: active requests=0, bytes read=14704317" Apr 25 00:01:16.178154 containerd[1974]: time="2026-04-25T00:01:16.177107280Z" level=info msg="ImageCreate event name:\"sha256:d7aeb99114cbb6499e9048f43d3faa5f199d1a05ed44165e5974d0368ac32771\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 25 00:01:16.179820 containerd[1974]: time="2026-04-25T00:01:16.179727597Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:e41c0d73bcd33ff28ae2f2983cf781a4509d212e102d53883dbbf436ab3cd97d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 25 00:01:16.180674 containerd[1974]: time="2026-04-25T00:01:16.180496078Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\" with image id \"sha256:d7aeb99114cbb6499e9048f43d3faa5f199d1a05ed44165e5974d0368ac32771\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:e41c0d73bcd33ff28ae2f2983cf781a4509d212e102d53883dbbf436ab3cd97d\", size \"16260314\" in 7.741391743s" Apr 25 00:01:16.180674 containerd[1974]: time="2026-04-25T00:01:16.180538055Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\" returns image reference \"sha256:d7aeb99114cbb6499e9048f43d3faa5f199d1a05ed44165e5974d0368ac32771\"" Apr 25 00:01:16.188228 containerd[1974]: time="2026-04-25T00:01:16.188185441Z" level=info msg="CreateContainer within sandbox \"24e481870027e26f76e7f978298c1ee774e94009de6f1cb58cda6ee685d34567\" for container 
&ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Apr 25 00:01:16.237823 containerd[1974]: time="2026-04-25T00:01:16.236718554Z" level=info msg="CreateContainer within sandbox \"24e481870027e26f76e7f978298c1ee774e94009de6f1cb58cda6ee685d34567\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"8600bcbcac71bbdaa2e2c1c3b8bf6bb517a32153abfa07d047859f361d917869\"" Apr 25 00:01:16.240462 containerd[1974]: time="2026-04-25T00:01:16.240421306Z" level=info msg="StartContainer for \"8600bcbcac71bbdaa2e2c1c3b8bf6bb517a32153abfa07d047859f361d917869\"" Apr 25 00:01:16.337901 systemd[1]: run-containerd-runc-k8s.io-8600bcbcac71bbdaa2e2c1c3b8bf6bb517a32153abfa07d047859f361d917869-runc.giN0ji.mount: Deactivated successfully. Apr 25 00:01:16.347214 systemd[1]: Started cri-containerd-8600bcbcac71bbdaa2e2c1c3b8bf6bb517a32153abfa07d047859f361d917869.scope - libcontainer container 8600bcbcac71bbdaa2e2c1c3b8bf6bb517a32153abfa07d047859f361d917869. Apr 25 00:01:16.405011 containerd[1974]: time="2026-04-25T00:01:16.404959252Z" level=info msg="StartContainer for \"8600bcbcac71bbdaa2e2c1c3b8bf6bb517a32153abfa07d047859f361d917869\" returns successfully" Apr 25 00:01:16.638122 kubelet[3188]: I0425 00:01:16.637923 3188 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="calico-system/csi-node-driver-wllpb" podStartSLOduration=37.92176939 podStartE2EDuration="1m41.637900684s" podCreationTimestamp="2026-04-24 23:59:35 +0000 UTC" firstStartedPulling="2026-04-25 00:00:12.465565252 +0000 UTC m=+55.134693984" lastFinishedPulling="2026-04-25 00:01:16.18169654 +0000 UTC m=+118.850825278" observedRunningTime="2026-04-25 00:01:16.63697384 +0000 UTC m=+119.306102581" watchObservedRunningTime="2026-04-25 00:01:16.637900684 +0000 UTC m=+119.307029425" Apr 25 00:01:17.026648 kubelet[3188]: I0425 00:01:17.022548 3188 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: 
/var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Apr 25 00:01:17.027949 kubelet[3188]: I0425 00:01:17.026684 3188 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Apr 25 00:01:17.114072 sshd[6803]: Accepted publickey for core from 4.175.71.9 port 43392 ssh2: RSA SHA256:5HhJ2X4iOQfF5HWKIEVpWTPXYo3rjlnxoO1NrD+aEDg Apr 25 00:01:17.125232 sshd[6803]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 25 00:01:17.134997 systemd-logind[1954]: New session 17 of user core. Apr 25 00:01:17.140019 systemd[1]: Started session-17.scope - Session 17 of User core. Apr 25 00:01:18.478356 sshd[6803]: pam_unix(sshd:session): session closed for user core Apr 25 00:01:18.484384 systemd[1]: sshd@16-172.31.31.110:22-4.175.71.9:43392.service: Deactivated successfully. Apr 25 00:01:18.487945 systemd[1]: session-17.scope: Deactivated successfully. Apr 25 00:01:18.490548 systemd-logind[1954]: Session 17 logged out. Waiting for processes to exit. Apr 25 00:01:18.492416 systemd-logind[1954]: Removed session 17. Apr 25 00:01:18.647898 systemd[1]: Started sshd@17-172.31.31.110:22-4.175.71.9:43408.service - OpenSSH per-connection server daemon (4.175.71.9:43408). Apr 25 00:01:19.764940 sshd[6872]: Accepted publickey for core from 4.175.71.9 port 43408 ssh2: RSA SHA256:5HhJ2X4iOQfF5HWKIEVpWTPXYo3rjlnxoO1NrD+aEDg Apr 25 00:01:19.770365 sshd[6872]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 25 00:01:19.775991 systemd-logind[1954]: New session 18 of user core. Apr 25 00:01:19.779987 systemd[1]: Started session-18.scope - Session 18 of User core. Apr 25 00:01:20.798465 sshd[6872]: pam_unix(sshd:session): session closed for user core Apr 25 00:01:20.808222 systemd[1]: sshd@17-172.31.31.110:22-4.175.71.9:43408.service: Deactivated successfully. Apr 25 00:01:20.810604 systemd[1]: session-18.scope: Deactivated successfully. 
Apr 25 00:01:20.811930 systemd-logind[1954]: Session 18 logged out. Waiting for processes to exit. Apr 25 00:01:20.815568 systemd-logind[1954]: Removed session 18. Apr 25 00:01:21.210868 systemd[1]: run-containerd-runc-k8s.io-91f0c413e986226204f99ea7f70f7c5f98860d8f3bab8c1c4efb1724d80cf348-runc.ygNUXO.mount: Deactivated successfully. Apr 25 00:01:25.980144 systemd[1]: Started sshd@18-172.31.31.110:22-4.175.71.9:52716.service - OpenSSH per-connection server daemon (4.175.71.9:52716). Apr 25 00:01:27.074271 sshd[6918]: Accepted publickey for core from 4.175.71.9 port 52716 ssh2: RSA SHA256:5HhJ2X4iOQfF5HWKIEVpWTPXYo3rjlnxoO1NrD+aEDg Apr 25 00:01:27.078673 sshd[6918]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 25 00:01:27.086504 systemd-logind[1954]: New session 19 of user core. Apr 25 00:01:27.091085 systemd[1]: Started session-19.scope - Session 19 of User core. Apr 25 00:01:28.461075 sshd[6918]: pam_unix(sshd:session): session closed for user core Apr 25 00:01:28.465165 systemd[1]: sshd@18-172.31.31.110:22-4.175.71.9:52716.service: Deactivated successfully. Apr 25 00:01:28.467426 systemd[1]: session-19.scope: Deactivated successfully. Apr 25 00:01:28.469281 systemd-logind[1954]: Session 19 logged out. Waiting for processes to exit. Apr 25 00:01:28.470693 systemd-logind[1954]: Removed session 19. Apr 25 00:01:28.633231 systemd[1]: Started sshd@19-172.31.31.110:22-4.175.71.9:52730.service - OpenSSH per-connection server daemon (4.175.71.9:52730). Apr 25 00:01:29.626268 sshd[6931]: Accepted publickey for core from 4.175.71.9 port 52730 ssh2: RSA SHA256:5HhJ2X4iOQfF5HWKIEVpWTPXYo3rjlnxoO1NrD+aEDg Apr 25 00:01:29.628231 sshd[6931]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 25 00:01:29.632868 systemd-logind[1954]: New session 20 of user core. Apr 25 00:01:29.639106 systemd[1]: Started session-20.scope - Session 20 of User core. 
Apr 25 00:01:30.851627 sshd[6931]: pam_unix(sshd:session): session closed for user core Apr 25 00:01:30.859648 systemd[1]: sshd@19-172.31.31.110:22-4.175.71.9:52730.service: Deactivated successfully. Apr 25 00:01:30.863927 systemd[1]: session-20.scope: Deactivated successfully. Apr 25 00:01:30.865814 systemd-logind[1954]: Session 20 logged out. Waiting for processes to exit. Apr 25 00:01:30.867575 systemd-logind[1954]: Removed session 20. Apr 25 00:01:30.985803 systemd[1]: run-containerd-runc-k8s.io-19b22f5fcafa74b196e5a6566598d3258ab5724f41bf8ad07aa927cccda220a2-runc.KD3Hsv.mount: Deactivated successfully. Apr 25 00:01:31.032248 systemd[1]: Started sshd@20-172.31.31.110:22-4.175.71.9:52736.service - OpenSSH per-connection server daemon (4.175.71.9:52736). Apr 25 00:01:32.087832 sshd[6958]: Accepted publickey for core from 4.175.71.9 port 52736 ssh2: RSA SHA256:5HhJ2X4iOQfF5HWKIEVpWTPXYo3rjlnxoO1NrD+aEDg Apr 25 00:01:32.091438 sshd[6958]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 25 00:01:32.098951 systemd-logind[1954]: New session 21 of user core. Apr 25 00:01:32.105013 systemd[1]: Started session-21.scope - Session 21 of User core. Apr 25 00:01:33.556843 sshd[6958]: pam_unix(sshd:session): session closed for user core Apr 25 00:01:33.565006 systemd[1]: sshd@20-172.31.31.110:22-4.175.71.9:52736.service: Deactivated successfully. Apr 25 00:01:33.567849 systemd[1]: session-21.scope: Deactivated successfully. Apr 25 00:01:33.569826 systemd-logind[1954]: Session 21 logged out. Waiting for processes to exit. Apr 25 00:01:33.571385 systemd-logind[1954]: Removed session 21. Apr 25 00:01:33.725769 systemd[1]: Started sshd@21-172.31.31.110:22-4.175.71.9:52750.service - OpenSSH per-connection server daemon (4.175.71.9:52750). 
Apr 25 00:01:34.719661 sshd[6991]: Accepted publickey for core from 4.175.71.9 port 52750 ssh2: RSA SHA256:5HhJ2X4iOQfF5HWKIEVpWTPXYo3rjlnxoO1NrD+aEDg Apr 25 00:01:34.720352 sshd[6991]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 25 00:01:34.726037 systemd-logind[1954]: New session 22 of user core. Apr 25 00:01:34.734043 systemd[1]: Started session-22.scope - Session 22 of User core. Apr 25 00:01:36.500113 sshd[6991]: pam_unix(sshd:session): session closed for user core Apr 25 00:01:36.506116 systemd-logind[1954]: Session 22 logged out. Waiting for processes to exit. Apr 25 00:01:36.507675 systemd[1]: sshd@21-172.31.31.110:22-4.175.71.9:52750.service: Deactivated successfully. Apr 25 00:01:36.510323 systemd[1]: session-22.scope: Deactivated successfully. Apr 25 00:01:36.511640 systemd-logind[1954]: Removed session 22. Apr 25 00:01:36.668112 systemd[1]: Started sshd@22-172.31.31.110:22-4.175.71.9:46780.service - OpenSSH per-connection server daemon (4.175.71.9:46780). Apr 25 00:01:37.676819 sshd[7014]: Accepted publickey for core from 4.175.71.9 port 46780 ssh2: RSA SHA256:5HhJ2X4iOQfF5HWKIEVpWTPXYo3rjlnxoO1NrD+aEDg Apr 25 00:01:37.679462 sshd[7014]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 25 00:01:37.686098 systemd-logind[1954]: New session 23 of user core. Apr 25 00:01:37.691015 systemd[1]: Started session-23.scope - Session 23 of User core. Apr 25 00:01:38.455093 sshd[7014]: pam_unix(sshd:session): session closed for user core Apr 25 00:01:38.459027 systemd[1]: sshd@22-172.31.31.110:22-4.175.71.9:46780.service: Deactivated successfully. Apr 25 00:01:38.461408 systemd[1]: session-23.scope: Deactivated successfully. Apr 25 00:01:38.463557 systemd-logind[1954]: Session 23 logged out. Waiting for processes to exit. Apr 25 00:01:38.465415 systemd-logind[1954]: Removed session 23. 
Apr 25 00:01:43.626238 systemd[1]: Started sshd@23-172.31.31.110:22-4.175.71.9:46792.service - OpenSSH per-connection server daemon (4.175.71.9:46792). Apr 25 00:01:44.661917 sshd[7070]: Accepted publickey for core from 4.175.71.9 port 46792 ssh2: RSA SHA256:5HhJ2X4iOQfF5HWKIEVpWTPXYo3rjlnxoO1NrD+aEDg Apr 25 00:01:44.662619 sshd[7070]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 25 00:01:44.668155 systemd-logind[1954]: New session 24 of user core. Apr 25 00:01:44.672022 systemd[1]: Started session-24.scope - Session 24 of User core. Apr 25 00:01:45.670092 sshd[7070]: pam_unix(sshd:session): session closed for user core Apr 25 00:01:45.676402 systemd-logind[1954]: Session 24 logged out. Waiting for processes to exit. Apr 25 00:01:45.677654 systemd[1]: sshd@23-172.31.31.110:22-4.175.71.9:46792.service: Deactivated successfully. Apr 25 00:01:45.680600 systemd[1]: session-24.scope: Deactivated successfully. Apr 25 00:01:45.682235 systemd-logind[1954]: Removed session 24. Apr 25 00:01:50.835028 systemd[1]: Started sshd@24-172.31.31.110:22-4.175.71.9:50714.service - OpenSSH per-connection server daemon (4.175.71.9:50714). Apr 25 00:01:51.867276 sshd[7115]: Accepted publickey for core from 4.175.71.9 port 50714 ssh2: RSA SHA256:5HhJ2X4iOQfF5HWKIEVpWTPXYo3rjlnxoO1NrD+aEDg Apr 25 00:01:51.870342 sshd[7115]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 25 00:01:51.882513 systemd-logind[1954]: New session 25 of user core. Apr 25 00:01:51.886998 systemd[1]: Started session-25.scope - Session 25 of User core. Apr 25 00:01:53.128627 sshd[7115]: pam_unix(sshd:session): session closed for user core Apr 25 00:01:53.133625 systemd[1]: sshd@24-172.31.31.110:22-4.175.71.9:50714.service: Deactivated successfully. Apr 25 00:01:53.133651 systemd-logind[1954]: Session 25 logged out. Waiting for processes to exit. Apr 25 00:01:53.137266 systemd[1]: session-25.scope: Deactivated successfully. 
Apr 25 00:01:53.139537 systemd-logind[1954]: Removed session 25. Apr 25 00:01:58.310085 systemd[1]: Started sshd@25-172.31.31.110:22-4.175.71.9:33990.service - OpenSSH per-connection server daemon (4.175.71.9:33990). Apr 25 00:01:59.332843 sshd[7130]: Accepted publickey for core from 4.175.71.9 port 33990 ssh2: RSA SHA256:5HhJ2X4iOQfF5HWKIEVpWTPXYo3rjlnxoO1NrD+aEDg Apr 25 00:01:59.334134 sshd[7130]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 25 00:01:59.339957 systemd-logind[1954]: New session 26 of user core. Apr 25 00:01:59.346327 systemd[1]: Started session-26.scope - Session 26 of User core. Apr 25 00:02:00.448682 sshd[7130]: pam_unix(sshd:session): session closed for user core Apr 25 00:02:00.453651 systemd-logind[1954]: Session 26 logged out. Waiting for processes to exit. Apr 25 00:02:00.454296 systemd[1]: sshd@25-172.31.31.110:22-4.175.71.9:33990.service: Deactivated successfully. Apr 25 00:02:00.457522 systemd[1]: session-26.scope: Deactivated successfully. Apr 25 00:02:00.459158 systemd-logind[1954]: Removed session 26. Apr 25 00:02:05.619144 systemd[1]: Started sshd@26-172.31.31.110:22-4.175.71.9:37620.service - OpenSSH per-connection server daemon (4.175.71.9:37620). Apr 25 00:02:06.670311 sshd[7164]: Accepted publickey for core from 4.175.71.9 port 37620 ssh2: RSA SHA256:5HhJ2X4iOQfF5HWKIEVpWTPXYo3rjlnxoO1NrD+aEDg Apr 25 00:02:06.675026 sshd[7164]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 25 00:02:06.681670 systemd-logind[1954]: New session 27 of user core. Apr 25 00:02:06.684008 systemd[1]: Started session-27.scope - Session 27 of User core. Apr 25 00:02:08.144647 sshd[7164]: pam_unix(sshd:session): session closed for user core Apr 25 00:02:08.148986 systemd[1]: sshd@26-172.31.31.110:22-4.175.71.9:37620.service: Deactivated successfully. Apr 25 00:02:08.151628 systemd[1]: session-27.scope: Deactivated successfully. 
Apr 25 00:02:08.153694 systemd-logind[1954]: Session 27 logged out. Waiting for processes to exit. Apr 25 00:02:08.155844 systemd-logind[1954]: Removed session 27. Apr 25 00:02:17.182546 systemd[1]: run-containerd-runc-k8s.io-85bfff6249d0d550ad5558aa27b0cd381fda8653f6c0ceebfbb57bf5879ac8f9-runc.UDtcvo.mount: Deactivated successfully. Apr 25 00:02:30.989972 systemd[1]: run-containerd-runc-k8s.io-19b22f5fcafa74b196e5a6566598d3258ab5724f41bf8ad07aa927cccda220a2-runc.cQu1C3.mount: Deactivated successfully. Apr 25 00:02:39.607649 systemd[1]: run-containerd-runc-k8s.io-85bfff6249d0d550ad5558aa27b0cd381fda8653f6c0ceebfbb57bf5879ac8f9-runc.8qrcw0.mount: Deactivated successfully. Apr 25 00:02:57.894037 systemd[1]: cri-containerd-c418045851606f1ef65eca56b3572f07751409a908e3af94b9f106a0411550d9.scope: Deactivated successfully. Apr 25 00:02:57.894348 systemd[1]: cri-containerd-c418045851606f1ef65eca56b3572f07751409a908e3af94b9f106a0411550d9.scope: Consumed 3.783s CPU time, 16.0M memory peak, 0B memory swap peak. Apr 25 00:02:58.212324 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-c418045851606f1ef65eca56b3572f07751409a908e3af94b9f106a0411550d9-rootfs.mount: Deactivated successfully. Apr 25 00:02:58.345295 containerd[1974]: time="2026-04-25T00:02:58.268276390Z" level=info msg="shim disconnected" id=c418045851606f1ef65eca56b3572f07751409a908e3af94b9f106a0411550d9 namespace=k8s.io Apr 25 00:02:58.345295 containerd[1974]: time="2026-04-25T00:02:58.345296023Z" level=warning msg="cleaning up after shim disconnected" id=c418045851606f1ef65eca56b3572f07751409a908e3af94b9f106a0411550d9 namespace=k8s.io Apr 25 00:02:58.345295 containerd[1974]: time="2026-04-25T00:02:58.345317341Z" level=info msg="cleaning up dead shim" namespace=k8s.io Apr 25 00:02:58.774316 systemd[1]: cri-containerd-5ac310361b7b1fccffe86a33a8f822fc67184c861964a9fad6873e4268348be6.scope: Deactivated successfully. 
Apr 25 00:02:58.775238 systemd[1]: cri-containerd-5ac310361b7b1fccffe86a33a8f822fc67184c861964a9fad6873e4268348be6.scope: Consumed 8.203s CPU time. Apr 25 00:02:58.799986 containerd[1974]: time="2026-04-25T00:02:58.799915193Z" level=info msg="shim disconnected" id=5ac310361b7b1fccffe86a33a8f822fc67184c861964a9fad6873e4268348be6 namespace=k8s.io Apr 25 00:02:58.799986 containerd[1974]: time="2026-04-25T00:02:58.799986737Z" level=warning msg="cleaning up after shim disconnected" id=5ac310361b7b1fccffe86a33a8f822fc67184c861964a9fad6873e4268348be6 namespace=k8s.io Apr 25 00:02:58.800281 containerd[1974]: time="2026-04-25T00:02:58.799998565Z" level=info msg="cleaning up dead shim" namespace=k8s.io Apr 25 00:02:58.802602 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-5ac310361b7b1fccffe86a33a8f822fc67184c861964a9fad6873e4268348be6-rootfs.mount: Deactivated successfully. Apr 25 00:02:59.402191 kubelet[3188]: I0425 00:02:59.393316 3188 scope.go:122] "RemoveContainer" containerID="5ac310361b7b1fccffe86a33a8f822fc67184c861964a9fad6873e4268348be6" Apr 25 00:02:59.406195 kubelet[3188]: I0425 00:02:59.402911 3188 scope.go:122] "RemoveContainer" containerID="c418045851606f1ef65eca56b3572f07751409a908e3af94b9f106a0411550d9" Apr 25 00:02:59.525959 containerd[1974]: time="2026-04-25T00:02:59.525909108Z" level=info msg="CreateContainer within sandbox \"649415c480f9cbf9c235f11a830acda218ee254dbb0765dd5d461880729993aa\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:1,}" Apr 25 00:02:59.531930 containerd[1974]: time="2026-04-25T00:02:59.530303965Z" level=info msg="CreateContainer within sandbox \"f8e424934444519693af946c7b498f7378a7c57527eea5f714afb93db5a846fc\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}" Apr 25 00:02:59.676842 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount927435402.mount: Deactivated successfully. 
Apr 25 00:02:59.686274 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2757498339.mount: Deactivated successfully. Apr 25 00:02:59.717157 containerd[1974]: time="2026-04-25T00:02:59.717098233Z" level=info msg="CreateContainer within sandbox \"f8e424934444519693af946c7b498f7378a7c57527eea5f714afb93db5a846fc\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"eee30041f2d511c2c9203bba7a1b412c444155b7e025b6dcf5ea413c036d6a04\"" Apr 25 00:02:59.722265 containerd[1974]: time="2026-04-25T00:02:59.722219808Z" level=info msg="StartContainer for \"eee30041f2d511c2c9203bba7a1b412c444155b7e025b6dcf5ea413c036d6a04\"" Apr 25 00:02:59.725835 containerd[1974]: time="2026-04-25T00:02:59.724121754Z" level=info msg="CreateContainer within sandbox \"649415c480f9cbf9c235f11a830acda218ee254dbb0765dd5d461880729993aa\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:1,} returns container id \"b8ac7263b2c36ac2d993974031e07644063ca7c6720a9ec7e8acada761ff8e98\"" Apr 25 00:02:59.725835 containerd[1974]: time="2026-04-25T00:02:59.724748652Z" level=info msg="StartContainer for \"b8ac7263b2c36ac2d993974031e07644063ca7c6720a9ec7e8acada761ff8e98\"" Apr 25 00:02:59.807141 systemd[1]: Started cri-containerd-b8ac7263b2c36ac2d993974031e07644063ca7c6720a9ec7e8acada761ff8e98.scope - libcontainer container b8ac7263b2c36ac2d993974031e07644063ca7c6720a9ec7e8acada761ff8e98. Apr 25 00:02:59.808406 systemd[1]: Started cri-containerd-eee30041f2d511c2c9203bba7a1b412c444155b7e025b6dcf5ea413c036d6a04.scope - libcontainer container eee30041f2d511c2c9203bba7a1b412c444155b7e025b6dcf5ea413c036d6a04. 
Apr 25 00:02:59.904777 containerd[1974]: time="2026-04-25T00:02:59.904732053Z" level=info msg="StartContainer for \"b8ac7263b2c36ac2d993974031e07644063ca7c6720a9ec7e8acada761ff8e98\" returns successfully" Apr 25 00:02:59.904777 containerd[1974]: time="2026-04-25T00:02:59.904732093Z" level=info msg="StartContainer for \"eee30041f2d511c2c9203bba7a1b412c444155b7e025b6dcf5ea413c036d6a04\" returns successfully" Apr 25 00:03:01.219896 kubelet[3188]: E0425 00:03:01.219846 3188 controller.go:251] "Failed to update lease" err="Put \"https://172.31.31.110:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ip-172-31-31-110?timeout=10s\": context deadline exceeded" Apr 25 00:03:02.549155 systemd[1]: cri-containerd-6b89e68433ed79e525e9aa4d163517e371c703cc868aa6e33df1b161bcf93be9.scope: Deactivated successfully. Apr 25 00:03:02.549452 systemd[1]: cri-containerd-6b89e68433ed79e525e9aa4d163517e371c703cc868aa6e33df1b161bcf93be9.scope: Consumed 1.981s CPU time, 14.2M memory peak, 0B memory swap peak. Apr 25 00:03:02.586927 containerd[1974]: time="2026-04-25T00:03:02.586577974Z" level=info msg="shim disconnected" id=6b89e68433ed79e525e9aa4d163517e371c703cc868aa6e33df1b161bcf93be9 namespace=k8s.io Apr 25 00:03:02.586927 containerd[1974]: time="2026-04-25T00:03:02.586661099Z" level=warning msg="cleaning up after shim disconnected" id=6b89e68433ed79e525e9aa4d163517e371c703cc868aa6e33df1b161bcf93be9 namespace=k8s.io Apr 25 00:03:02.586927 containerd[1974]: time="2026-04-25T00:03:02.586674471Z" level=info msg="cleaning up dead shim" namespace=k8s.io Apr 25 00:03:02.594700 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-6b89e68433ed79e525e9aa4d163517e371c703cc868aa6e33df1b161bcf93be9-rootfs.mount: Deactivated successfully. 
Apr 25 00:03:03.352259 kubelet[3188]: I0425 00:03:03.352230 3188 scope.go:122] "RemoveContainer" containerID="6b89e68433ed79e525e9aa4d163517e371c703cc868aa6e33df1b161bcf93be9" Apr 25 00:03:03.355111 containerd[1974]: time="2026-04-25T00:03:03.355068896Z" level=info msg="CreateContainer within sandbox \"5dd646876710cede4cb79e81db6e93617b9c6753564ad2d2f6ff070bc89e8252\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:1,}" Apr 25 00:03:03.388652 containerd[1974]: time="2026-04-25T00:03:03.388600954Z" level=info msg="CreateContainer within sandbox \"5dd646876710cede4cb79e81db6e93617b9c6753564ad2d2f6ff070bc89e8252\" for &ContainerMetadata{Name:kube-scheduler,Attempt:1,} returns container id \"c9f008b0a33fee09e855504c8f54aecee1b496d71e4be40c38643ad6858c21df\"" Apr 25 00:03:03.389449 containerd[1974]: time="2026-04-25T00:03:03.389188869Z" level=info msg="StartContainer for \"c9f008b0a33fee09e855504c8f54aecee1b496d71e4be40c38643ad6858c21df\"" Apr 25 00:03:03.430023 systemd[1]: Started cri-containerd-c9f008b0a33fee09e855504c8f54aecee1b496d71e4be40c38643ad6858c21df.scope - libcontainer container c9f008b0a33fee09e855504c8f54aecee1b496d71e4be40c38643ad6858c21df. Apr 25 00:03:03.479225 containerd[1974]: time="2026-04-25T00:03:03.479162604Z" level=info msg="StartContainer for \"c9f008b0a33fee09e855504c8f54aecee1b496d71e4be40c38643ad6858c21df\" returns successfully" Apr 25 00:03:09.539212 systemd[1]: run-containerd-runc-k8s.io-85bfff6249d0d550ad5558aa27b0cd381fda8653f6c0ceebfbb57bf5879ac8f9-runc.54Qov5.mount: Deactivated successfully.