Apr 17 23:44:22.141075 kernel: Linux version 6.6.127-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 13.3.1_p20240614 p17) 13.3.1 20240614, GNU ld (Gentoo 2.42 p3) 2.42.0) #1 SMP PREEMPT_DYNAMIC Fri Apr 17 22:11:20 -00 2026
Apr 17 23:44:22.141103 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty1 console=ttyS0,115200n8 earlyprintk=ttyS0,115200 flatcar.first_boot=detected flatcar.oem.id=azure flatcar.autologin verity.usrhash=e69cfa144bf8cf6f0b7e7881c91c17228ba9dbcb6c99d9692bced9ddba34ee3a
Apr 17 23:44:22.141118 kernel: BIOS-provided physical RAM map:
Apr 17 23:44:22.141125 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009ffff] usable
Apr 17 23:44:22.141131 kernel: BIOS-e820: [mem 0x00000000000c0000-0x00000000000fffff] reserved
Apr 17 23:44:22.141137 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000000437dfff] usable
Apr 17 23:44:22.141148 kernel: BIOS-e820: [mem 0x000000000437e000-0x000000000477dfff] reserved
Apr 17 23:44:22.141155 kernel: BIOS-e820: [mem 0x000000000477e000-0x000000003ff1efff] usable
Apr 17 23:44:22.141163 kernel: BIOS-e820: [mem 0x000000003ff1f000-0x000000003ff73fff] type 20
Apr 17 23:44:22.141174 kernel: BIOS-e820: [mem 0x000000003ff74000-0x000000003ffc8fff] reserved
Apr 17 23:44:22.141181 kernel: BIOS-e820: [mem 0x000000003ffc9000-0x000000003fffafff] ACPI data
Apr 17 23:44:22.141187 kernel: BIOS-e820: [mem 0x000000003fffb000-0x000000003fffefff] ACPI NVS
Apr 17 23:44:22.141195 kernel: BIOS-e820: [mem 0x000000003ffff000-0x000000003fffffff] usable
Apr 17 23:44:22.141204 kernel: BIOS-e820: [mem 0x0000000100000000-0x00000002bfffffff] usable
Apr 17 23:44:22.141214 kernel: printk: bootconsole [earlyser0] enabled
Apr 17 23:44:22.141223 kernel: NX (Execute Disable) protection: active
Apr 17 23:44:22.141232 kernel: APIC: Static calls initialized
Apr 17 23:44:22.141239 kernel: efi: EFI v2.7 by Microsoft
Apr 17 23:44:22.141246 kernel: efi: ACPI=0x3fffa000 ACPI 2.0=0x3fffa014 SMBIOS=0x3ff88000 SMBIOS 3.0=0x3ff86000 MEMATTR=0x3f41e418
Apr 17 23:44:22.141257 kernel: SMBIOS 3.1.0 present.
Apr 17 23:44:22.141264 kernel: DMI: Microsoft Corporation Virtual Machine/Virtual Machine, BIOS Hyper-V UEFI Release v4.1 01/08/2026
Apr 17 23:44:22.141271 kernel: Hypervisor detected: Microsoft Hyper-V
Apr 17 23:44:22.141282 kernel: Hyper-V: privilege flags low 0x2e7f, high 0x3b8030, hints 0x64e24, misc 0xbed7b2
Apr 17 23:44:22.141289 kernel: Hyper-V: Host Build 10.0.26102.1277-1-0
Apr 17 23:44:22.141296 kernel: Hyper-V: Nested features: 0x1e0101
Apr 17 23:44:22.141309 kernel: Hyper-V: LAPIC Timer Frequency: 0x30d40
Apr 17 23:44:22.141316 kernel: Hyper-V: Using hypercall for remote TLB flush
Apr 17 23:44:22.141323 kernel: clocksource: hyperv_clocksource_tsc_page: mask: 0xffffffffffffffff max_cycles: 0x24e6a1710, max_idle_ns: 440795202120 ns
Apr 17 23:44:22.141333 kernel: clocksource: hyperv_clocksource_msr: mask: 0xffffffffffffffff max_cycles: 0x24e6a1710, max_idle_ns: 440795202120 ns
Apr 17 23:44:22.141342 kernel: tsc: Marking TSC unstable due to running on Hyper-V
Apr 17 23:44:22.141349 kernel: tsc: Detected 2593.905 MHz processor
Apr 17 23:44:22.141357 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Apr 17 23:44:22.141368 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Apr 17 23:44:22.141375 kernel: last_pfn = 0x2c0000 max_arch_pfn = 0x400000000
Apr 17 23:44:22.141389 kernel: MTRR map: 4 entries (2 fixed + 2 variable; max 18), built from 8 variable MTRRs
Apr 17 23:44:22.141396 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Apr 17 23:44:22.141408 kernel: e820: update [mem 0x40000000-0xffffffff] usable ==> reserved
Apr 17 23:44:22.141415 kernel: last_pfn = 0x40000 max_arch_pfn = 0x400000000
Apr 17 23:44:22.141422 kernel: Using GB pages for direct mapping
Apr 17 23:44:22.141433 kernel: Secure boot disabled
Apr 17 23:44:22.141444 kernel: ACPI: Early table checksum verification disabled
Apr 17 23:44:22.141458 kernel: ACPI: RSDP 0x000000003FFFA014 000024 (v02 VRTUAL)
Apr 17 23:44:22.141466 kernel: ACPI: XSDT 0x000000003FFF90E8 00005C (v01 VRTUAL MICROSFT 00000001 MSFT 00000001)
Apr 17 23:44:22.141474 kernel: ACPI: FACP 0x000000003FFF8000 000114 (v06 VRTUAL MICROSFT 00000001 MSFT 00000001)
Apr 17 23:44:22.141484 kernel: ACPI: DSDT 0x000000003FFD6000 01E22B (v02 MSFTVM DSDT01 00000001 INTL 20230628)
Apr 17 23:44:22.141493 kernel: ACPI: FACS 0x000000003FFFE000 000040
Apr 17 23:44:22.141501 kernel: ACPI: OEM0 0x000000003FFF7000 000064 (v01 VRTUAL MICROSFT 00000001 MSFT 00000001)
Apr 17 23:44:22.141509 kernel: ACPI: SPCR 0x000000003FFF6000 000050 (v02 VRTUAL MICROSFT 00000001 MSFT 00000001)
Apr 17 23:44:22.141522 kernel: ACPI: WAET 0x000000003FFF5000 000028 (v01 VRTUAL MICROSFT 00000001 MSFT 00000001)
Apr 17 23:44:22.141530 kernel: ACPI: APIC 0x000000003FFD5000 000058 (v04 VRTUAL MICROSFT 00000001 MSFT 00000001)
Apr 17 23:44:22.141541 kernel: ACPI: SRAT 0x000000003FFD4000 0001E0 (v02 VRTUAL MICROSFT 00000001 MSFT 00000001)
Apr 17 23:44:22.141549 kernel: ACPI: BGRT 0x000000003FFD3000 000038 (v01 VRTUAL MICROSFT 00000001 MSFT 00000001)
Apr 17 23:44:22.141557 kernel: ACPI: Reserving FACP table memory at [mem 0x3fff8000-0x3fff8113]
Apr 17 23:44:22.141568 kernel: ACPI: Reserving DSDT table memory at [mem 0x3ffd6000-0x3fff422a]
Apr 17 23:44:22.141576 kernel: ACPI: Reserving FACS table memory at [mem 0x3fffe000-0x3fffe03f]
Apr 17 23:44:22.141583 kernel: ACPI: Reserving OEM0 table memory at [mem 0x3fff7000-0x3fff7063]
Apr 17 23:44:22.141594 kernel: ACPI: Reserving SPCR table memory at [mem 0x3fff6000-0x3fff604f]
Apr 17 23:44:22.141605 kernel: ACPI: Reserving WAET table memory at [mem 0x3fff5000-0x3fff5027]
Apr 17 23:44:22.141612 kernel: ACPI: Reserving APIC table memory at [mem 0x3ffd5000-0x3ffd5057]
Apr 17 23:44:22.141624 kernel: ACPI: Reserving SRAT table memory at [mem 0x3ffd4000-0x3ffd41df]
Apr 17 23:44:22.141632 kernel: ACPI: Reserving BGRT table memory at [mem 0x3ffd3000-0x3ffd3037]
Apr 17 23:44:22.141639 kernel: SRAT: PXM 0 -> APIC 0x00 -> Node 0
Apr 17 23:44:22.141651 kernel: SRAT: PXM 0 -> APIC 0x01 -> Node 0
Apr 17 23:44:22.141659 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00000000-0x3fffffff] hotplug
Apr 17 23:44:22.141666 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x100000000-0x2bfffffff] hotplug
Apr 17 23:44:22.141678 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x2c0000000-0xfdfffffff] hotplug
Apr 17 23:44:22.141688 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x1000000000-0xffffffffff] hotplug
Apr 17 23:44:22.141699 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x10000000000-0x1ffffffffff] hotplug
Apr 17 23:44:22.141707 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x20000000000-0x3ffffffffff] hotplug
Apr 17 23:44:22.141719 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x40000000000-0x7ffffffffff] hotplug
Apr 17 23:44:22.141726 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x80000000000-0xfffffffffff] hotplug
Apr 17 23:44:22.141734 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x100000000000-0x1fffffffffff] hotplug
Apr 17 23:44:22.141745 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x200000000000-0x3fffffffffff] hotplug
Apr 17 23:44:22.141753 kernel: NUMA: Node 0 [mem 0x00000000-0x3fffffff] + [mem 0x100000000-0x2bfffffff] -> [mem 0x00000000-0x2bfffffff]
Apr 17 23:44:22.141766 kernel: NODE_DATA(0) allocated [mem 0x2bfffa000-0x2bfffffff]
Apr 17 23:44:22.141775 kernel: Zone ranges:
Apr 17 23:44:22.141786 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Apr 17 23:44:22.141794 kernel: DMA32 [mem 0x0000000001000000-0x00000000ffffffff]
Apr 17 23:44:22.141806 kernel: Normal [mem 0x0000000100000000-0x00000002bfffffff]
Apr 17 23:44:22.141816 kernel: Movable zone start for each node
Apr 17 23:44:22.141828 kernel: Early memory node ranges
Apr 17 23:44:22.141841 kernel: node 0: [mem 0x0000000000001000-0x000000000009ffff]
Apr 17 23:44:22.141856 kernel: node 0: [mem 0x0000000000100000-0x000000000437dfff]
Apr 17 23:44:22.141880 kernel: node 0: [mem 0x000000000477e000-0x000000003ff1efff]
Apr 17 23:44:22.141912 kernel: node 0: [mem 0x000000003ffff000-0x000000003fffffff]
Apr 17 23:44:22.141929 kernel: node 0: [mem 0x0000000100000000-0x00000002bfffffff]
Apr 17 23:44:22.141946 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x00000002bfffffff]
Apr 17 23:44:22.141963 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Apr 17 23:44:22.141977 kernel: On node 0, zone DMA: 96 pages in unavailable ranges
Apr 17 23:44:22.141991 kernel: On node 0, zone DMA32: 1024 pages in unavailable ranges
Apr 17 23:44:22.142005 kernel: On node 0, zone DMA32: 224 pages in unavailable ranges
Apr 17 23:44:22.142019 kernel: ACPI: PM-Timer IO Port: 0x408
Apr 17 23:44:22.142038 kernel: ACPI: LAPIC_NMI (acpi_id[0x01] dfl dfl lint[0x1])
Apr 17 23:44:22.142053 kernel: IOAPIC[0]: apic_id 2, version 17, address 0xfec00000, GSI 0-23
Apr 17 23:44:22.142069 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Apr 17 23:44:22.142085 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Apr 17 23:44:22.142101 kernel: ACPI: SPCR: console: uart,io,0x3f8,115200
Apr 17 23:44:22.142117 kernel: smpboot: Allowing 2 CPUs, 0 hotplug CPUs
Apr 17 23:44:22.142133 kernel: [mem 0x40000000-0xffffffff] available for PCI devices
Apr 17 23:44:22.142147 kernel: Booting paravirtualized kernel on Hyper-V
Apr 17 23:44:22.142162 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Apr 17 23:44:22.142181 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:2 nr_cpu_ids:2 nr_node_ids:1
Apr 17 23:44:22.142199 kernel: percpu: Embedded 57 pages/cpu s196328 r8192 d28952 u1048576
Apr 17 23:44:22.142213 kernel: pcpu-alloc: s196328 r8192 d28952 u1048576 alloc=1*2097152
Apr 17 23:44:22.142228 kernel: pcpu-alloc: [0] 0 1
Apr 17 23:44:22.142242 kernel: Hyper-V: PV spinlocks enabled
Apr 17 23:44:22.142257 kernel: PV qspinlock hash table entries: 256 (order: 0, 4096 bytes, linear)
Apr 17 23:44:22.142273 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty1 console=ttyS0,115200n8 earlyprintk=ttyS0,115200 flatcar.first_boot=detected flatcar.oem.id=azure flatcar.autologin verity.usrhash=e69cfa144bf8cf6f0b7e7881c91c17228ba9dbcb6c99d9692bced9ddba34ee3a
Apr 17 23:44:22.142288 kernel: random: crng init done
Apr 17 23:44:22.142308 kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Apr 17 23:44:22.142324 kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Apr 17 23:44:22.142341 kernel: Fallback order for Node 0: 0
Apr 17 23:44:22.142355 kernel: Built 1 zonelists, mobility grouping on. Total pages: 2061321
Apr 17 23:44:22.142370 kernel: Policy zone: Normal
Apr 17 23:44:22.142386 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Apr 17 23:44:22.142402 kernel: software IO TLB: area num 2.
Apr 17 23:44:22.142419 kernel: Memory: 8066036K/8383228K available (12288K kernel code, 2288K rwdata, 22752K rodata, 42892K init, 2304K bss, 316932K reserved, 0K cma-reserved)
Apr 17 23:44:22.142436 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1
Apr 17 23:44:22.142475 kernel: ftrace: allocating 37996 entries in 149 pages
Apr 17 23:44:22.142492 kernel: ftrace: allocated 149 pages with 4 groups
Apr 17 23:44:22.142511 kernel: Dynamic Preempt: voluntary
Apr 17 23:44:22.142534 kernel: rcu: Preemptible hierarchical RCU implementation.
Apr 17 23:44:22.142551 kernel: rcu: RCU event tracing is enabled.
Apr 17 23:44:22.142564 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2.
Apr 17 23:44:22.142577 kernel: Trampoline variant of Tasks RCU enabled.
Apr 17 23:44:22.142589 kernel: Rude variant of Tasks RCU enabled.
Apr 17 23:44:22.142601 kernel: Tracing variant of Tasks RCU enabled.
Apr 17 23:44:22.142617 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Apr 17 23:44:22.142629 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2
Apr 17 23:44:22.142643 kernel: Using NULL legacy PIC
Apr 17 23:44:22.142657 kernel: NR_IRQS: 33024, nr_irqs: 440, preallocated irqs: 0
Apr 17 23:44:22.142672 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Apr 17 23:44:22.142685 kernel: Console: colour dummy device 80x25
Apr 17 23:44:22.142700 kernel: printk: console [tty1] enabled
Apr 17 23:44:22.142716 kernel: printk: console [ttyS0] enabled
Apr 17 23:44:22.142738 kernel: printk: bootconsole [earlyser0] disabled
Apr 17 23:44:22.142751 kernel: ACPI: Core revision 20230628
Apr 17 23:44:22.142765 kernel: Failed to register legacy timer interrupt
Apr 17 23:44:22.142778 kernel: APIC: Switch to symmetric I/O mode setup
Apr 17 23:44:22.142792 kernel: Hyper-V: enabling crash_kexec_post_notifiers
Apr 17 23:44:22.142812 kernel: Hyper-V: Using IPI hypercalls
Apr 17 23:44:22.142828 kernel: APIC: send_IPI() replaced with hv_send_ipi()
Apr 17 23:44:22.142840 kernel: APIC: send_IPI_mask() replaced with hv_send_ipi_mask()
Apr 17 23:44:22.142854 kernel: APIC: send_IPI_mask_allbutself() replaced with hv_send_ipi_mask_allbutself()
Apr 17 23:44:22.142871 kernel: APIC: send_IPI_allbutself() replaced with hv_send_ipi_allbutself()
Apr 17 23:44:22.142884 kernel: APIC: send_IPI_all() replaced with hv_send_ipi_all()
Apr 17 23:44:22.142914 kernel: APIC: send_IPI_self() replaced with hv_send_ipi_self()
Apr 17 23:44:22.142930 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 5187.81 BogoMIPS (lpj=2593905)
Apr 17 23:44:22.142946 kernel: Last level iTLB entries: 4KB 64, 2MB 8, 4MB 8
Apr 17 23:44:22.142960 kernel: Last level dTLB entries: 4KB 64, 2MB 32, 4MB 32, 1GB 4
Apr 17 23:44:22.142973 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Apr 17 23:44:22.142986 kernel: Spectre V2 : Mitigation: Retpolines
Apr 17 23:44:22.142998 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Apr 17 23:44:22.143011 kernel: RETBleed: WARNING: Spectre v2 mitigation leaves CPU vulnerable to RETBleed attacks, data leaks possible!
Apr 17 23:44:22.143028 kernel: RETBleed: Vulnerable
Apr 17 23:44:22.143041 kernel: Speculative Store Bypass: Vulnerable
Apr 17 23:44:22.143054 kernel: TAA: Vulnerable: Clear CPU buffers attempted, no microcode
Apr 17 23:44:22.143068 kernel: MMIO Stale Data: Vulnerable: Clear CPU buffers attempted, no microcode
Apr 17 23:44:22.143080 kernel: active return thunk: its_return_thunk
Apr 17 23:44:22.143094 kernel: ITS: Mitigation: Aligned branch/return thunks
Apr 17 23:44:22.143107 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Apr 17 23:44:22.143120 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Apr 17 23:44:22.143134 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Apr 17 23:44:22.143147 kernel: x86/fpu: Supporting XSAVE feature 0x020: 'AVX-512 opmask'
Apr 17 23:44:22.143163 kernel: x86/fpu: Supporting XSAVE feature 0x040: 'AVX-512 Hi256'
Apr 17 23:44:22.143176 kernel: x86/fpu: Supporting XSAVE feature 0x080: 'AVX-512 ZMM_Hi256'
Apr 17 23:44:22.143190 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
Apr 17 23:44:22.143203 kernel: x86/fpu: xstate_offset[5]: 832, xstate_sizes[5]: 64
Apr 17 23:44:22.143217 kernel: x86/fpu: xstate_offset[6]: 896, xstate_sizes[6]: 512
Apr 17 23:44:22.143230 kernel: x86/fpu: xstate_offset[7]: 1408, xstate_sizes[7]: 1024
Apr 17 23:44:22.143243 kernel: x86/fpu: Enabled xstate features 0xe7, context size is 2432 bytes, using 'compacted' format.
Apr 17 23:44:22.143255 kernel: Freeing SMP alternatives memory: 32K
Apr 17 23:44:22.143267 kernel: pid_max: default: 32768 minimum: 301
Apr 17 23:44:22.143281 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,integrity
Apr 17 23:44:22.143296 kernel: landlock: Up and running.
Apr 17 23:44:22.143310 kernel: SELinux: Initializing.
Apr 17 23:44:22.143327 kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Apr 17 23:44:22.143341 kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Apr 17 23:44:22.143356 kernel: smpboot: CPU0: Intel(R) Xeon(R) Platinum 8272CL CPU @ 2.60GHz (family: 0x6, model: 0x55, stepping: 0x7)
Apr 17 23:44:22.143371 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Apr 17 23:44:22.143386 kernel: RCU Tasks Rude: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Apr 17 23:44:22.143400 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Apr 17 23:44:22.143416 kernel: Performance Events: unsupported p6 CPU model 85 no PMU driver, software events only.
Apr 17 23:44:22.143428 kernel: signal: max sigframe size: 3632
Apr 17 23:44:22.143443 kernel: rcu: Hierarchical SRCU implementation.
Apr 17 23:44:22.143460 kernel: rcu: Max phase no-delay instances is 400.
Apr 17 23:44:22.143475 kernel: NMI watchdog: Perf NMI watchdog permanently disabled
Apr 17 23:44:22.143488 kernel: smp: Bringing up secondary CPUs ...
Apr 17 23:44:22.143502 kernel: smpboot: x86: Booting SMP configuration:
Apr 17 23:44:22.143518 kernel: .... node #0, CPUs: #1
Apr 17 23:44:22.143531 kernel: TAA CPU bug present and SMT on, data leak possible. See https://www.kernel.org/doc/html/latest/admin-guide/hw-vuln/tsx_async_abort.html for more details.
Apr 17 23:44:22.143546 kernel: MMIO Stale Data CPU bug present and SMT on, data leak possible. See https://www.kernel.org/doc/html/latest/admin-guide/hw-vuln/processor_mmio_stale_data.html for more details.
Apr 17 23:44:22.143560 kernel: smp: Brought up 1 node, 2 CPUs
Apr 17 23:44:22.143573 kernel: smpboot: Max logical packages: 1
Apr 17 23:44:22.143590 kernel: smpboot: Total of 2 processors activated (10375.62 BogoMIPS)
Apr 17 23:44:22.143604 kernel: devtmpfs: initialized
Apr 17 23:44:22.143618 kernel: x86/mm: Memory block size: 128MB
Apr 17 23:44:22.143633 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x3fffb000-0x3fffefff] (16384 bytes)
Apr 17 23:44:22.143648 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Apr 17 23:44:22.143664 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear)
Apr 17 23:44:22.143677 kernel: pinctrl core: initialized pinctrl subsystem
Apr 17 23:44:22.143691 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Apr 17 23:44:22.143706 kernel: audit: initializing netlink subsys (disabled)
Apr 17 23:44:22.143725 kernel: audit: type=2000 audit(1776469460.029:1): state=initialized audit_enabled=0 res=1
Apr 17 23:44:22.143740 kernel: thermal_sys: Registered thermal governor 'step_wise'
Apr 17 23:44:22.143756 kernel: thermal_sys: Registered thermal governor 'user_space'
Apr 17 23:44:22.143771 kernel: cpuidle: using governor menu
Apr 17 23:44:22.143784 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Apr 17 23:44:22.143797 kernel: dca service started, version 1.12.1
Apr 17 23:44:22.143811 kernel: e820: reserve RAM buffer [mem 0x0437e000-0x07ffffff]
Apr 17 23:44:22.143824 kernel: e820: reserve RAM buffer [mem 0x3ff1f000-0x3fffffff]
Apr 17 23:44:22.143839 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Apr 17 23:44:22.143857 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Apr 17 23:44:22.143873 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Apr 17 23:44:22.143886 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Apr 17 23:44:22.143919 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Apr 17 23:44:22.143934 kernel: ACPI: Added _OSI(Module Device)
Apr 17 23:44:22.143950 kernel: ACPI: Added _OSI(Processor Device)
Apr 17 23:44:22.143966 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Apr 17 23:44:22.143984 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Apr 17 23:44:22.144004 kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC
Apr 17 23:44:22.144018 kernel: ACPI: Interpreter enabled
Apr 17 23:44:22.144031 kernel: ACPI: PM: (supports S0 S5)
Apr 17 23:44:22.144046 kernel: ACPI: Using IOAPIC for interrupt routing
Apr 17 23:44:22.144069 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Apr 17 23:44:22.144082 kernel: PCI: Ignoring E820 reservations for host bridge windows
Apr 17 23:44:22.144096 kernel: ACPI: Enabled 1 GPEs in block 00 to 0F
Apr 17 23:44:22.144108 kernel: iommu: Default domain type: Translated
Apr 17 23:44:22.144124 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Apr 17 23:44:22.144138 kernel: efivars: Registered efivars operations
Apr 17 23:44:22.144157 kernel: PCI: Using ACPI for IRQ routing
Apr 17 23:44:22.144172 kernel: PCI: System does not support PCI
Apr 17 23:44:22.144187 kernel: vgaarb: loaded
Apr 17 23:44:22.144202 kernel: clocksource: Switched to clocksource hyperv_clocksource_tsc_page
Apr 17 23:44:22.144217 kernel: VFS: Disk quotas dquot_6.6.0
Apr 17 23:44:22.144232 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Apr 17 23:44:22.144247 kernel: pnp: PnP ACPI init
Apr 17 23:44:22.144262 kernel: pnp: PnP ACPI: found 3 devices
Apr 17 23:44:22.144277 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Apr 17 23:44:22.144294 kernel: NET: Registered PF_INET protocol family
Apr 17 23:44:22.144307 kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Apr 17 23:44:22.144321 kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear)
Apr 17 23:44:22.144335 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Apr 17 23:44:22.144350 kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear)
Apr 17 23:44:22.144366 kernel: TCP bind hash table entries: 65536 (order: 9, 2097152 bytes, linear)
Apr 17 23:44:22.144381 kernel: TCP: Hash tables configured (established 65536 bind 65536)
Apr 17 23:44:22.144396 kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear)
Apr 17 23:44:22.144411 kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear)
Apr 17 23:44:22.144426 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Apr 17 23:44:22.144439 kernel: NET: Registered PF_XDP protocol family
Apr 17 23:44:22.144452 kernel: PCI: CLS 0 bytes, default 64
Apr 17 23:44:22.144467 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Apr 17 23:44:22.144479 kernel: software IO TLB: mapped [mem 0x000000003a878000-0x000000003e878000] (64MB)
Apr 17 23:44:22.144493 kernel: RAPL PMU: API unit is 2^-32 Joules, 0 fixed counters, 10737418240 ms ovfl timer
Apr 17 23:44:22.144508 kernel: Initialise system trusted keyrings
Apr 17 23:44:22.144522 kernel: workingset: timestamp_bits=39 max_order=21 bucket_order=0
Apr 17 23:44:22.144539 kernel: Key type asymmetric registered
Apr 17 23:44:22.144553 kernel: Asymmetric key parser 'x509' registered
Apr 17 23:44:22.144567 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 251)
Apr 17 23:44:22.144581 kernel: io scheduler mq-deadline registered
Apr 17 23:44:22.144594 kernel: io scheduler kyber registered
Apr 17 23:44:22.144608 kernel: io scheduler bfq registered
Apr 17 23:44:22.144623 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00
Apr 17 23:44:22.144637 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Apr 17 23:44:22.144652 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Apr 17 23:44:22.144667 kernel: 00:01: ttyS1 at I/O 0x2f8 (irq = 3, base_baud = 115200) is a 16550A
Apr 17 23:44:22.144686 kernel: i8042: PNP: No PS/2 controller found.
Apr 17 23:44:22.144887 kernel: rtc_cmos 00:02: registered as rtc0
Apr 17 23:44:22.150099 kernel: rtc_cmos 00:02: setting system clock to 2026-04-17T23:44:21 UTC (1776469461)
Apr 17 23:44:22.150198 kernel: rtc_cmos 00:02: alarms up to one month, 114 bytes nvram
Apr 17 23:44:22.150213 kernel: intel_pstate: CPU model not supported
Apr 17 23:44:22.150223 kernel: efifb: probing for efifb
Apr 17 23:44:22.150232 kernel: efifb: framebuffer at 0x40000000, using 3072k, total 3072k
Apr 17 23:44:22.150248 kernel: efifb: mode is 1024x768x32, linelength=4096, pages=1
Apr 17 23:44:22.150257 kernel: efifb: scrolling: redraw
Apr 17 23:44:22.150265 kernel: efifb: Truecolor: size=8:8:8:8, shift=24:16:8:0
Apr 17 23:44:22.150278 kernel: Console: switching to colour frame buffer device 128x48
Apr 17 23:44:22.150286 kernel: fb0: EFI VGA frame buffer device
Apr 17 23:44:22.150298 kernel: pstore: Using crash dump compression: deflate
Apr 17 23:44:22.150307 kernel: pstore: Registered efi_pstore as persistent store backend
Apr 17 23:44:22.150315 kernel: NET: Registered PF_INET6 protocol family
Apr 17 23:44:22.150326 kernel: Segment Routing with IPv6
Apr 17 23:44:22.150339 kernel: In-situ OAM (IOAM) with IPv6
Apr 17 23:44:22.150347 kernel: NET: Registered PF_PACKET protocol family
Apr 17 23:44:22.150360 kernel: Key type dns_resolver registered
Apr 17 23:44:22.150368 kernel: IPI shorthand broadcast: enabled
Apr 17 23:44:22.150376 kernel: sched_clock: Marking stable (897003300, 52245900)->(1185621200, -236372000)
Apr 17 23:44:22.150388 kernel: registered taskstats version 1
Apr 17 23:44:22.150397 kernel: Loading compiled-in X.509 certificates
Apr 17 23:44:22.150405 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.6.127-flatcar: 39e9969c7f49062f0fc1d1fb72e8f874436eb94f'
Apr 17 23:44:22.150417 kernel: Key type .fscrypt registered
Apr 17 23:44:22.150428 kernel: Key type fscrypt-provisioning registered
Apr 17 23:44:22.150440 kernel: ima: No TPM chip found, activating TPM-bypass!
Apr 17 23:44:22.150448 kernel: ima: Allocated hash algorithm: sha1
Apr 17 23:44:22.150460 kernel: ima: No architecture policies found
Apr 17 23:44:22.150469 kernel: clk: Disabling unused clocks
Apr 17 23:44:22.150477 kernel: Freeing unused kernel image (initmem) memory: 42892K
Apr 17 23:44:22.150488 kernel: Write protecting the kernel read-only data: 36864k
Apr 17 23:44:22.150498 kernel: Freeing unused kernel image (rodata/data gap) memory: 1824K
Apr 17 23:44:22.150506 kernel: Run /init as init process
Apr 17 23:44:22.150521 kernel: with arguments:
Apr 17 23:44:22.150529 kernel: /init
Apr 17 23:44:22.150537 kernel: with environment:
Apr 17 23:44:22.150549 kernel: HOME=/
Apr 17 23:44:22.150557 kernel: TERM=linux
Apr 17 23:44:22.150568 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified)
Apr 17 23:44:22.150583 systemd[1]: Detected virtualization microsoft.
Apr 17 23:44:22.150591 systemd[1]: Detected architecture x86-64.
Apr 17 23:44:22.150606 systemd[1]: Running in initrd.
Apr 17 23:44:22.150615 systemd[1]: No hostname configured, using default hostname.
Apr 17 23:44:22.150623 systemd[1]: Hostname set to .
Apr 17 23:44:22.150637 systemd[1]: Initializing machine ID from random generator.
Apr 17 23:44:22.150645 systemd[1]: Queued start job for default target initrd.target.
Apr 17 23:44:22.150654 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Apr 17 23:44:22.150667 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Apr 17 23:44:22.150676 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
Apr 17 23:44:22.150691 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Apr 17 23:44:22.150700 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
Apr 17 23:44:22.150709 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
Apr 17 23:44:22.150724 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132...
Apr 17 23:44:22.150733 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr...
Apr 17 23:44:22.150745 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Apr 17 23:44:22.150754 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Apr 17 23:44:22.150766 systemd[1]: Reached target paths.target - Path Units.
Apr 17 23:44:22.150776 systemd[1]: Reached target slices.target - Slice Units.
Apr 17 23:44:22.150787 systemd[1]: Reached target swap.target - Swaps.
Apr 17 23:44:22.150796 systemd[1]: Reached target timers.target - Timer Units.
Apr 17 23:44:22.150808 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
Apr 17 23:44:22.150817 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Apr 17 23:44:22.150826 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
Apr 17 23:44:22.150837 systemd[1]: Listening on systemd-journald.socket - Journal Socket.
Apr 17 23:44:22.150848 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Apr 17 23:44:22.150859 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Apr 17 23:44:22.150867 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Apr 17 23:44:22.150876 systemd[1]: Reached target sockets.target - Socket Units.
Apr 17 23:44:22.150885 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
Apr 17 23:44:22.150902 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Apr 17 23:44:22.150910 systemd[1]: Finished network-cleanup.service - Network Cleanup.
Apr 17 23:44:22.150919 systemd[1]: Starting systemd-fsck-usr.service...
Apr 17 23:44:22.150928 systemd[1]: Starting systemd-journald.service - Journal Service...
Apr 17 23:44:22.150939 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Apr 17 23:44:22.150969 systemd-journald[177]: Collecting audit messages is disabled.
Apr 17 23:44:22.150994 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Apr 17 23:44:22.151005 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup.
Apr 17 23:44:22.151020 systemd-journald[177]: Journal started
Apr 17 23:44:22.151043 systemd-journald[177]: Runtime Journal (/run/log/journal/173f793e28074cde90a83b4d3406971a) is 8.0M, max 158.7M, 150.7M free.
Apr 17 23:44:22.163336 systemd[1]: Started systemd-journald.service - Journal Service.
Apr 17 23:44:22.168406 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Apr 17 23:44:22.176854 systemd[1]: Finished systemd-fsck-usr.service.
Apr 17 23:44:22.179595 systemd-modules-load[178]: Inserted module 'overlay'
Apr 17 23:44:22.185790 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Apr 17 23:44:22.204138 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Apr 17 23:44:22.213524 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Apr 17 23:44:22.217975 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Apr 17 23:44:22.242733 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Apr 17 23:44:22.248345 kernel: Bridge firewalling registered
Apr 17 23:44:22.248156 systemd-modules-load[178]: Inserted module 'br_netfilter'
Apr 17 23:44:22.248814 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Apr 17 23:44:22.251524 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Apr 17 23:44:22.266784 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Apr 17 23:44:22.274430 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Apr 17 23:44:22.283160 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Apr 17 23:44:22.287164 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Apr 17 23:44:22.302118 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook...
Apr 17 23:44:22.313115 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Apr 17 23:44:22.319585 dracut-cmdline[207]: dracut-dracut-053
Apr 17 23:44:22.319585 dracut-cmdline[207]: Using kernel command line parameters: rd.driver.pre=btrfs rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty1 console=ttyS0,115200n8 earlyprintk=ttyS0,115200 flatcar.first_boot=detected flatcar.oem.id=azure flatcar.autologin verity.usrhash=e69cfa144bf8cf6f0b7e7881c91c17228ba9dbcb6c99d9692bced9ddba34ee3a
Apr 17 23:44:22.354415 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Apr 17 23:44:22.369051 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Apr 17 23:44:22.413915 kernel: SCSI subsystem initialized
Apr 17 23:44:22.416037 systemd-resolved[265]: Positive Trust Anchors:
Apr 17 23:44:22.416057 systemd-resolved[265]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Apr 17 23:44:22.416113 systemd-resolved[265]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Apr 17 23:44:22.454241 kernel: Loading iSCSI transport class v2.0-870.
Apr 17 23:44:22.445380 systemd-resolved[265]: Defaulting to hostname 'linux'.
Apr 17 23:44:22.446733 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Apr 17 23:44:22.450611 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Apr 17 23:44:22.466911 kernel: iscsi: registered transport (tcp)
Apr 17 23:44:22.489621 kernel: iscsi: registered transport (qla4xxx)
Apr 17 23:44:22.489699 kernel: QLogic iSCSI HBA Driver
Apr 17 23:44:22.526468 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
Apr 17 23:44:22.542073 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
Apr 17 23:44:22.571806 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Apr 17 23:44:22.571917 kernel: device-mapper: uevent: version 1.0.3
Apr 17 23:44:22.575820 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@redhat.com
Apr 17 23:44:22.617922 kernel: raid6: avx512x4 gen() 18208 MB/s
Apr 17 23:44:22.636901 kernel: raid6: avx512x2 gen() 18174 MB/s
Apr 17 23:44:22.655900 kernel: raid6: avx512x1 gen() 18260 MB/s
Apr 17 23:44:22.675909 kernel: raid6: avx2x4 gen() 18187 MB/s
Apr 17 23:44:22.694901 kernel: raid6: avx2x2 gen() 18209 MB/s
Apr 17 23:44:22.715602 kernel: raid6: avx2x1 gen() 13690 MB/s
Apr 17 23:44:22.715632 kernel: raid6: using algorithm avx512x1 gen() 18260 MB/s
Apr 17 23:44:22.738059 kernel: raid6: .... xor() 26904 MB/s, rmw enabled
Apr 17 23:44:22.738089 kernel: raid6: using avx512x2 recovery algorithm
Apr 17 23:44:22.759918 kernel: xor: automatically using best checksumming function avx
Apr 17 23:44:22.907919 kernel: Btrfs loaded, zoned=no, fsverity=no
Apr 17 23:44:22.917305 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
Apr 17 23:44:22.928155 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Apr 17 23:44:22.946938 systemd-udevd[396]: Using default interface naming scheme 'v255'.
Apr 17 23:44:22.951553 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Apr 17 23:44:22.966100 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
Apr 17 23:44:22.983469 dracut-pre-trigger[407]: rd.md=0: removing MD RAID activation Apr 17 23:44:23.011059 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Apr 17 23:44:23.026091 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Apr 17 23:44:23.070049 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Apr 17 23:44:23.082126 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Apr 17 23:44:23.114156 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Apr 17 23:44:23.122964 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Apr 17 23:44:23.135093 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Apr 17 23:44:23.142218 systemd[1]: Reached target remote-fs.target - Remote File Systems. Apr 17 23:44:23.153941 kernel: cryptd: max_cpu_qlen set to 1000 Apr 17 23:44:23.156316 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Apr 17 23:44:23.185388 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Apr 17 23:44:23.202910 kernel: hv_vmbus: Vmbus version:5.2 Apr 17 23:44:23.202971 kernel: AVX2 version of gcm_enc/dec engaged. Apr 17 23:44:23.207158 kernel: AES CTR mode by8 optimization enabled Apr 17 23:44:23.214229 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Apr 17 23:44:23.214364 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Apr 17 23:44:23.226169 systemd[1]: Stopping dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Apr 17 23:44:23.246160 kernel: pps_core: LinuxPPS API ver. 1 registered Apr 17 23:44:23.246188 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti Apr 17 23:44:23.238263 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. 
Apr 17 23:44:23.238456 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Apr 17 23:44:23.246117 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Apr 17 23:44:23.265241 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Apr 17 23:44:23.291630 kernel: PTP clock support registered Apr 17 23:44:23.296158 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Apr 17 23:44:23.305491 kernel: hv_vmbus: registering driver hyperv_keyboard Apr 17 23:44:23.297056 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Apr 17 23:44:23.314989 kernel: input: AT Translated Set 2 keyboard as /devices/LNXSYSTM:00/LNXSYBUS:00/ACPI0004:00/MSFT1000:00/d34b2567-b9b6-42b9-8778-0a4ec0b955bf/serio0/input/input0 Apr 17 23:44:23.320142 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Apr 17 23:44:23.332905 kernel: hid: raw HID events driver (C) Jiri Kosina Apr 17 23:44:23.339049 kernel: hv_utils: Registering HyperV Utility Driver Apr 17 23:44:23.339080 kernel: hv_vmbus: registering driver hv_utils Apr 17 23:44:23.342600 kernel: hv_vmbus: registering driver hv_storvsc Apr 17 23:44:23.342629 kernel: hv_vmbus: registering driver hv_netvsc Apr 17 23:44:23.355916 kernel: scsi host0: storvsc_host_t Apr 17 23:44:23.355969 kernel: scsi host1: storvsc_host_t Apr 17 23:44:23.361886 kernel: scsi 0:0:0:0: Direct-Access Msft Virtual Disk 1.0 PQ: 0 ANSI: 5 Apr 17 23:44:23.364455 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. 
Apr 17 23:44:23.375011 kernel: scsi 0:0:0:2: CD-ROM Msft Virtual DVD-ROM 1.0 PQ: 0 ANSI: 5 Apr 17 23:44:23.382020 kernel: hv_utils: Shutdown IC version 3.2 Apr 17 23:44:23.382060 kernel: hv_utils: TimeSync IC version 4.0 Apr 17 23:44:23.384396 kernel: hv_utils: Heartbeat IC version 3.0 Apr 17 23:44:23.384424 kernel: hv_vmbus: registering driver hid_hyperv Apr 17 23:44:23.972977 systemd-resolved[265]: Clock change detected. Flushing caches. Apr 17 23:44:23.980527 kernel: input: Microsoft Vmbus HID-compliant Mouse as /devices/0006:045E:0621.0001/input/input1 Apr 17 23:44:23.980718 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Apr 17 23:44:23.991582 kernel: hid-hyperv 0006:045E:0621.0001: input: VIRTUAL HID v0.01 Mouse [Microsoft Vmbus HID-compliant Mouse] on Apr 17 23:44:24.013016 kernel: sr 0:0:0:2: [sr0] scsi-1 drive Apr 17 23:44:24.013216 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20 Apr 17 23:44:24.016496 kernel: sr 0:0:0:2: Attached scsi CD-ROM sr0 Apr 17 23:44:24.033419 kernel: sd 0:0:0:0: [sda] 63737856 512-byte logical blocks: (32.6 GB/30.4 GiB) Apr 17 23:44:24.033746 kernel: sd 0:0:0:0: [sda] 4096-byte physical blocks Apr 17 23:44:24.031370 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. 
Apr 17 23:44:24.043047 kernel: sd 0:0:0:0: [sda] Write Protect is off Apr 17 23:44:24.043220 kernel: sd 0:0:0:0: [sda] Mode Sense: 0f 00 10 00 Apr 17 23:44:24.043327 kernel: sd 0:0:0:0: [sda] Write cache: disabled, read cache: enabled, supports DPO and FUA Apr 17 23:44:24.052052 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Apr 17 23:44:24.052093 kernel: sd 0:0:0:0: [sda] Attached SCSI disk Apr 17 23:44:24.062503 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#210 cmd 0x85 status: scsi 0x2 srb 0x6 hv 0xc0000001 Apr 17 23:44:24.062670 kernel: hv_netvsc 000d3ada-bb1d-000d-3ada-bb1d000d3ada eth0: VF slot 1 added Apr 17 23:44:24.076743 kernel: hv_vmbus: registering driver hv_pci Apr 17 23:44:24.076797 kernel: hv_pci 158243a4-d954-4bec-bd98-a32beb36bf61: PCI VMBus probing: Using version 0x10004 Apr 17 23:44:24.086493 kernel: hv_pci 158243a4-d954-4bec-bd98-a32beb36bf61: PCI host bridge to bus d954:00 Apr 17 23:44:24.095347 kernel: pci_bus d954:00: root bus resource [mem 0xfe0000000-0xfe00fffff window] Apr 17 23:44:24.095605 kernel: pci_bus d954:00: No busn resource found for root bus, will use [bus 00-ff] Apr 17 23:44:24.108545 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#215 cmd 0x85 status: scsi 0x2 srb 0x6 hv 0xc0000001 Apr 17 23:44:24.108720 kernel: pci d954:00:02.0: [15b3:1016] type 00 class 0x020000 Apr 17 23:44:24.113526 kernel: pci d954:00:02.0: reg 0x10: [mem 0xfe0000000-0xfe00fffff 64bit pref] Apr 17 23:44:24.118565 kernel: pci d954:00:02.0: enabling Extended Tags Apr 17 23:44:24.131515 kernel: pci d954:00:02.0: 0.000 Gb/s available PCIe bandwidth, limited by Unknown x0 link at d954:00:02.0 (capable of 63.008 Gb/s with 8.0 GT/s PCIe x8 link) Apr 17 23:44:24.138453 kernel: pci_bus d954:00: busn_res: [bus 00-ff] end is updated to 00 Apr 17 23:44:24.138753 kernel: pci d954:00:02.0: BAR 0: assigned [mem 0xfe0000000-0xfe00fffff 64bit pref] Apr 17 23:44:24.304039 kernel: mlx5_core d954:00:02.0: enabling device (0000 -> 0002) Apr 17 
23:44:24.308502 kernel: mlx5_core d954:00:02.0: firmware version: 14.30.5026 Apr 17 23:44:24.518457 kernel: hv_netvsc 000d3ada-bb1d-000d-3ada-bb1d000d3ada eth0: VF registering: eth1 Apr 17 23:44:24.518826 kernel: mlx5_core d954:00:02.0 eth1: joined to eth0 Apr 17 23:44:24.524542 kernel: mlx5_core d954:00:02.0: MLX5E: StrdRq(0) RqSz(1024) StrdSz(256) RxCqeCmprss(0 basic) Apr 17 23:44:24.534503 kernel: mlx5_core d954:00:02.0 enP55636s1: renamed from eth1 Apr 17 23:44:24.565504 kernel: BTRFS: device label OEM devid 1 transid 10 /dev/sda6 scanned by (udev-worker) (441) Apr 17 23:44:24.590280 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_Disk OEM. Apr 17 23:44:24.602777 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - Virtual_Disk EFI-SYSTEM. Apr 17 23:44:24.647742 kernel: BTRFS: device fsid 81b0bf8a-1550-4880-b72f-76fa51dbb6c0 devid 1 transid 32 /dev/sda3 scanned by (udev-worker) (442) Apr 17 23:44:24.661276 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - Virtual_Disk ROOT. Apr 17 23:44:24.678048 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - Virtual_Disk USR-A. Apr 17 23:44:24.681957 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - Virtual_Disk USR-A. Apr 17 23:44:24.696665 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Apr 17 23:44:24.717539 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Apr 17 23:44:24.725494 kernel: GPT:disk_guids don't match. Apr 17 23:44:24.725537 kernel: GPT: Use GNU Parted to correct GPT errors. Apr 17 23:44:24.725551 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Apr 17 23:44:24.737498 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Apr 17 23:44:25.738501 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Apr 17 23:44:25.739046 disk-uuid[608]: The operation has completed successfully. Apr 17 23:44:25.814659 systemd[1]: disk-uuid.service: Deactivated successfully. 
Apr 17 23:44:25.814777 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Apr 17 23:44:25.842666 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Apr 17 23:44:25.850130 sh[721]: Success Apr 17 23:44:25.879518 kernel: device-mapper: verity: sha256 using implementation "sha256-avx2" Apr 17 23:44:26.138062 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Apr 17 23:44:26.151627 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Apr 17 23:44:26.157237 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. Apr 17 23:44:26.176226 kernel: BTRFS info (device dm-0): first mount of filesystem 81b0bf8a-1550-4880-b72f-76fa51dbb6c0 Apr 17 23:44:26.176299 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Apr 17 23:44:26.180345 kernel: BTRFS warning (device dm-0): 'nologreplay' is deprecated, use 'rescue=nologreplay' instead Apr 17 23:44:26.184291 kernel: BTRFS info (device dm-0): disabling log replay at mount time Apr 17 23:44:26.187631 kernel: BTRFS info (device dm-0): using free space tree Apr 17 23:44:26.436880 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Apr 17 23:44:26.437807 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Apr 17 23:44:26.450741 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Apr 17 23:44:26.459437 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... 
Apr 17 23:44:26.487446 kernel: BTRFS info (device sda6): first mount of filesystem a5a0fe13-59ac-4c21-ab23-7fd1bfa02f60 Apr 17 23:44:26.487524 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Apr 17 23:44:26.487548 kernel: BTRFS info (device sda6): using free space tree Apr 17 23:44:26.521507 kernel: BTRFS info (device sda6): auto enabling async discard Apr 17 23:44:26.539149 kernel: BTRFS info (device sda6): last unmount of filesystem a5a0fe13-59ac-4c21-ab23-7fd1bfa02f60 Apr 17 23:44:26.538759 systemd[1]: mnt-oem.mount: Deactivated successfully. Apr 17 23:44:26.552904 systemd[1]: Finished ignition-setup.service - Ignition (setup). Apr 17 23:44:26.561663 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Apr 17 23:44:26.569498 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Apr 17 23:44:26.577639 systemd[1]: Starting systemd-networkd.service - Network Configuration... Apr 17 23:44:26.601277 systemd-networkd[905]: lo: Link UP Apr 17 23:44:26.601289 systemd-networkd[905]: lo: Gained carrier Apr 17 23:44:26.603503 systemd-networkd[905]: Enumeration completed Apr 17 23:44:26.603613 systemd[1]: Started systemd-networkd.service - Network Configuration. Apr 17 23:44:26.604161 systemd[1]: Reached target network.target - Network. Apr 17 23:44:26.606028 systemd-networkd[905]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Apr 17 23:44:26.606033 systemd-networkd[905]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. 
Apr 17 23:44:26.681502 kernel: mlx5_core d954:00:02.0 enP55636s1: Link up Apr 17 23:44:26.712505 kernel: hv_netvsc 000d3ada-bb1d-000d-3ada-bb1d000d3ada eth0: Data path switched to VF: enP55636s1 Apr 17 23:44:26.712741 systemd-networkd[905]: enP55636s1: Link UP Apr 17 23:44:26.712868 systemd-networkd[905]: eth0: Link UP Apr 17 23:44:26.713034 systemd-networkd[905]: eth0: Gained carrier Apr 17 23:44:26.713047 systemd-networkd[905]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Apr 17 23:44:26.717706 systemd-networkd[905]: enP55636s1: Gained carrier Apr 17 23:44:26.757904 systemd-networkd[905]: eth0: DHCPv4 address 10.0.0.10/24, gateway 10.0.0.1 acquired from 168.63.129.16 Apr 17 23:44:27.569549 ignition[903]: Ignition 2.19.0 Apr 17 23:44:27.569562 ignition[903]: Stage: fetch-offline Apr 17 23:44:27.569619 ignition[903]: no configs at "/usr/lib/ignition/base.d" Apr 17 23:44:27.573871 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Apr 17 23:44:27.569631 ignition[903]: no config dir at "/usr/lib/ignition/base.platform.d/azure" Apr 17 23:44:27.569770 ignition[903]: parsed url from cmdline: "" Apr 17 23:44:27.569775 ignition[903]: no config URL provided Apr 17 23:44:27.569782 ignition[903]: reading system config file "/usr/lib/ignition/user.ign" Apr 17 23:44:27.569794 ignition[903]: no config at "/usr/lib/ignition/user.ign" Apr 17 23:44:27.569800 ignition[903]: failed to fetch config: resource requires networking Apr 17 23:44:27.571882 ignition[903]: Ignition finished successfully Apr 17 23:44:27.600770 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... 
Apr 17 23:44:27.618357 ignition[913]: Ignition 2.19.0 Apr 17 23:44:27.618371 ignition[913]: Stage: fetch Apr 17 23:44:27.618626 ignition[913]: no configs at "/usr/lib/ignition/base.d" Apr 17 23:44:27.618640 ignition[913]: no config dir at "/usr/lib/ignition/base.platform.d/azure" Apr 17 23:44:27.620826 ignition[913]: parsed url from cmdline: "" Apr 17 23:44:27.620832 ignition[913]: no config URL provided Apr 17 23:44:27.620840 ignition[913]: reading system config file "/usr/lib/ignition/user.ign" Apr 17 23:44:27.620853 ignition[913]: no config at "/usr/lib/ignition/user.ign" Apr 17 23:44:27.620880 ignition[913]: GET http://169.254.169.254/metadata/instance/compute/userData?api-version=2021-01-01&format=text: attempt #1 Apr 17 23:44:27.686689 ignition[913]: GET result: OK Apr 17 23:44:27.686781 ignition[913]: config has been read from IMDS userdata Apr 17 23:44:27.686814 ignition[913]: parsing config with SHA512: 103ab94cbf6e3c53b6af9eda365f9c901f8901394d3dca1ae5a5a7ff1bf7aeb727897437c1fb92c5664f1c06d3dd11a75077c18da4ffc980d15d233748fdc04f Apr 17 23:44:27.691001 unknown[913]: fetched base config from "system" Apr 17 23:44:27.691444 ignition[913]: fetch: fetch complete Apr 17 23:44:27.691013 unknown[913]: fetched base config from "system" Apr 17 23:44:27.691449 ignition[913]: fetch: fetch passed Apr 17 23:44:27.691018 unknown[913]: fetched user config from "azure" Apr 17 23:44:27.691517 ignition[913]: Ignition finished successfully Apr 17 23:44:27.706659 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). Apr 17 23:44:27.716661 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Apr 17 23:44:27.733670 ignition[919]: Ignition 2.19.0 Apr 17 23:44:27.733684 ignition[919]: Stage: kargs Apr 17 23:44:27.733916 ignition[919]: no configs at "/usr/lib/ignition/base.d" Apr 17 23:44:27.737726 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). 
Apr 17 23:44:27.733929 ignition[919]: no config dir at "/usr/lib/ignition/base.platform.d/azure" Apr 17 23:44:27.735282 ignition[919]: kargs: kargs passed Apr 17 23:44:27.751224 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Apr 17 23:44:27.735335 ignition[919]: Ignition finished successfully Apr 17 23:44:27.766626 ignition[925]: Ignition 2.19.0 Apr 17 23:44:27.766638 ignition[925]: Stage: disks Apr 17 23:44:27.766850 ignition[925]: no configs at "/usr/lib/ignition/base.d" Apr 17 23:44:27.769340 systemd[1]: Finished ignition-disks.service - Ignition (disks). Apr 17 23:44:27.766863 ignition[925]: no config dir at "/usr/lib/ignition/base.platform.d/azure" Apr 17 23:44:27.771868 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Apr 17 23:44:27.767820 ignition[925]: disks: disks passed Apr 17 23:44:27.772513 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Apr 17 23:44:27.767862 ignition[925]: Ignition finished successfully Apr 17 23:44:27.772974 systemd[1]: Reached target local-fs.target - Local File Systems. Apr 17 23:44:27.773458 systemd[1]: Reached target sysinit.target - System Initialization. Apr 17 23:44:27.774453 systemd[1]: Reached target basic.target - Basic System. Apr 17 23:44:27.799713 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Apr 17 23:44:27.896784 systemd-fsck[933]: ROOT: clean, 14/7326000 files, 477710/7359488 blocks Apr 17 23:44:27.904158 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Apr 17 23:44:27.915668 systemd[1]: Mounting sysroot.mount - /sysroot... Apr 17 23:44:28.009502 kernel: EXT4-fs (sda9): mounted filesystem d3c199f8-8065-4f33-a75b-da2f09d4fc39 r/w with ordered data mode. Quota mode: none. Apr 17 23:44:28.010372 systemd[1]: Mounted sysroot.mount - /sysroot. Apr 17 23:44:28.013672 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. 
Apr 17 23:44:28.057570 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Apr 17 23:44:28.080856 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 scanned by mount (944) Apr 17 23:44:28.080932 kernel: BTRFS info (device sda6): first mount of filesystem a5a0fe13-59ac-4c21-ab23-7fd1bfa02f60 Apr 17 23:44:28.080586 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Apr 17 23:44:28.089338 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Apr 17 23:44:28.089367 kernel: BTRFS info (device sda6): using free space tree Apr 17 23:44:28.098661 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent... Apr 17 23:44:28.109109 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Apr 17 23:44:28.116007 kernel: BTRFS info (device sda6): auto enabling async discard Apr 17 23:44:28.109158 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Apr 17 23:44:28.124404 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Apr 17 23:44:28.130158 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Apr 17 23:44:28.144626 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... 
Apr 17 23:44:28.371758 systemd-networkd[905]: eth0: Gained IPv6LL Apr 17 23:44:28.678791 coreos-metadata[948]: Apr 17 23:44:28.678 INFO Fetching http://168.63.129.16/?comp=versions: Attempt #1 Apr 17 23:44:28.685293 coreos-metadata[948]: Apr 17 23:44:28.685 INFO Fetch successful Apr 17 23:44:28.688606 coreos-metadata[948]: Apr 17 23:44:28.685 INFO Fetching http://169.254.169.254/metadata/instance/compute/name?api-version=2017-08-01&format=text: Attempt #1 Apr 17 23:44:28.705494 coreos-metadata[948]: Apr 17 23:44:28.705 INFO Fetch successful Apr 17 23:44:28.720546 coreos-metadata[948]: Apr 17 23:44:28.720 INFO wrote hostname ci-4081.3.6-n-7b570e9a3c to /sysroot/etc/hostname Apr 17 23:44:28.730710 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. Apr 17 23:44:29.141612 initrd-setup-root[973]: cut: /sysroot/etc/passwd: No such file or directory Apr 17 23:44:29.165200 initrd-setup-root[980]: cut: /sysroot/etc/group: No such file or directory Apr 17 23:44:29.170718 initrd-setup-root[987]: cut: /sysroot/etc/shadow: No such file or directory Apr 17 23:44:29.178287 initrd-setup-root[994]: cut: /sysroot/etc/gshadow: No such file or directory Apr 17 23:44:30.130899 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Apr 17 23:44:30.140717 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Apr 17 23:44:30.149667 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Apr 17 23:44:30.156837 kernel: BTRFS info (device sda6): last unmount of filesystem a5a0fe13-59ac-4c21-ab23-7fd1bfa02f60 Apr 17 23:44:30.161248 systemd[1]: sysroot-oem.mount: Deactivated successfully. Apr 17 23:44:30.189968 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. 
Apr 17 23:44:30.192992 ignition[1061]: INFO : Ignition 2.19.0 Apr 17 23:44:30.192992 ignition[1061]: INFO : Stage: mount Apr 17 23:44:30.192992 ignition[1061]: INFO : no configs at "/usr/lib/ignition/base.d" Apr 17 23:44:30.192992 ignition[1061]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/azure" Apr 17 23:44:30.210829 ignition[1061]: INFO : mount: mount passed Apr 17 23:44:30.210829 ignition[1061]: INFO : Ignition finished successfully Apr 17 23:44:30.198967 systemd[1]: Finished ignition-mount.service - Ignition (mount). Apr 17 23:44:30.216643 systemd[1]: Starting ignition-files.service - Ignition (files)... Apr 17 23:44:30.235679 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Apr 17 23:44:30.252498 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/sda6 scanned by mount (1073) Apr 17 23:44:30.259426 kernel: BTRFS info (device sda6): first mount of filesystem a5a0fe13-59ac-4c21-ab23-7fd1bfa02f60 Apr 17 23:44:30.259475 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Apr 17 23:44:30.262127 kernel: BTRFS info (device sda6): using free space tree Apr 17 23:44:30.268499 kernel: BTRFS info (device sda6): auto enabling async discard Apr 17 23:44:30.270512 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
Apr 17 23:44:30.301779 ignition[1090]: INFO : Ignition 2.19.0 Apr 17 23:44:30.301779 ignition[1090]: INFO : Stage: files Apr 17 23:44:30.306604 ignition[1090]: INFO : no configs at "/usr/lib/ignition/base.d" Apr 17 23:44:30.306604 ignition[1090]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/azure" Apr 17 23:44:30.306604 ignition[1090]: DEBUG : files: compiled without relabeling support, skipping Apr 17 23:44:30.317831 ignition[1090]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Apr 17 23:44:30.317831 ignition[1090]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Apr 17 23:44:30.460829 ignition[1090]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Apr 17 23:44:30.465188 ignition[1090]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Apr 17 23:44:30.469453 unknown[1090]: wrote ssh authorized keys file for user: core Apr 17 23:44:30.472773 ignition[1090]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Apr 17 23:44:30.472773 ignition[1090]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/etc/flatcar-cgroupv1" Apr 17 23:44:30.472773 ignition[1090]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/etc/flatcar-cgroupv1" Apr 17 23:44:30.472773 ignition[1090]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/opt/helm-v3.17.3-linux-amd64.tar.gz" Apr 17 23:44:30.472773 ignition[1090]: INFO : files: createFilesystemsFiles: createFiles: op(4): GET https://get.helm.sh/helm-v3.17.3-linux-amd64.tar.gz: attempt #1 Apr 17 23:44:30.796438 ignition[1090]: INFO : files: createFilesystemsFiles: createFiles: op(4): GET result: OK Apr 17 23:44:30.915880 ignition[1090]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/opt/helm-v3.17.3-linux-amd64.tar.gz" 
Apr 17 23:44:30.921900 ignition[1090]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/install.sh" Apr 17 23:44:30.927513 ignition[1090]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/install.sh" Apr 17 23:44:30.927513 ignition[1090]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nginx.yaml" Apr 17 23:44:30.938739 ignition[1090]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nginx.yaml" Apr 17 23:44:30.944010 ignition[1090]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Apr 17 23:44:30.949625 ignition[1090]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Apr 17 23:44:30.955210 ignition[1090]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Apr 17 23:44:30.960604 ignition[1090]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Apr 17 23:44:30.960604 ignition[1090]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing file "/sysroot/etc/flatcar/update.conf" Apr 17 23:44:30.960604 ignition[1090]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing file "/sysroot/etc/flatcar/update.conf" Apr 17 23:44:30.960604 ignition[1090]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.8-x86-64.raw" Apr 17 23:44:30.960604 ignition[1090]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.8-x86-64.raw" Apr 17 
23:44:30.960604 ignition[1090]: INFO : files: createFilesystemsFiles: createFiles: op(b): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.8-x86-64.raw" Apr 17 23:44:30.960604 ignition[1090]: INFO : files: createFilesystemsFiles: createFiles: op(b): GET https://extensions.flatcar.org/extensions/kubernetes-v1.33.8-x86-64.raw: attempt #1 Apr 17 23:44:31.264559 ignition[1090]: INFO : files: createFilesystemsFiles: createFiles: op(b): GET result: OK Apr 17 23:44:31.618355 ignition[1090]: INFO : files: createFilesystemsFiles: createFiles: op(b): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.8-x86-64.raw" Apr 17 23:44:31.618355 ignition[1090]: INFO : files: op(c): [started] processing unit "containerd.service" Apr 17 23:44:31.640780 ignition[1090]: INFO : files: op(c): op(d): [started] writing systemd drop-in "10-use-cgroupfs.conf" at "/sysroot/etc/systemd/system/containerd.service.d/10-use-cgroupfs.conf" Apr 17 23:44:31.652094 ignition[1090]: INFO : files: op(c): op(d): [finished] writing systemd drop-in "10-use-cgroupfs.conf" at "/sysroot/etc/systemd/system/containerd.service.d/10-use-cgroupfs.conf" Apr 17 23:44:31.652094 ignition[1090]: INFO : files: op(c): [finished] processing unit "containerd.service" Apr 17 23:44:31.652094 ignition[1090]: INFO : files: op(e): [started] processing unit "prepare-helm.service" Apr 17 23:44:31.652094 ignition[1090]: INFO : files: op(e): op(f): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Apr 17 23:44:31.652094 ignition[1090]: INFO : files: op(e): op(f): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Apr 17 23:44:31.652094 ignition[1090]: INFO : files: op(e): [finished] processing unit "prepare-helm.service" Apr 17 23:44:31.652094 ignition[1090]: INFO : files: op(10): [started] setting preset to enabled for "prepare-helm.service" Apr 17 23:44:31.652094 ignition[1090]: 
INFO : files: op(10): [finished] setting preset to enabled for "prepare-helm.service" Apr 17 23:44:31.652094 ignition[1090]: INFO : files: createResultFile: createFiles: op(11): [started] writing file "/sysroot/etc/.ignition-result.json" Apr 17 23:44:31.652094 ignition[1090]: INFO : files: createResultFile: createFiles: op(11): [finished] writing file "/sysroot/etc/.ignition-result.json" Apr 17 23:44:31.652094 ignition[1090]: INFO : files: files passed Apr 17 23:44:31.652094 ignition[1090]: INFO : Ignition finished successfully Apr 17 23:44:31.646366 systemd[1]: Finished ignition-files.service - Ignition (files). Apr 17 23:44:31.667708 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Apr 17 23:44:31.681070 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Apr 17 23:44:31.691230 systemd[1]: ignition-quench.service: Deactivated successfully. Apr 17 23:44:31.691332 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Apr 17 23:44:31.742805 initrd-setup-root-after-ignition[1118]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Apr 17 23:44:31.742805 initrd-setup-root-after-ignition[1118]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Apr 17 23:44:31.753064 initrd-setup-root-after-ignition[1122]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Apr 17 23:44:31.754949 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Apr 17 23:44:31.762696 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Apr 17 23:44:31.776732 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Apr 17 23:44:31.808685 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Apr 17 23:44:31.808811 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. 
Apr 17 23:44:31.815653 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Apr 17 23:44:31.822285 systemd[1]: Reached target initrd.target - Initrd Default Target. Apr 17 23:44:31.825693 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Apr 17 23:44:31.837647 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Apr 17 23:44:31.853567 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Apr 17 23:44:31.864671 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Apr 17 23:44:31.879018 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Apr 17 23:44:31.886511 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Apr 17 23:44:31.890347 systemd[1]: Stopped target timers.target - Timer Units. Apr 17 23:44:31.896834 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Apr 17 23:44:31.897019 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Apr 17 23:44:31.903813 systemd[1]: Stopped target initrd.target - Initrd Default Target. Apr 17 23:44:31.907113 systemd[1]: Stopped target basic.target - Basic System. Apr 17 23:44:31.913008 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Apr 17 23:44:31.928535 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Apr 17 23:44:31.935502 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Apr 17 23:44:31.935697 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Apr 17 23:44:31.936203 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Apr 17 23:44:31.936736 systemd[1]: Stopped target sysinit.target - System Initialization. Apr 17 23:44:31.937326 systemd[1]: Stopped target local-fs.target - Local File Systems. 
Apr 17 23:44:31.938366 systemd[1]: Stopped target swap.target - Swaps. Apr 17 23:44:31.938825 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Apr 17 23:44:31.938981 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Apr 17 23:44:31.939901 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Apr 17 23:44:31.940525 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Apr 17 23:44:31.940971 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Apr 17 23:44:31.967698 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Apr 17 23:44:31.981677 systemd[1]: dracut-initqueue.service: Deactivated successfully. Apr 17 23:44:31.981865 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Apr 17 23:44:32.016728 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Apr 17 23:44:32.020596 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Apr 17 23:44:32.020852 systemd[1]: ignition-files.service: Deactivated successfully. Apr 17 23:44:32.020963 systemd[1]: Stopped ignition-files.service - Ignition (files). Apr 17 23:44:32.021318 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully. Apr 17 23:44:32.021415 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. Apr 17 23:44:32.048575 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Apr 17 23:44:32.057635 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Apr 17 23:44:32.063701 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. 
Apr 17 23:44:32.071728 ignition[1142]: INFO : Ignition 2.19.0 Apr 17 23:44:32.071728 ignition[1142]: INFO : Stage: umount Apr 17 23:44:32.071728 ignition[1142]: INFO : no configs at "/usr/lib/ignition/base.d" Apr 17 23:44:32.071728 ignition[1142]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/azure" Apr 17 23:44:32.063908 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Apr 17 23:44:32.071898 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Apr 17 23:44:32.074219 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Apr 17 23:44:32.090345 ignition[1142]: INFO : umount: umount passed Apr 17 23:44:32.090345 ignition[1142]: INFO : Ignition finished successfully Apr 17 23:44:32.106213 systemd[1]: sysroot-boot.mount: Deactivated successfully. Apr 17 23:44:32.110225 systemd[1]: ignition-mount.service: Deactivated successfully. Apr 17 23:44:32.113296 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Apr 17 23:44:32.121884 systemd[1]: initrd-cleanup.service: Deactivated successfully. Apr 17 23:44:32.122006 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Apr 17 23:44:32.129855 systemd[1]: ignition-disks.service: Deactivated successfully. Apr 17 23:44:32.129994 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Apr 17 23:44:32.135718 systemd[1]: ignition-kargs.service: Deactivated successfully. Apr 17 23:44:32.135794 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Apr 17 23:44:32.149797 systemd[1]: ignition-fetch.service: Deactivated successfully. Apr 17 23:44:32.149876 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch). Apr 17 23:44:32.158576 systemd[1]: Stopped target network.target - Network. Apr 17 23:44:32.163929 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Apr 17 23:44:32.164022 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). 
Apr 17 23:44:32.174370 systemd[1]: Stopped target paths.target - Path Units. Apr 17 23:44:32.174476 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Apr 17 23:44:32.179803 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Apr 17 23:44:32.183690 systemd[1]: Stopped target slices.target - Slice Units. Apr 17 23:44:32.183833 systemd[1]: Stopped target sockets.target - Socket Units. Apr 17 23:44:32.184358 systemd[1]: iscsid.socket: Deactivated successfully. Apr 17 23:44:32.184405 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Apr 17 23:44:32.185381 systemd[1]: iscsiuio.socket: Deactivated successfully. Apr 17 23:44:32.185416 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Apr 17 23:44:32.185874 systemd[1]: ignition-setup.service: Deactivated successfully. Apr 17 23:44:32.185923 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Apr 17 23:44:32.186394 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Apr 17 23:44:32.186430 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Apr 17 23:44:32.187056 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Apr 17 23:44:32.187451 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Apr 17 23:44:32.218574 systemd-networkd[905]: eth0: DHCPv6 lease lost Apr 17 23:44:32.221936 systemd[1]: systemd-networkd.service: Deactivated successfully. Apr 17 23:44:32.222070 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Apr 17 23:44:32.234324 systemd[1]: systemd-resolved.service: Deactivated successfully. Apr 17 23:44:32.234435 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Apr 17 23:44:32.243075 systemd[1]: systemd-networkd.socket: Deactivated successfully. Apr 17 23:44:32.243150 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. 
Apr 17 23:44:32.281074 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Apr 17 23:44:32.285087 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Apr 17 23:44:32.285186 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Apr 17 23:44:32.292737 systemd[1]: systemd-sysctl.service: Deactivated successfully. Apr 17 23:44:32.292806 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Apr 17 23:44:32.299195 systemd[1]: systemd-modules-load.service: Deactivated successfully. Apr 17 23:44:32.299262 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Apr 17 23:44:32.299389 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Apr 17 23:44:32.299431 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Apr 17 23:44:32.300508 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Apr 17 23:44:32.329493 systemd[1]: systemd-udevd.service: Deactivated successfully. Apr 17 23:44:32.329665 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Apr 17 23:44:32.337037 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Apr 17 23:44:32.337138 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Apr 17 23:44:32.364567 kernel: hv_netvsc 000d3ada-bb1d-000d-3ada-bb1d000d3ada eth0: Data path switched from VF: enP55636s1 Apr 17 23:44:32.365866 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Apr 17 23:44:32.365931 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Apr 17 23:44:32.374969 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Apr 17 23:44:32.375056 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Apr 17 23:44:32.384591 systemd[1]: dracut-cmdline.service: Deactivated successfully. 
Apr 17 23:44:32.384675 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Apr 17 23:44:32.393647 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Apr 17 23:44:32.393716 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Apr 17 23:44:32.414651 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Apr 17 23:44:32.421620 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Apr 17 23:44:32.421702 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Apr 17 23:44:32.428906 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully. Apr 17 23:44:32.436425 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Apr 17 23:44:32.448208 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Apr 17 23:44:32.448284 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Apr 17 23:44:32.455528 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Apr 17 23:44:32.455591 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Apr 17 23:44:32.469887 systemd[1]: network-cleanup.service: Deactivated successfully. Apr 17 23:44:32.472773 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Apr 17 23:44:32.479074 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Apr 17 23:44:32.479192 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Apr 17 23:44:32.592303 systemd[1]: sysroot-boot.service: Deactivated successfully. Apr 17 23:44:32.592434 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Apr 17 23:44:32.594067 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Apr 17 23:44:32.594541 systemd[1]: initrd-setup-root.service: Deactivated successfully. 
Apr 17 23:44:32.594593 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Apr 17 23:44:32.609740 systemd[1]: Starting initrd-switch-root.service - Switch Root... Apr 17 23:44:33.089613 systemd[1]: Switching root. Apr 17 23:44:33.123824 systemd-journald[177]: Journal stopped Apr 17 23:44:22.141433 kernel: Secure boot disabled Apr 17 23:44:22.141444 kernel: ACPI: Early table checksum verification disabled Apr 17 23:44:22.141458 kernel: ACPI: RSDP 0x000000003FFFA014 000024 (v02 VRTUAL) Apr 17 23:44:22.141466 kernel: ACPI: XSDT 0x000000003FFF90E8 00005C (v01 VRTUAL MICROSFT 00000001 MSFT 00000001) Apr 17 23:44:22.141474 kernel: ACPI: FACP 0x000000003FFF8000 000114 (v06 VRTUAL MICROSFT 00000001 MSFT 00000001) Apr 17 23:44:22.141484 kernel: ACPI: DSDT 0x000000003FFD6000 01E22B (v02 MSFTVM DSDT01 00000001 INTL 20230628) Apr 17 23:44:22.141493 kernel: ACPI: FACS 0x000000003FFFE000 000040 Apr 17 23:44:22.141501 kernel: ACPI: OEM0 0x000000003FFF7000 000064 (v01 VRTUAL MICROSFT 00000001 MSFT 00000001) Apr 17 23:44:22.141509 kernel: ACPI: SPCR 0x000000003FFF6000 000050 (v02 VRTUAL MICROSFT 00000001 MSFT 00000001) Apr 17 23:44:22.141522 kernel: ACPI: WAET 0x000000003FFF5000 000028 (v01 VRTUAL MICROSFT 00000001 MSFT 00000001) Apr 17 23:44:22.141530 kernel: ACPI: APIC 0x000000003FFD5000 000058 (v04 VRTUAL MICROSFT 00000001 MSFT 00000001) Apr 17 23:44:22.141541 kernel: ACPI: SRAT 0x000000003FFD4000 0001E0 (v02 VRTUAL MICROSFT 00000001 MSFT 00000001) Apr 17 23:44:22.141549 kernel: ACPI: BGRT 0x000000003FFD3000 000038 (v01 VRTUAL MICROSFT 00000001 MSFT 00000001) Apr 17 23:44:22.141557 kernel: ACPI: Reserving FACP table memory at [mem 0x3fff8000-0x3fff8113] Apr 17 23:44:22.141568 kernel: ACPI: Reserving DSDT table memory at [mem 0x3ffd6000-0x3fff422a] Apr 17 23:44:22.141576 kernel: ACPI: Reserving FACS table memory at [mem 0x3fffe000-0x3fffe03f] Apr 17 23:44:22.141583 kernel: ACPI: Reserving OEM0 table memory at [mem 0x3fff7000-0x3fff7063] Apr 17 23:44:22.141594 kernel: ACPI: Reserving SPCR 
table memory at [mem 0x3fff6000-0x3fff604f] Apr 17 23:44:22.141605 kernel: ACPI: Reserving WAET table memory at [mem 0x3fff5000-0x3fff5027] Apr 17 23:44:22.141612 kernel: ACPI: Reserving APIC table memory at [mem 0x3ffd5000-0x3ffd5057] Apr 17 23:44:22.141624 kernel: ACPI: Reserving SRAT table memory at [mem 0x3ffd4000-0x3ffd41df] Apr 17 23:44:22.141632 kernel: ACPI: Reserving BGRT table memory at [mem 0x3ffd3000-0x3ffd3037] Apr 17 23:44:22.141639 kernel: SRAT: PXM 0 -> APIC 0x00 -> Node 0 Apr 17 23:44:22.141651 kernel: SRAT: PXM 0 -> APIC 0x01 -> Node 0 Apr 17 23:44:22.141659 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00000000-0x3fffffff] hotplug Apr 17 23:44:22.141666 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x100000000-0x2bfffffff] hotplug Apr 17 23:44:22.141678 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x2c0000000-0xfdfffffff] hotplug Apr 17 23:44:22.141688 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x1000000000-0xffffffffff] hotplug Apr 17 23:44:22.141699 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x10000000000-0x1ffffffffff] hotplug Apr 17 23:44:22.141707 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x20000000000-0x3ffffffffff] hotplug Apr 17 23:44:22.141719 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x40000000000-0x7ffffffffff] hotplug Apr 17 23:44:22.141726 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x80000000000-0xfffffffffff] hotplug Apr 17 23:44:22.141734 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x100000000000-0x1fffffffffff] hotplug Apr 17 23:44:22.141745 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x200000000000-0x3fffffffffff] hotplug Apr 17 23:44:22.141753 kernel: NUMA: Node 0 [mem 0x00000000-0x3fffffff] + [mem 0x100000000-0x2bfffffff] -> [mem 0x00000000-0x2bfffffff] Apr 17 23:44:22.141766 kernel: NODE_DATA(0) allocated [mem 0x2bfffa000-0x2bfffffff] Apr 17 23:44:22.141775 kernel: Zone ranges: Apr 17 23:44:22.141786 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff] Apr 17 23:44:22.141794 kernel: DMA32 [mem 0x0000000001000000-0x00000000ffffffff] Apr 17 23:44:22.141806 kernel: Normal [mem 
0x0000000100000000-0x00000002bfffffff] Apr 17 23:44:22.141816 kernel: Movable zone start for each node Apr 17 23:44:22.141828 kernel: Early memory node ranges Apr 17 23:44:22.141841 kernel: node 0: [mem 0x0000000000001000-0x000000000009ffff] Apr 17 23:44:22.141856 kernel: node 0: [mem 0x0000000000100000-0x000000000437dfff] Apr 17 23:44:22.141880 kernel: node 0: [mem 0x000000000477e000-0x000000003ff1efff] Apr 17 23:44:22.141912 kernel: node 0: [mem 0x000000003ffff000-0x000000003fffffff] Apr 17 23:44:22.141929 kernel: node 0: [mem 0x0000000100000000-0x00000002bfffffff] Apr 17 23:44:22.141946 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x00000002bfffffff] Apr 17 23:44:22.141963 kernel: On node 0, zone DMA: 1 pages in unavailable ranges Apr 17 23:44:22.141977 kernel: On node 0, zone DMA: 96 pages in unavailable ranges Apr 17 23:44:22.141991 kernel: On node 0, zone DMA32: 1024 pages in unavailable ranges Apr 17 23:44:22.142005 kernel: On node 0, zone DMA32: 224 pages in unavailable ranges Apr 17 23:44:22.142019 kernel: ACPI: PM-Timer IO Port: 0x408 Apr 17 23:44:22.142038 kernel: ACPI: LAPIC_NMI (acpi_id[0x01] dfl dfl lint[0x1]) Apr 17 23:44:22.142053 kernel: IOAPIC[0]: apic_id 2, version 17, address 0xfec00000, GSI 0-23 Apr 17 23:44:22.142069 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level) Apr 17 23:44:22.142085 kernel: ACPI: Using ACPI (MADT) for SMP configuration information Apr 17 23:44:22.142101 kernel: ACPI: SPCR: console: uart,io,0x3f8,115200 Apr 17 23:44:22.142117 kernel: smpboot: Allowing 2 CPUs, 0 hotplug CPUs Apr 17 23:44:22.142133 kernel: [mem 0x40000000-0xffffffff] available for PCI devices Apr 17 23:44:22.142147 kernel: Booting paravirtualized kernel on Hyper-V Apr 17 23:44:22.142162 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns Apr 17 23:44:22.142181 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:2 nr_cpu_ids:2 nr_node_ids:1 Apr 17 23:44:22.142199 kernel: 
percpu: Embedded 57 pages/cpu s196328 r8192 d28952 u1048576 Apr 17 23:44:22.142213 kernel: pcpu-alloc: s196328 r8192 d28952 u1048576 alloc=1*2097152 Apr 17 23:44:22.142228 kernel: pcpu-alloc: [0] 0 1 Apr 17 23:44:22.142242 kernel: Hyper-V: PV spinlocks enabled Apr 17 23:44:22.142257 kernel: PV qspinlock hash table entries: 256 (order: 0, 4096 bytes, linear) Apr 17 23:44:22.142273 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty1 console=ttyS0,115200n8 earlyprintk=ttyS0,115200 flatcar.first_boot=detected flatcar.oem.id=azure flatcar.autologin verity.usrhash=e69cfa144bf8cf6f0b7e7881c91c17228ba9dbcb6c99d9692bced9ddba34ee3a Apr 17 23:44:22.142288 kernel: random: crng init done Apr 17 23:44:22.142308 kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear) Apr 17 23:44:22.142324 kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear) Apr 17 23:44:22.142341 kernel: Fallback order for Node 0: 0 Apr 17 23:44:22.142355 kernel: Built 1 zonelists, mobility grouping on. Total pages: 2061321 Apr 17 23:44:22.142370 kernel: Policy zone: Normal Apr 17 23:44:22.142386 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off Apr 17 23:44:22.142402 kernel: software IO TLB: area num 2. Apr 17 23:44:22.142419 kernel: Memory: 8066036K/8383228K available (12288K kernel code, 2288K rwdata, 22752K rodata, 42892K init, 2304K bss, 316932K reserved, 0K cma-reserved) Apr 17 23:44:22.142436 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1 Apr 17 23:44:22.142475 kernel: ftrace: allocating 37996 entries in 149 pages Apr 17 23:44:22.142492 kernel: ftrace: allocated 149 pages with 4 groups Apr 17 23:44:22.142511 kernel: Dynamic Preempt: voluntary Apr 17 23:44:22.142534 kernel: rcu: Preemptible hierarchical RCU implementation. 
Apr 17 23:44:22.142551 kernel: rcu: RCU event tracing is enabled. Apr 17 23:44:22.142564 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2. Apr 17 23:44:22.142577 kernel: Trampoline variant of Tasks RCU enabled. Apr 17 23:44:22.142589 kernel: Rude variant of Tasks RCU enabled. Apr 17 23:44:22.142601 kernel: Tracing variant of Tasks RCU enabled. Apr 17 23:44:22.142617 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies. Apr 17 23:44:22.142629 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2 Apr 17 23:44:22.142643 kernel: Using NULL legacy PIC Apr 17 23:44:22.142657 kernel: NR_IRQS: 33024, nr_irqs: 440, preallocated irqs: 0 Apr 17 23:44:22.142672 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention. Apr 17 23:44:22.142685 kernel: Console: colour dummy device 80x25 Apr 17 23:44:22.142700 kernel: printk: console [tty1] enabled Apr 17 23:44:22.142716 kernel: printk: console [ttyS0] enabled Apr 17 23:44:22.142738 kernel: printk: bootconsole [earlyser0] disabled Apr 17 23:44:22.142751 kernel: ACPI: Core revision 20230628 Apr 17 23:44:22.142765 kernel: Failed to register legacy timer interrupt Apr 17 23:44:22.142778 kernel: APIC: Switch to symmetric I/O mode setup Apr 17 23:44:22.142792 kernel: Hyper-V: enabling crash_kexec_post_notifiers Apr 17 23:44:22.142812 kernel: Hyper-V: Using IPI hypercalls Apr 17 23:44:22.142828 kernel: APIC: send_IPI() replaced with hv_send_ipi() Apr 17 23:44:22.142840 kernel: APIC: send_IPI_mask() replaced with hv_send_ipi_mask() Apr 17 23:44:22.142854 kernel: APIC: send_IPI_mask_allbutself() replaced with hv_send_ipi_mask_allbutself() Apr 17 23:44:22.142871 kernel: APIC: send_IPI_allbutself() replaced with hv_send_ipi_allbutself() Apr 17 23:44:22.142884 kernel: APIC: send_IPI_all() replaced with hv_send_ipi_all() Apr 17 23:44:22.142914 kernel: APIC: send_IPI_self() replaced with hv_send_ipi_self() Apr 17 23:44:22.142930 kernel: Calibrating delay loop (skipped), 
value calculated using timer frequency.. 5187.81 BogoMIPS (lpj=2593905) Apr 17 23:44:22.142946 kernel: Last level iTLB entries: 4KB 64, 2MB 8, 4MB 8 Apr 17 23:44:22.142960 kernel: Last level dTLB entries: 4KB 64, 2MB 32, 4MB 32, 1GB 4 Apr 17 23:44:22.142973 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization Apr 17 23:44:22.142986 kernel: Spectre V2 : Mitigation: Retpolines Apr 17 23:44:22.142998 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT Apr 17 23:44:22.143011 kernel: RETBleed: WARNING: Spectre v2 mitigation leaves CPU vulnerable to RETBleed attacks, data leaks possible! Apr 17 23:44:22.143028 kernel: RETBleed: Vulnerable Apr 17 23:44:22.143041 kernel: Speculative Store Bypass: Vulnerable Apr 17 23:44:22.143054 kernel: TAA: Vulnerable: Clear CPU buffers attempted, no microcode Apr 17 23:44:22.143068 kernel: MMIO Stale Data: Vulnerable: Clear CPU buffers attempted, no microcode Apr 17 23:44:22.143080 kernel: active return thunk: its_return_thunk Apr 17 23:44:22.143094 kernel: ITS: Mitigation: Aligned branch/return thunks Apr 17 23:44:22.143107 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers' Apr 17 23:44:22.143120 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers' Apr 17 23:44:22.143134 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers' Apr 17 23:44:22.143147 kernel: x86/fpu: Supporting XSAVE feature 0x020: 'AVX-512 opmask' Apr 17 23:44:22.143163 kernel: x86/fpu: Supporting XSAVE feature 0x040: 'AVX-512 Hi256' Apr 17 23:44:22.143176 kernel: x86/fpu: Supporting XSAVE feature 0x080: 'AVX-512 ZMM_Hi256' Apr 17 23:44:22.143190 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256 Apr 17 23:44:22.143203 kernel: x86/fpu: xstate_offset[5]: 832, xstate_sizes[5]: 64 Apr 17 23:44:22.143217 kernel: x86/fpu: xstate_offset[6]: 896, xstate_sizes[6]: 512 Apr 17 23:44:22.143230 kernel: x86/fpu: xstate_offset[7]: 1408, xstate_sizes[7]: 
1024 Apr 17 23:44:22.143243 kernel: x86/fpu: Enabled xstate features 0xe7, context size is 2432 bytes, using 'compacted' format. Apr 17 23:44:22.143255 kernel: Freeing SMP alternatives memory: 32K Apr 17 23:44:22.143267 kernel: pid_max: default: 32768 minimum: 301 Apr 17 23:44:22.143281 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,integrity Apr 17 23:44:22.143296 kernel: landlock: Up and running. Apr 17 23:44:22.143310 kernel: SELinux: Initializing. Apr 17 23:44:22.143327 kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear) Apr 17 23:44:22.143341 kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear) Apr 17 23:44:22.143356 kernel: smpboot: CPU0: Intel(R) Xeon(R) Platinum 8272CL CPU @ 2.60GHz (family: 0x6, model: 0x55, stepping: 0x7) Apr 17 23:44:22.143371 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. Apr 17 23:44:22.143386 kernel: RCU Tasks Rude: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. Apr 17 23:44:22.143400 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. Apr 17 23:44:22.143416 kernel: Performance Events: unsupported p6 CPU model 85 no PMU driver, software events only. Apr 17 23:44:22.143428 kernel: signal: max sigframe size: 3632 Apr 17 23:44:22.143443 kernel: rcu: Hierarchical SRCU implementation. Apr 17 23:44:22.143460 kernel: rcu: Max phase no-delay instances is 400. Apr 17 23:44:22.143475 kernel: NMI watchdog: Perf NMI watchdog permanently disabled Apr 17 23:44:22.143488 kernel: smp: Bringing up secondary CPUs ... Apr 17 23:44:22.143502 kernel: smpboot: x86: Booting SMP configuration: Apr 17 23:44:22.143518 kernel: .... node #0, CPUs: #1 Apr 17 23:44:22.143531 kernel: TAA CPU bug present and SMT on, data leak possible. See https://www.kernel.org/doc/html/latest/admin-guide/hw-vuln/tsx_async_abort.html for more details. 
Apr 17 23:44:22.143546 kernel: MMIO Stale Data CPU bug present and SMT on, data leak possible. See https://www.kernel.org/doc/html/latest/admin-guide/hw-vuln/processor_mmio_stale_data.html for more details. Apr 17 23:44:22.143560 kernel: smp: Brought up 1 node, 2 CPUs Apr 17 23:44:22.143573 kernel: smpboot: Max logical packages: 1 Apr 17 23:44:22.143590 kernel: smpboot: Total of 2 processors activated (10375.62 BogoMIPS) Apr 17 23:44:22.143604 kernel: devtmpfs: initialized Apr 17 23:44:22.143618 kernel: x86/mm: Memory block size: 128MB Apr 17 23:44:22.143633 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x3fffb000-0x3fffefff] (16384 bytes) Apr 17 23:44:22.143648 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns Apr 17 23:44:22.143664 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear) Apr 17 23:44:22.143677 kernel: pinctrl core: initialized pinctrl subsystem Apr 17 23:44:22.143691 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family Apr 17 23:44:22.143706 kernel: audit: initializing netlink subsys (disabled) Apr 17 23:44:22.143725 kernel: audit: type=2000 audit(1776469460.029:1): state=initialized audit_enabled=0 res=1 Apr 17 23:44:22.143740 kernel: thermal_sys: Registered thermal governor 'step_wise' Apr 17 23:44:22.143756 kernel: thermal_sys: Registered thermal governor 'user_space' Apr 17 23:44:22.143771 kernel: cpuidle: using governor menu Apr 17 23:44:22.143784 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 Apr 17 23:44:22.143797 kernel: dca service started, version 1.12.1 Apr 17 23:44:22.143811 kernel: e820: reserve RAM buffer [mem 0x0437e000-0x07ffffff] Apr 17 23:44:22.143824 kernel: e820: reserve RAM buffer [mem 0x3ff1f000-0x3fffffff] Apr 17 23:44:22.143839 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible. 
Apr 17 23:44:22.143857 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages Apr 17 23:44:22.143873 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page Apr 17 23:44:22.143886 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages Apr 17 23:44:22.143919 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page Apr 17 23:44:22.143934 kernel: ACPI: Added _OSI(Module Device) Apr 17 23:44:22.143950 kernel: ACPI: Added _OSI(Processor Device) Apr 17 23:44:22.143966 kernel: ACPI: Added _OSI(Processor Aggregator Device) Apr 17 23:44:22.143984 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded Apr 17 23:44:22.144004 kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC Apr 17 23:44:22.144018 kernel: ACPI: Interpreter enabled Apr 17 23:44:22.144031 kernel: ACPI: PM: (supports S0 S5) Apr 17 23:44:22.144046 kernel: ACPI: Using IOAPIC for interrupt routing Apr 17 23:44:22.144069 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug Apr 17 23:44:22.144082 kernel: PCI: Ignoring E820 reservations for host bridge windows Apr 17 23:44:22.144096 kernel: ACPI: Enabled 1 GPEs in block 00 to 0F Apr 17 23:44:22.144108 kernel: iommu: Default domain type: Translated Apr 17 23:44:22.144124 kernel: iommu: DMA domain TLB invalidation policy: lazy mode Apr 17 23:44:22.144138 kernel: efivars: Registered efivars operations Apr 17 23:44:22.144157 kernel: PCI: Using ACPI for IRQ routing Apr 17 23:44:22.144172 kernel: PCI: System does not support PCI Apr 17 23:44:22.144187 kernel: vgaarb: loaded Apr 17 23:44:22.144202 kernel: clocksource: Switched to clocksource hyperv_clocksource_tsc_page Apr 17 23:44:22.144217 kernel: VFS: Disk quotas dquot_6.6.0 Apr 17 23:44:22.144232 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Apr 17 23:44:22.144247 kernel: pnp: PnP ACPI init Apr 17 23:44:22.144262 kernel: pnp: PnP ACPI: found 3 devices Apr 17 23:44:22.144277 kernel: 
clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns Apr 17 23:44:22.144294 kernel: NET: Registered PF_INET protocol family Apr 17 23:44:22.144307 kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear) Apr 17 23:44:22.144321 kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear) Apr 17 23:44:22.144335 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Apr 17 23:44:22.144350 kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear) Apr 17 23:44:22.144366 kernel: TCP bind hash table entries: 65536 (order: 9, 2097152 bytes, linear) Apr 17 23:44:22.144381 kernel: TCP: Hash tables configured (established 65536 bind 65536) Apr 17 23:44:22.144396 kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear) Apr 17 23:44:22.144411 kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear) Apr 17 23:44:22.144426 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Apr 17 23:44:22.144439 kernel: NET: Registered PF_XDP protocol family Apr 17 23:44:22.144452 kernel: PCI: CLS 0 bytes, default 64 Apr 17 23:44:22.144467 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB) Apr 17 23:44:22.144479 kernel: software IO TLB: mapped [mem 0x000000003a878000-0x000000003e878000] (64MB) Apr 17 23:44:22.144493 kernel: RAPL PMU: API unit is 2^-32 Joules, 0 fixed counters, 10737418240 ms ovfl timer Apr 17 23:44:22.144508 kernel: Initialise system trusted keyrings Apr 17 23:44:22.144522 kernel: workingset: timestamp_bits=39 max_order=21 bucket_order=0 Apr 17 23:44:22.144539 kernel: Key type asymmetric registered Apr 17 23:44:22.144553 kernel: Asymmetric key parser 'x509' registered Apr 17 23:44:22.144567 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 251) Apr 17 23:44:22.144581 kernel: io scheduler mq-deadline registered Apr 17 23:44:22.144594 kernel: io scheduler kyber registered Apr 
17 23:44:22.144608 kernel: io scheduler bfq registered Apr 17 23:44:22.144623 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00 Apr 17 23:44:22.144637 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Apr 17 23:44:22.144652 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Apr 17 23:44:22.144667 kernel: 00:01: ttyS1 at I/O 0x2f8 (irq = 3, base_baud = 115200) is a 16550A Apr 17 23:44:22.144686 kernel: i8042: PNP: No PS/2 controller found. Apr 17 23:44:22.144887 kernel: rtc_cmos 00:02: registered as rtc0 Apr 17 23:44:22.150099 kernel: rtc_cmos 00:02: setting system clock to 2026-04-17T23:44:21 UTC (1776469461) Apr 17 23:44:22.150198 kernel: rtc_cmos 00:02: alarms up to one month, 114 bytes nvram Apr 17 23:44:22.150213 kernel: intel_pstate: CPU model not supported Apr 17 23:44:22.150223 kernel: efifb: probing for efifb Apr 17 23:44:22.150232 kernel: efifb: framebuffer at 0x40000000, using 3072k, total 3072k Apr 17 23:44:22.150248 kernel: efifb: mode is 1024x768x32, linelength=4096, pages=1 Apr 17 23:44:22.150257 kernel: efifb: scrolling: redraw Apr 17 23:44:22.150265 kernel: efifb: Truecolor: size=8:8:8:8, shift=24:16:8:0 Apr 17 23:44:22.150278 kernel: Console: switching to colour frame buffer device 128x48 Apr 17 23:44:22.150286 kernel: fb0: EFI VGA frame buffer device Apr 17 23:44:22.150298 kernel: pstore: Using crash dump compression: deflate Apr 17 23:44:22.150307 kernel: pstore: Registered efi_pstore as persistent store backend Apr 17 23:44:22.150315 kernel: NET: Registered PF_INET6 protocol family Apr 17 23:44:22.150326 kernel: Segment Routing with IPv6 Apr 17 23:44:22.150339 kernel: In-situ OAM (IOAM) with IPv6 Apr 17 23:44:22.150347 kernel: NET: Registered PF_PACKET protocol family Apr 17 23:44:22.150360 kernel: Key type dns_resolver registered Apr 17 23:44:22.150368 kernel: IPI shorthand broadcast: enabled Apr 17 23:44:22.150376 kernel: sched_clock: Marking stable (897003300, 52245900)->(1185621200, -236372000) 
Apr 17 23:44:22.150388 kernel: registered taskstats version 1 Apr 17 23:44:22.150397 kernel: Loading compiled-in X.509 certificates Apr 17 23:44:22.150405 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.6.127-flatcar: 39e9969c7f49062f0fc1d1fb72e8f874436eb94f' Apr 17 23:44:22.150417 kernel: Key type .fscrypt registered Apr 17 23:44:22.150428 kernel: Key type fscrypt-provisioning registered Apr 17 23:44:22.150440 kernel: ima: No TPM chip found, activating TPM-bypass! Apr 17 23:44:22.150448 kernel: ima: Allocated hash algorithm: sha1 Apr 17 23:44:22.150460 kernel: ima: No architecture policies found Apr 17 23:44:22.150469 kernel: clk: Disabling unused clocks Apr 17 23:44:22.150477 kernel: Freeing unused kernel image (initmem) memory: 42892K Apr 17 23:44:22.150488 kernel: Write protecting the kernel read-only data: 36864k Apr 17 23:44:22.150498 kernel: Freeing unused kernel image (rodata/data gap) memory: 1824K Apr 17 23:44:22.150506 kernel: Run /init as init process Apr 17 23:44:22.150521 kernel: with arguments: Apr 17 23:44:22.150529 kernel: /init Apr 17 23:44:22.150537 kernel: with environment: Apr 17 23:44:22.150549 kernel: HOME=/ Apr 17 23:44:22.150557 kernel: TERM=linux Apr 17 23:44:22.150568 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified) Apr 17 23:44:22.150583 systemd[1]: Detected virtualization microsoft. Apr 17 23:44:22.150591 systemd[1]: Detected architecture x86-64. Apr 17 23:44:22.150606 systemd[1]: Running in initrd. Apr 17 23:44:22.150615 systemd[1]: No hostname configured, using default hostname. Apr 17 23:44:22.150623 systemd[1]: Hostname set to . Apr 17 23:44:22.150637 systemd[1]: Initializing machine ID from random generator. 
Apr 17 23:44:22.150645 systemd[1]: Queued start job for default target initrd.target. Apr 17 23:44:22.150654 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Apr 17 23:44:22.150667 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Apr 17 23:44:22.150676 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Apr 17 23:44:22.150691 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Apr 17 23:44:22.150700 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Apr 17 23:44:22.150709 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Apr 17 23:44:22.150724 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Apr 17 23:44:22.150733 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Apr 17 23:44:22.150745 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Apr 17 23:44:22.150754 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Apr 17 23:44:22.150766 systemd[1]: Reached target paths.target - Path Units. Apr 17 23:44:22.150776 systemd[1]: Reached target slices.target - Slice Units. Apr 17 23:44:22.150787 systemd[1]: Reached target swap.target - Swaps. Apr 17 23:44:22.150796 systemd[1]: Reached target timers.target - Timer Units. Apr 17 23:44:22.150808 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Apr 17 23:44:22.150817 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Apr 17 23:44:22.150826 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). 
Apr 17 23:44:22.150837 systemd[1]: Listening on systemd-journald.socket - Journal Socket. Apr 17 23:44:22.150848 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Apr 17 23:44:22.150859 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Apr 17 23:44:22.150867 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Apr 17 23:44:22.150876 systemd[1]: Reached target sockets.target - Socket Units. Apr 17 23:44:22.150885 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Apr 17 23:44:22.150902 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Apr 17 23:44:22.150910 systemd[1]: Finished network-cleanup.service - Network Cleanup. Apr 17 23:44:22.150919 systemd[1]: Starting systemd-fsck-usr.service... Apr 17 23:44:22.150928 systemd[1]: Starting systemd-journald.service - Journal Service... Apr 17 23:44:22.150939 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Apr 17 23:44:22.150969 systemd-journald[177]: Collecting audit messages is disabled. Apr 17 23:44:22.150994 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Apr 17 23:44:22.151005 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Apr 17 23:44:22.151020 systemd-journald[177]: Journal started Apr 17 23:44:22.151043 systemd-journald[177]: Runtime Journal (/run/log/journal/173f793e28074cde90a83b4d3406971a) is 8.0M, max 158.7M, 150.7M free. Apr 17 23:44:22.163336 systemd[1]: Started systemd-journald.service - Journal Service. Apr 17 23:44:22.168406 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Apr 17 23:44:22.176854 systemd[1]: Finished systemd-fsck-usr.service. Apr 17 23:44:22.179595 systemd-modules-load[178]: Inserted module 'overlay' Apr 17 23:44:22.185790 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. 
Apr 17 23:44:22.204138 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Apr 17 23:44:22.213524 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Apr 17 23:44:22.217975 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Apr 17 23:44:22.242733 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Apr 17 23:44:22.248345 kernel: Bridge firewalling registered Apr 17 23:44:22.248156 systemd-modules-load[178]: Inserted module 'br_netfilter' Apr 17 23:44:22.248814 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Apr 17 23:44:22.251524 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Apr 17 23:44:22.266784 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Apr 17 23:44:22.274430 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Apr 17 23:44:22.283160 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Apr 17 23:44:22.287164 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Apr 17 23:44:22.302118 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Apr 17 23:44:22.313115 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... 
Apr 17 23:44:22.319585 dracut-cmdline[207]: dracut-dracut-053 Apr 17 23:44:22.319585 dracut-cmdline[207]: Using kernel command line parameters: rd.driver.pre=btrfs rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty1 console=ttyS0,115200n8 earlyprintk=ttyS0,115200 flatcar.first_boot=detected flatcar.oem.id=azure flatcar.autologin verity.usrhash=e69cfa144bf8cf6f0b7e7881c91c17228ba9dbcb6c99d9692bced9ddba34ee3a Apr 17 23:44:22.354415 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Apr 17 23:44:22.369051 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Apr 17 23:44:22.413915 kernel: SCSI subsystem initialized Apr 17 23:44:22.416037 systemd-resolved[265]: Positive Trust Anchors: Apr 17 23:44:22.416057 systemd-resolved[265]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Apr 17 23:44:22.416113 systemd-resolved[265]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Apr 17 23:44:22.454241 kernel: Loading iSCSI transport class v2.0-870. Apr 17 23:44:22.445380 systemd-resolved[265]: Defaulting to hostname 'linux'. Apr 17 23:44:22.446733 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Apr 17 23:44:22.450611 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. 
Apr 17 23:44:22.466911 kernel: iscsi: registered transport (tcp) Apr 17 23:44:22.489621 kernel: iscsi: registered transport (qla4xxx) Apr 17 23:44:22.489699 kernel: QLogic iSCSI HBA Driver Apr 17 23:44:22.526468 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Apr 17 23:44:22.542073 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Apr 17 23:44:22.571806 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Apr 17 23:44:22.571917 kernel: device-mapper: uevent: version 1.0.3 Apr 17 23:44:22.575820 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@redhat.com Apr 17 23:44:22.617922 kernel: raid6: avx512x4 gen() 18208 MB/s Apr 17 23:44:22.636901 kernel: raid6: avx512x2 gen() 18174 MB/s Apr 17 23:44:22.655900 kernel: raid6: avx512x1 gen() 18260 MB/s Apr 17 23:44:22.675909 kernel: raid6: avx2x4 gen() 18187 MB/s Apr 17 23:44:22.694901 kernel: raid6: avx2x2 gen() 18209 MB/s Apr 17 23:44:22.715602 kernel: raid6: avx2x1 gen() 13690 MB/s Apr 17 23:44:22.715632 kernel: raid6: using algorithm avx512x1 gen() 18260 MB/s Apr 17 23:44:22.738059 kernel: raid6: .... xor() 26904 MB/s, rmw enabled Apr 17 23:44:22.738089 kernel: raid6: using avx512x2 recovery algorithm Apr 17 23:44:22.759918 kernel: xor: automatically using best checksumming function avx Apr 17 23:44:22.907919 kernel: Btrfs loaded, zoned=no, fsverity=no Apr 17 23:44:22.917305 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Apr 17 23:44:22.928155 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Apr 17 23:44:22.946938 systemd-udevd[396]: Using default interface naming scheme 'v255'. Apr 17 23:44:22.951553 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Apr 17 23:44:22.966100 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... 
Apr 17 23:44:22.983469 dracut-pre-trigger[407]: rd.md=0: removing MD RAID activation Apr 17 23:44:23.011059 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Apr 17 23:44:23.026091 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Apr 17 23:44:23.070049 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Apr 17 23:44:23.082126 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Apr 17 23:44:23.114156 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Apr 17 23:44:23.122964 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Apr 17 23:44:23.135093 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Apr 17 23:44:23.142218 systemd[1]: Reached target remote-fs.target - Remote File Systems. Apr 17 23:44:23.153941 kernel: cryptd: max_cpu_qlen set to 1000 Apr 17 23:44:23.156316 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Apr 17 23:44:23.185388 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Apr 17 23:44:23.202910 kernel: hv_vmbus: Vmbus version:5.2 Apr 17 23:44:23.202971 kernel: AVX2 version of gcm_enc/dec engaged. Apr 17 23:44:23.207158 kernel: AES CTR mode by8 optimization enabled Apr 17 23:44:23.214229 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Apr 17 23:44:23.214364 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Apr 17 23:44:23.226169 systemd[1]: Stopping dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Apr 17 23:44:23.246160 kernel: pps_core: LinuxPPS API ver. 1 registered Apr 17 23:44:23.246188 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti Apr 17 23:44:23.238263 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. 
Apr 17 23:44:23.238456 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Apr 17 23:44:23.246117 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Apr 17 23:44:23.265241 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Apr 17 23:44:23.291630 kernel: PTP clock support registered Apr 17 23:44:23.296158 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Apr 17 23:44:23.305491 kernel: hv_vmbus: registering driver hyperv_keyboard Apr 17 23:44:23.297056 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Apr 17 23:44:23.314989 kernel: input: AT Translated Set 2 keyboard as /devices/LNXSYSTM:00/LNXSYBUS:00/ACPI0004:00/MSFT1000:00/d34b2567-b9b6-42b9-8778-0a4ec0b955bf/serio0/input/input0 Apr 17 23:44:23.320142 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Apr 17 23:44:23.332905 kernel: hid: raw HID events driver (C) Jiri Kosina Apr 17 23:44:23.339049 kernel: hv_utils: Registering HyperV Utility Driver Apr 17 23:44:23.339080 kernel: hv_vmbus: registering driver hv_utils Apr 17 23:44:23.342600 kernel: hv_vmbus: registering driver hv_storvsc Apr 17 23:44:23.342629 kernel: hv_vmbus: registering driver hv_netvsc Apr 17 23:44:23.355916 kernel: scsi host0: storvsc_host_t Apr 17 23:44:23.355969 kernel: scsi host1: storvsc_host_t Apr 17 23:44:23.361886 kernel: scsi 0:0:0:0: Direct-Access Msft Virtual Disk 1.0 PQ: 0 ANSI: 5 Apr 17 23:44:23.364455 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. 
Apr 17 23:44:23.375011 kernel: scsi 0:0:0:2: CD-ROM Msft Virtual DVD-ROM 1.0 PQ: 0 ANSI: 5 Apr 17 23:44:23.382020 kernel: hv_utils: Shutdown IC version 3.2 Apr 17 23:44:23.382060 kernel: hv_utils: TimeSync IC version 4.0 Apr 17 23:44:23.384396 kernel: hv_utils: Heartbeat IC version 3.0 Apr 17 23:44:23.384424 kernel: hv_vmbus: registering driver hid_hyperv Apr 17 23:44:23.972977 systemd-resolved[265]: Clock change detected. Flushing caches. Apr 17 23:44:23.980527 kernel: input: Microsoft Vmbus HID-compliant Mouse as /devices/0006:045E:0621.0001/input/input1 Apr 17 23:44:23.980718 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Apr 17 23:44:23.991582 kernel: hid-hyperv 0006:045E:0621.0001: input: VIRTUAL HID v0.01 Mouse [Microsoft Vmbus HID-compliant Mouse] on Apr 17 23:44:24.013016 kernel: sr 0:0:0:2: [sr0] scsi-1 drive Apr 17 23:44:24.013216 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20 Apr 17 23:44:24.016496 kernel: sr 0:0:0:2: Attached scsi CD-ROM sr0 Apr 17 23:44:24.033419 kernel: sd 0:0:0:0: [sda] 63737856 512-byte logical blocks: (32.6 GB/30.4 GiB) Apr 17 23:44:24.033746 kernel: sd 0:0:0:0: [sda] 4096-byte physical blocks Apr 17 23:44:24.031370 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. 
Apr 17 23:44:24.043047 kernel: sd 0:0:0:0: [sda] Write Protect is off Apr 17 23:44:24.043220 kernel: sd 0:0:0:0: [sda] Mode Sense: 0f 00 10 00 Apr 17 23:44:24.043327 kernel: sd 0:0:0:0: [sda] Write cache: disabled, read cache: enabled, supports DPO and FUA Apr 17 23:44:24.052052 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Apr 17 23:44:24.052093 kernel: sd 0:0:0:0: [sda] Attached SCSI disk Apr 17 23:44:24.062503 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#210 cmd 0x85 status: scsi 0x2 srb 0x6 hv 0xc0000001 Apr 17 23:44:24.062670 kernel: hv_netvsc 000d3ada-bb1d-000d-3ada-bb1d000d3ada eth0: VF slot 1 added Apr 17 23:44:24.076743 kernel: hv_vmbus: registering driver hv_pci Apr 17 23:44:24.076797 kernel: hv_pci 158243a4-d954-4bec-bd98-a32beb36bf61: PCI VMBus probing: Using version 0x10004 Apr 17 23:44:24.086493 kernel: hv_pci 158243a4-d954-4bec-bd98-a32beb36bf61: PCI host bridge to bus d954:00 Apr 17 23:44:24.095347 kernel: pci_bus d954:00: root bus resource [mem 0xfe0000000-0xfe00fffff window] Apr 17 23:44:24.095605 kernel: pci_bus d954:00: No busn resource found for root bus, will use [bus 00-ff] Apr 17 23:44:24.108545 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#215 cmd 0x85 status: scsi 0x2 srb 0x6 hv 0xc0000001 Apr 17 23:44:24.108720 kernel: pci d954:00:02.0: [15b3:1016] type 00 class 0x020000 Apr 17 23:44:24.113526 kernel: pci d954:00:02.0: reg 0x10: [mem 0xfe0000000-0xfe00fffff 64bit pref] Apr 17 23:44:24.118565 kernel: pci d954:00:02.0: enabling Extended Tags Apr 17 23:44:24.131515 kernel: pci d954:00:02.0: 0.000 Gb/s available PCIe bandwidth, limited by Unknown x0 link at d954:00:02.0 (capable of 63.008 Gb/s with 8.0 GT/s PCIe x8 link) Apr 17 23:44:24.138453 kernel: pci_bus d954:00: busn_res: [bus 00-ff] end is updated to 00 Apr 17 23:44:24.138753 kernel: pci d954:00:02.0: BAR 0: assigned [mem 0xfe0000000-0xfe00fffff 64bit pref] Apr 17 23:44:24.304039 kernel: mlx5_core d954:00:02.0: enabling device (0000 -> 0002) Apr 17 
23:44:24.308502 kernel: mlx5_core d954:00:02.0: firmware version: 14.30.5026 Apr 17 23:44:24.518457 kernel: hv_netvsc 000d3ada-bb1d-000d-3ada-bb1d000d3ada eth0: VF registering: eth1 Apr 17 23:44:24.518826 kernel: mlx5_core d954:00:02.0 eth1: joined to eth0 Apr 17 23:44:24.524542 kernel: mlx5_core d954:00:02.0: MLX5E: StrdRq(0) RqSz(1024) StrdSz(256) RxCqeCmprss(0 basic) Apr 17 23:44:24.534503 kernel: mlx5_core d954:00:02.0 enP55636s1: renamed from eth1 Apr 17 23:44:24.565504 kernel: BTRFS: device label OEM devid 1 transid 10 /dev/sda6 scanned by (udev-worker) (441) Apr 17 23:44:24.590280 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_Disk OEM. Apr 17 23:44:24.602777 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - Virtual_Disk EFI-SYSTEM. Apr 17 23:44:24.647742 kernel: BTRFS: device fsid 81b0bf8a-1550-4880-b72f-76fa51dbb6c0 devid 1 transid 32 /dev/sda3 scanned by (udev-worker) (442) Apr 17 23:44:24.661276 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - Virtual_Disk ROOT. Apr 17 23:44:24.678048 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - Virtual_Disk USR-A. Apr 17 23:44:24.681957 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - Virtual_Disk USR-A. Apr 17 23:44:24.696665 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Apr 17 23:44:24.717539 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Apr 17 23:44:24.725494 kernel: GPT:disk_guids don't match. Apr 17 23:44:24.725537 kernel: GPT: Use GNU Parted to correct GPT errors. Apr 17 23:44:24.725551 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Apr 17 23:44:24.737498 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Apr 17 23:44:25.738501 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Apr 17 23:44:25.739046 disk-uuid[608]: The operation has completed successfully. Apr 17 23:44:25.814659 systemd[1]: disk-uuid.service: Deactivated successfully. 
Apr 17 23:44:25.814777 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Apr 17 23:44:25.842666 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Apr 17 23:44:25.850130 sh[721]: Success Apr 17 23:44:25.879518 kernel: device-mapper: verity: sha256 using implementation "sha256-avx2" Apr 17 23:44:26.138062 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Apr 17 23:44:26.151627 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Apr 17 23:44:26.157237 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. Apr 17 23:44:26.176226 kernel: BTRFS info (device dm-0): first mount of filesystem 81b0bf8a-1550-4880-b72f-76fa51dbb6c0 Apr 17 23:44:26.176299 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Apr 17 23:44:26.180345 kernel: BTRFS warning (device dm-0): 'nologreplay' is deprecated, use 'rescue=nologreplay' instead Apr 17 23:44:26.184291 kernel: BTRFS info (device dm-0): disabling log replay at mount time Apr 17 23:44:26.187631 kernel: BTRFS info (device dm-0): using free space tree Apr 17 23:44:26.436880 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Apr 17 23:44:26.437807 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Apr 17 23:44:26.450741 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Apr 17 23:44:26.459437 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... 
Apr 17 23:44:26.487446 kernel: BTRFS info (device sda6): first mount of filesystem a5a0fe13-59ac-4c21-ab23-7fd1bfa02f60 Apr 17 23:44:26.487524 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Apr 17 23:44:26.487548 kernel: BTRFS info (device sda6): using free space tree Apr 17 23:44:26.521507 kernel: BTRFS info (device sda6): auto enabling async discard Apr 17 23:44:26.539149 kernel: BTRFS info (device sda6): last unmount of filesystem a5a0fe13-59ac-4c21-ab23-7fd1bfa02f60 Apr 17 23:44:26.538759 systemd[1]: mnt-oem.mount: Deactivated successfully. Apr 17 23:44:26.552904 systemd[1]: Finished ignition-setup.service - Ignition (setup). Apr 17 23:44:26.561663 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Apr 17 23:44:26.569498 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Apr 17 23:44:26.577639 systemd[1]: Starting systemd-networkd.service - Network Configuration... Apr 17 23:44:26.601277 systemd-networkd[905]: lo: Link UP Apr 17 23:44:26.601289 systemd-networkd[905]: lo: Gained carrier Apr 17 23:44:26.603503 systemd-networkd[905]: Enumeration completed Apr 17 23:44:26.603613 systemd[1]: Started systemd-networkd.service - Network Configuration. Apr 17 23:44:26.604161 systemd[1]: Reached target network.target - Network. Apr 17 23:44:26.606028 systemd-networkd[905]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Apr 17 23:44:26.606033 systemd-networkd[905]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. 
Apr 17 23:44:26.681502 kernel: mlx5_core d954:00:02.0 enP55636s1: Link up Apr 17 23:44:26.712505 kernel: hv_netvsc 000d3ada-bb1d-000d-3ada-bb1d000d3ada eth0: Data path switched to VF: enP55636s1 Apr 17 23:44:26.712741 systemd-networkd[905]: enP55636s1: Link UP Apr 17 23:44:26.712868 systemd-networkd[905]: eth0: Link UP Apr 17 23:44:26.713034 systemd-networkd[905]: eth0: Gained carrier Apr 17 23:44:26.713047 systemd-networkd[905]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Apr 17 23:44:26.717706 systemd-networkd[905]: enP55636s1: Gained carrier Apr 17 23:44:26.757904 systemd-networkd[905]: eth0: DHCPv4 address 10.0.0.10/24, gateway 10.0.0.1 acquired from 168.63.129.16 Apr 17 23:44:27.569549 ignition[903]: Ignition 2.19.0 Apr 17 23:44:27.569562 ignition[903]: Stage: fetch-offline Apr 17 23:44:27.569619 ignition[903]: no configs at "/usr/lib/ignition/base.d" Apr 17 23:44:27.573871 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Apr 17 23:44:27.569631 ignition[903]: no config dir at "/usr/lib/ignition/base.platform.d/azure" Apr 17 23:44:27.569770 ignition[903]: parsed url from cmdline: "" Apr 17 23:44:27.569775 ignition[903]: no config URL provided Apr 17 23:44:27.569782 ignition[903]: reading system config file "/usr/lib/ignition/user.ign" Apr 17 23:44:27.569794 ignition[903]: no config at "/usr/lib/ignition/user.ign" Apr 17 23:44:27.569800 ignition[903]: failed to fetch config: resource requires networking Apr 17 23:44:27.571882 ignition[903]: Ignition finished successfully Apr 17 23:44:27.600770 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... 
Apr 17 23:44:27.618357 ignition[913]: Ignition 2.19.0 Apr 17 23:44:27.618371 ignition[913]: Stage: fetch Apr 17 23:44:27.618626 ignition[913]: no configs at "/usr/lib/ignition/base.d" Apr 17 23:44:27.618640 ignition[913]: no config dir at "/usr/lib/ignition/base.platform.d/azure" Apr 17 23:44:27.620826 ignition[913]: parsed url from cmdline: "" Apr 17 23:44:27.620832 ignition[913]: no config URL provided Apr 17 23:44:27.620840 ignition[913]: reading system config file "/usr/lib/ignition/user.ign" Apr 17 23:44:27.620853 ignition[913]: no config at "/usr/lib/ignition/user.ign" Apr 17 23:44:27.620880 ignition[913]: GET http://169.254.169.254/metadata/instance/compute/userData?api-version=2021-01-01&format=text: attempt #1 Apr 17 23:44:27.686689 ignition[913]: GET result: OK Apr 17 23:44:27.686781 ignition[913]: config has been read from IMDS userdata Apr 17 23:44:27.686814 ignition[913]: parsing config with SHA512: 103ab94cbf6e3c53b6af9eda365f9c901f8901394d3dca1ae5a5a7ff1bf7aeb727897437c1fb92c5664f1c06d3dd11a75077c18da4ffc980d15d233748fdc04f Apr 17 23:44:27.691001 unknown[913]: fetched base config from "system" Apr 17 23:44:27.691444 ignition[913]: fetch: fetch complete Apr 17 23:44:27.691013 unknown[913]: fetched base config from "system" Apr 17 23:44:27.691449 ignition[913]: fetch: fetch passed Apr 17 23:44:27.691018 unknown[913]: fetched user config from "azure" Apr 17 23:44:27.691517 ignition[913]: Ignition finished successfully Apr 17 23:44:27.706659 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). Apr 17 23:44:27.716661 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Apr 17 23:44:27.733670 ignition[919]: Ignition 2.19.0 Apr 17 23:44:27.733684 ignition[919]: Stage: kargs Apr 17 23:44:27.733916 ignition[919]: no configs at "/usr/lib/ignition/base.d" Apr 17 23:44:27.737726 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). 
Apr 17 23:44:27.733929 ignition[919]: no config dir at "/usr/lib/ignition/base.platform.d/azure" Apr 17 23:44:27.735282 ignition[919]: kargs: kargs passed Apr 17 23:44:27.751224 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Apr 17 23:44:27.735335 ignition[919]: Ignition finished successfully Apr 17 23:44:27.766626 ignition[925]: Ignition 2.19.0 Apr 17 23:44:27.766638 ignition[925]: Stage: disks Apr 17 23:44:27.766850 ignition[925]: no configs at "/usr/lib/ignition/base.d" Apr 17 23:44:27.769340 systemd[1]: Finished ignition-disks.service - Ignition (disks). Apr 17 23:44:27.766863 ignition[925]: no config dir at "/usr/lib/ignition/base.platform.d/azure" Apr 17 23:44:27.771868 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Apr 17 23:44:27.767820 ignition[925]: disks: disks passed Apr 17 23:44:27.772513 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Apr 17 23:44:27.767862 ignition[925]: Ignition finished successfully Apr 17 23:44:27.772974 systemd[1]: Reached target local-fs.target - Local File Systems. Apr 17 23:44:27.773458 systemd[1]: Reached target sysinit.target - System Initialization. Apr 17 23:44:27.774453 systemd[1]: Reached target basic.target - Basic System. Apr 17 23:44:27.799713 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Apr 17 23:44:27.896784 systemd-fsck[933]: ROOT: clean, 14/7326000 files, 477710/7359488 blocks Apr 17 23:44:27.904158 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Apr 17 23:44:27.915668 systemd[1]: Mounting sysroot.mount - /sysroot... Apr 17 23:44:28.009502 kernel: EXT4-fs (sda9): mounted filesystem d3c199f8-8065-4f33-a75b-da2f09d4fc39 r/w with ordered data mode. Quota mode: none. Apr 17 23:44:28.010372 systemd[1]: Mounted sysroot.mount - /sysroot. Apr 17 23:44:28.013672 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. 
Apr 17 23:44:28.057570 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Apr 17 23:44:28.080856 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 scanned by mount (944)
Apr 17 23:44:28.080932 kernel: BTRFS info (device sda6): first mount of filesystem a5a0fe13-59ac-4c21-ab23-7fd1bfa02f60
Apr 17 23:44:28.080586 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr...
Apr 17 23:44:28.089338 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm
Apr 17 23:44:28.089367 kernel: BTRFS info (device sda6): using free space tree
Apr 17 23:44:28.098661 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent...
Apr 17 23:44:28.109109 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot).
Apr 17 23:44:28.116007 kernel: BTRFS info (device sda6): auto enabling async discard
Apr 17 23:44:28.109158 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup.
Apr 17 23:44:28.124404 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr.
Apr 17 23:44:28.130158 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Apr 17 23:44:28.144626 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
Apr 17 23:44:28.371758 systemd-networkd[905]: eth0: Gained IPv6LL
Apr 17 23:44:28.678791 coreos-metadata[948]: Apr 17 23:44:28.678 INFO Fetching http://168.63.129.16/?comp=versions: Attempt #1
Apr 17 23:44:28.685293 coreos-metadata[948]: Apr 17 23:44:28.685 INFO Fetch successful
Apr 17 23:44:28.688606 coreos-metadata[948]: Apr 17 23:44:28.685 INFO Fetching http://169.254.169.254/metadata/instance/compute/name?api-version=2017-08-01&format=text: Attempt #1
Apr 17 23:44:28.705494 coreos-metadata[948]: Apr 17 23:44:28.705 INFO Fetch successful
Apr 17 23:44:28.720546 coreos-metadata[948]: Apr 17 23:44:28.720 INFO wrote hostname ci-4081.3.6-n-7b570e9a3c to /sysroot/etc/hostname
Apr 17 23:44:28.730710 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
Apr 17 23:44:29.141612 initrd-setup-root[973]: cut: /sysroot/etc/passwd: No such file or directory
Apr 17 23:44:29.165200 initrd-setup-root[980]: cut: /sysroot/etc/group: No such file or directory
Apr 17 23:44:29.170718 initrd-setup-root[987]: cut: /sysroot/etc/shadow: No such file or directory
Apr 17 23:44:29.178287 initrd-setup-root[994]: cut: /sysroot/etc/gshadow: No such file or directory
Apr 17 23:44:30.130899 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
Apr 17 23:44:30.140717 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
Apr 17 23:44:30.149667 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
Apr 17 23:44:30.156837 kernel: BTRFS info (device sda6): last unmount of filesystem a5a0fe13-59ac-4c21-ab23-7fd1bfa02f60
Apr 17 23:44:30.161248 systemd[1]: sysroot-oem.mount: Deactivated successfully.
Apr 17 23:44:30.189968 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
Apr 17 23:44:30.192992 ignition[1061]: INFO : Ignition 2.19.0
Apr 17 23:44:30.192992 ignition[1061]: INFO : Stage: mount
Apr 17 23:44:30.192992 ignition[1061]: INFO : no configs at "/usr/lib/ignition/base.d"
Apr 17 23:44:30.192992 ignition[1061]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/azure"
Apr 17 23:44:30.210829 ignition[1061]: INFO : mount: mount passed
Apr 17 23:44:30.210829 ignition[1061]: INFO : Ignition finished successfully
Apr 17 23:44:30.198967 systemd[1]: Finished ignition-mount.service - Ignition (mount).
Apr 17 23:44:30.216643 systemd[1]: Starting ignition-files.service - Ignition (files)...
Apr 17 23:44:30.235679 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Apr 17 23:44:30.252498 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/sda6 scanned by mount (1073)
Apr 17 23:44:30.259426 kernel: BTRFS info (device sda6): first mount of filesystem a5a0fe13-59ac-4c21-ab23-7fd1bfa02f60
Apr 17 23:44:30.259475 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm
Apr 17 23:44:30.262127 kernel: BTRFS info (device sda6): using free space tree
Apr 17 23:44:30.268499 kernel: BTRFS info (device sda6): auto enabling async discard
Apr 17 23:44:30.270512 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Apr 17 23:44:30.301779 ignition[1090]: INFO : Ignition 2.19.0
Apr 17 23:44:30.301779 ignition[1090]: INFO : Stage: files
Apr 17 23:44:30.306604 ignition[1090]: INFO : no configs at "/usr/lib/ignition/base.d"
Apr 17 23:44:30.306604 ignition[1090]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/azure"
Apr 17 23:44:30.306604 ignition[1090]: DEBUG : files: compiled without relabeling support, skipping
Apr 17 23:44:30.317831 ignition[1090]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Apr 17 23:44:30.317831 ignition[1090]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Apr 17 23:44:30.460829 ignition[1090]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Apr 17 23:44:30.465188 ignition[1090]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Apr 17 23:44:30.469453 unknown[1090]: wrote ssh authorized keys file for user: core
Apr 17 23:44:30.472773 ignition[1090]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Apr 17 23:44:30.472773 ignition[1090]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/etc/flatcar-cgroupv1"
Apr 17 23:44:30.472773 ignition[1090]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/etc/flatcar-cgroupv1"
Apr 17 23:44:30.472773 ignition[1090]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/opt/helm-v3.17.3-linux-amd64.tar.gz"
Apr 17 23:44:30.472773 ignition[1090]: INFO : files: createFilesystemsFiles: createFiles: op(4): GET https://get.helm.sh/helm-v3.17.3-linux-amd64.tar.gz: attempt #1
Apr 17 23:44:30.796438 ignition[1090]: INFO : files: createFilesystemsFiles: createFiles: op(4): GET result: OK
Apr 17 23:44:30.915880 ignition[1090]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/opt/helm-v3.17.3-linux-amd64.tar.gz"
Apr 17 23:44:30.921900 ignition[1090]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/install.sh"
Apr 17 23:44:30.927513 ignition[1090]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/install.sh"
Apr 17 23:44:30.927513 ignition[1090]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nginx.yaml"
Apr 17 23:44:30.938739 ignition[1090]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nginx.yaml"
Apr 17 23:44:30.944010 ignition[1090]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Apr 17 23:44:30.949625 ignition[1090]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Apr 17 23:44:30.955210 ignition[1090]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Apr 17 23:44:30.960604 ignition[1090]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Apr 17 23:44:30.960604 ignition[1090]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing file "/sysroot/etc/flatcar/update.conf"
Apr 17 23:44:30.960604 ignition[1090]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Apr 17 23:44:30.960604 ignition[1090]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.8-x86-64.raw"
Apr 17 23:44:30.960604 ignition[1090]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.8-x86-64.raw"
Apr 17 23:44:30.960604 ignition[1090]: INFO : files: createFilesystemsFiles: createFiles: op(b): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.8-x86-64.raw"
Apr 17 23:44:30.960604 ignition[1090]: INFO : files: createFilesystemsFiles: createFiles: op(b): GET https://extensions.flatcar.org/extensions/kubernetes-v1.33.8-x86-64.raw: attempt #1
Apr 17 23:44:31.264559 ignition[1090]: INFO : files: createFilesystemsFiles: createFiles: op(b): GET result: OK
Apr 17 23:44:31.618355 ignition[1090]: INFO : files: createFilesystemsFiles: createFiles: op(b): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.8-x86-64.raw"
Apr 17 23:44:31.618355 ignition[1090]: INFO : files: op(c): [started] processing unit "containerd.service"
Apr 17 23:44:31.640780 ignition[1090]: INFO : files: op(c): op(d): [started] writing systemd drop-in "10-use-cgroupfs.conf" at "/sysroot/etc/systemd/system/containerd.service.d/10-use-cgroupfs.conf"
Apr 17 23:44:31.652094 ignition[1090]: INFO : files: op(c): op(d): [finished] writing systemd drop-in "10-use-cgroupfs.conf" at "/sysroot/etc/systemd/system/containerd.service.d/10-use-cgroupfs.conf"
Apr 17 23:44:31.652094 ignition[1090]: INFO : files: op(c): [finished] processing unit "containerd.service"
Apr 17 23:44:31.652094 ignition[1090]: INFO : files: op(e): [started] processing unit "prepare-helm.service"
Apr 17 23:44:31.652094 ignition[1090]: INFO : files: op(e): op(f): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Apr 17 23:44:31.652094 ignition[1090]: INFO : files: op(e): op(f): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Apr 17 23:44:31.652094 ignition[1090]: INFO : files: op(e): [finished] processing unit "prepare-helm.service"
Apr 17 23:44:31.652094 ignition[1090]: INFO : files: op(10): [started] setting preset to enabled for "prepare-helm.service"
Apr 17 23:44:31.652094 ignition[1090]: INFO : files: op(10): [finished] setting preset to enabled for "prepare-helm.service"
Apr 17 23:44:31.652094 ignition[1090]: INFO : files: createResultFile: createFiles: op(11): [started] writing file "/sysroot/etc/.ignition-result.json"
Apr 17 23:44:31.652094 ignition[1090]: INFO : files: createResultFile: createFiles: op(11): [finished] writing file "/sysroot/etc/.ignition-result.json"
Apr 17 23:44:31.652094 ignition[1090]: INFO : files: files passed
Apr 17 23:44:31.652094 ignition[1090]: INFO : Ignition finished successfully
Apr 17 23:44:31.646366 systemd[1]: Finished ignition-files.service - Ignition (files).
Apr 17 23:44:31.667708 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Apr 17 23:44:31.681070 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Apr 17 23:44:31.691230 systemd[1]: ignition-quench.service: Deactivated successfully.
Apr 17 23:44:31.691332 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Apr 17 23:44:31.742805 initrd-setup-root-after-ignition[1118]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Apr 17 23:44:31.742805 initrd-setup-root-after-ignition[1118]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Apr 17 23:44:31.753064 initrd-setup-root-after-ignition[1122]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Apr 17 23:44:31.754949 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Apr 17 23:44:31.762696 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Apr 17 23:44:31.776732 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Apr 17 23:44:31.808685 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Apr 17 23:44:31.808811 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Apr 17 23:44:31.815653 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Apr 17 23:44:31.822285 systemd[1]: Reached target initrd.target - Initrd Default Target.
Apr 17 23:44:31.825693 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Apr 17 23:44:31.837647 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Apr 17 23:44:31.853567 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Apr 17 23:44:31.864671 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Apr 17 23:44:31.879018 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Apr 17 23:44:31.886511 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Apr 17 23:44:31.890347 systemd[1]: Stopped target timers.target - Timer Units.
Apr 17 23:44:31.896834 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Apr 17 23:44:31.897019 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Apr 17 23:44:31.903813 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Apr 17 23:44:31.907113 systemd[1]: Stopped target basic.target - Basic System.
Apr 17 23:44:31.913008 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Apr 17 23:44:31.928535 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Apr 17 23:44:31.935502 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Apr 17 23:44:31.935697 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Apr 17 23:44:31.936203 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Apr 17 23:44:31.936736 systemd[1]: Stopped target sysinit.target - System Initialization.
Apr 17 23:44:31.937326 systemd[1]: Stopped target local-fs.target - Local File Systems.
Apr 17 23:44:31.938366 systemd[1]: Stopped target swap.target - Swaps.
Apr 17 23:44:31.938825 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Apr 17 23:44:31.938981 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Apr 17 23:44:31.939901 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Apr 17 23:44:31.940525 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Apr 17 23:44:31.940971 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Apr 17 23:44:31.967698 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Apr 17 23:44:31.981677 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Apr 17 23:44:31.981865 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Apr 17 23:44:32.016728 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Apr 17 23:44:32.020596 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Apr 17 23:44:32.020852 systemd[1]: ignition-files.service: Deactivated successfully.
Apr 17 23:44:32.020963 systemd[1]: Stopped ignition-files.service - Ignition (files).
Apr 17 23:44:32.021318 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully.
Apr 17 23:44:32.021415 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
Apr 17 23:44:32.048575 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Apr 17 23:44:32.057635 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Apr 17 23:44:32.063701 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Apr 17 23:44:32.071728 ignition[1142]: INFO : Ignition 2.19.0
Apr 17 23:44:32.071728 ignition[1142]: INFO : Stage: umount
Apr 17 23:44:32.071728 ignition[1142]: INFO : no configs at "/usr/lib/ignition/base.d"
Apr 17 23:44:32.071728 ignition[1142]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/azure"
Apr 17 23:44:32.063908 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Apr 17 23:44:32.071898 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Apr 17 23:44:32.074219 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Apr 17 23:44:32.090345 ignition[1142]: INFO : umount: umount passed
Apr 17 23:44:32.090345 ignition[1142]: INFO : Ignition finished successfully
Apr 17 23:44:32.106213 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Apr 17 23:44:32.110225 systemd[1]: ignition-mount.service: Deactivated successfully.
Apr 17 23:44:32.113296 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Apr 17 23:44:32.121884 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Apr 17 23:44:32.122006 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Apr 17 23:44:32.129855 systemd[1]: ignition-disks.service: Deactivated successfully.
Apr 17 23:44:32.129994 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Apr 17 23:44:32.135718 systemd[1]: ignition-kargs.service: Deactivated successfully.
Apr 17 23:44:32.135794 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Apr 17 23:44:32.149797 systemd[1]: ignition-fetch.service: Deactivated successfully.
Apr 17 23:44:32.149876 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch).
Apr 17 23:44:32.158576 systemd[1]: Stopped target network.target - Network.
Apr 17 23:44:32.163929 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Apr 17 23:44:32.164022 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Apr 17 23:44:32.174370 systemd[1]: Stopped target paths.target - Path Units.
Apr 17 23:44:32.174476 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Apr 17 23:44:32.179803 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Apr 17 23:44:32.183690 systemd[1]: Stopped target slices.target - Slice Units.
Apr 17 23:44:32.183833 systemd[1]: Stopped target sockets.target - Socket Units.
Apr 17 23:44:32.184358 systemd[1]: iscsid.socket: Deactivated successfully.
Apr 17 23:44:32.184405 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Apr 17 23:44:32.185381 systemd[1]: iscsiuio.socket: Deactivated successfully.
Apr 17 23:44:32.185416 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Apr 17 23:44:32.185874 systemd[1]: ignition-setup.service: Deactivated successfully.
Apr 17 23:44:32.185923 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Apr 17 23:44:32.186394 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Apr 17 23:44:32.186430 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Apr 17 23:44:32.187056 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Apr 17 23:44:32.187451 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Apr 17 23:44:32.218574 systemd-networkd[905]: eth0: DHCPv6 lease lost
Apr 17 23:44:32.221936 systemd[1]: systemd-networkd.service: Deactivated successfully.
Apr 17 23:44:32.222070 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Apr 17 23:44:32.234324 systemd[1]: systemd-resolved.service: Deactivated successfully.
Apr 17 23:44:32.234435 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Apr 17 23:44:32.243075 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Apr 17 23:44:32.243150 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Apr 17 23:44:32.281074 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Apr 17 23:44:32.285087 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Apr 17 23:44:32.285186 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Apr 17 23:44:32.292737 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Apr 17 23:44:32.292806 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Apr 17 23:44:32.299195 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Apr 17 23:44:32.299262 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Apr 17 23:44:32.299389 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Apr 17 23:44:32.299431 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Apr 17 23:44:32.300508 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Apr 17 23:44:32.329493 systemd[1]: systemd-udevd.service: Deactivated successfully.
Apr 17 23:44:32.329665 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Apr 17 23:44:32.337037 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Apr 17 23:44:32.337138 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Apr 17 23:44:32.364567 kernel: hv_netvsc 000d3ada-bb1d-000d-3ada-bb1d000d3ada eth0: Data path switched from VF: enP55636s1
Apr 17 23:44:32.365866 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Apr 17 23:44:32.365931 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Apr 17 23:44:32.374969 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Apr 17 23:44:32.375056 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Apr 17 23:44:32.384591 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Apr 17 23:44:32.384675 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Apr 17 23:44:32.393647 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Apr 17 23:44:32.393716 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Apr 17 23:44:32.414651 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Apr 17 23:44:32.421620 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Apr 17 23:44:32.421702 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Apr 17 23:44:32.428906 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully.
Apr 17 23:44:32.436425 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Apr 17 23:44:32.448208 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Apr 17 23:44:32.448284 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Apr 17 23:44:32.455528 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Apr 17 23:44:32.455591 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Apr 17 23:44:32.469887 systemd[1]: network-cleanup.service: Deactivated successfully.
Apr 17 23:44:32.472773 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Apr 17 23:44:32.479074 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Apr 17 23:44:32.479192 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Apr 17 23:44:32.592303 systemd[1]: sysroot-boot.service: Deactivated successfully.
Apr 17 23:44:32.592434 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Apr 17 23:44:32.594067 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Apr 17 23:44:32.594541 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Apr 17 23:44:32.594593 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Apr 17 23:44:32.609740 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Apr 17 23:44:33.089613 systemd[1]: Switching root.
Apr 17 23:44:33.123824 systemd-journald[177]: Journal stopped
Apr 17 23:44:37.875163 systemd-journald[177]: Received SIGTERM from PID 1 (systemd).
Apr 17 23:44:37.875196 kernel: SELinux: policy capability network_peer_controls=1
Apr 17 23:44:37.875214 kernel: SELinux: policy capability open_perms=1
Apr 17 23:44:37.875227 kernel: SELinux: policy capability extended_socket_class=1
Apr 17 23:44:37.875236 kernel: SELinux: policy capability always_check_network=0
Apr 17 23:44:37.875248 kernel: SELinux: policy capability cgroup_seclabel=1
Apr 17 23:44:37.875258 kernel: SELinux: policy capability nnp_nosuid_transition=1
Apr 17 23:44:37.875270 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Apr 17 23:44:37.875282 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Apr 17 23:44:37.875291 kernel: audit: type=1403 audit(1776469474.776:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
Apr 17 23:44:37.875304 systemd[1]: Successfully loaded SELinux policy in 142.515ms.
Apr 17 23:44:37.875315 systemd[1]: Relabeled /dev, /dev/shm, /run, /sys/fs/cgroup in 9.521ms.
Apr 17 23:44:37.875330 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified)
Apr 17 23:44:37.875340 systemd[1]: Detected virtualization microsoft.
Apr 17 23:44:37.875357 systemd[1]: Detected architecture x86-64.
Apr 17 23:44:37.875367 systemd[1]: Detected first boot.
Apr 17 23:44:37.875381 systemd[1]: Hostname set to .
Apr 17 23:44:37.875391 systemd[1]: Initializing machine ID from random generator.
Apr 17 23:44:37.875405 zram_generator::config[1202]: No configuration found.
Apr 17 23:44:37.875418 systemd[1]: Populated /etc with preset unit settings.
Apr 17 23:44:37.875432 systemd[1]: Queued start job for default target multi-user.target.
Apr 17 23:44:37.875444 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6.
Apr 17 23:44:37.875459 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Apr 17 23:44:37.875470 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Apr 17 23:44:37.879389 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Apr 17 23:44:37.879418 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Apr 17 23:44:37.879441 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Apr 17 23:44:37.879451 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Apr 17 23:44:37.879467 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Apr 17 23:44:37.879494 systemd[1]: Created slice user.slice - User and Session Slice.
Apr 17 23:44:37.879510 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Apr 17 23:44:37.879525 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Apr 17 23:44:37.879539 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Apr 17 23:44:37.879554 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Apr 17 23:44:37.879573 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Apr 17 23:44:37.879588 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Apr 17 23:44:37.879603 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0...
Apr 17 23:44:37.879618 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Apr 17 23:44:37.879632 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Apr 17 23:44:37.879646 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Apr 17 23:44:37.879665 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Apr 17 23:44:37.879681 systemd[1]: Reached target slices.target - Slice Units.
Apr 17 23:44:37.879699 systemd[1]: Reached target swap.target - Swaps.
Apr 17 23:44:37.879717 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Apr 17 23:44:37.879733 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Apr 17 23:44:37.879749 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
Apr 17 23:44:37.879764 systemd[1]: Listening on systemd-journald.socket - Journal Socket.
Apr 17 23:44:37.879780 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Apr 17 23:44:37.879797 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Apr 17 23:44:37.879811 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Apr 17 23:44:37.879830 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Apr 17 23:44:37.879844 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Apr 17 23:44:37.879860 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Apr 17 23:44:37.879876 systemd[1]: Mounting media.mount - External Media Directory...
Apr 17 23:44:37.879892 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Apr 17 23:44:37.879911 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Apr 17 23:44:37.879927 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Apr 17 23:44:37.879946 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Apr 17 23:44:37.879969 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Apr 17 23:44:37.879986 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Apr 17 23:44:37.880002 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Apr 17 23:44:37.880019 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Apr 17 23:44:37.880037 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Apr 17 23:44:37.880119 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Apr 17 23:44:37.880137 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Apr 17 23:44:37.880154 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Apr 17 23:44:37.880171 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Apr 17 23:44:37.880189 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Apr 17 23:44:37.880205 systemd[1]: systemd-journald.service: unit configures an IP firewall, but the local system does not support BPF/cgroup firewalling.
Apr 17 23:44:37.880222 systemd[1]: systemd-journald.service: (This warning is only shown for the first unit using IP firewalling.)
Apr 17 23:44:37.880239 systemd[1]: Starting systemd-journald.service - Journal Service...
Apr 17 23:44:37.880262 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Apr 17 23:44:37.880279 kernel: fuse: init (API version 7.39)
Apr 17 23:44:37.880294 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Apr 17 23:44:37.880310 kernel: loop: module loaded
Apr 17 23:44:37.880326 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Apr 17 23:44:37.880385 systemd-journald[1316]: Collecting audit messages is disabled.
Apr 17 23:44:37.880427 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Apr 17 23:44:37.880447 systemd-journald[1316]: Journal started
Apr 17 23:44:37.880503 systemd-journald[1316]: Runtime Journal (/run/log/journal/ed549fbb01794e82bf13526849e02ee9) is 8.0M, max 158.7M, 150.7M free.
Apr 17 23:44:37.896730 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Apr 17 23:44:37.905513 systemd[1]: Started systemd-journald.service - Journal Service.
Apr 17 23:44:37.909755 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Apr 17 23:44:37.913131 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Apr 17 23:44:37.918014 systemd[1]: Mounted media.mount - External Media Directory.
Apr 17 23:44:37.926522 kernel: ACPI: bus type drm_connector registered
Apr 17 23:44:37.923940 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Apr 17 23:44:37.927419 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Apr 17 23:44:37.930961 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Apr 17 23:44:37.934195 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Apr 17 23:44:37.938048 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Apr 17 23:44:37.942000 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Apr 17 23:44:37.942222 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Apr 17 23:44:37.945831 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Apr 17 23:44:37.946034 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Apr 17 23:44:37.950676 systemd[1]: modprobe@drm.service: Deactivated successfully.
Apr 17 23:44:37.950896 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Apr 17 23:44:37.954602 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Apr 17 23:44:37.954836 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Apr 17 23:44:37.958895 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Apr 17 23:44:37.959132 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Apr 17 23:44:37.963205 systemd[1]: modprobe@loop.service: Deactivated successfully.
Apr 17 23:44:37.963437 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Apr 17 23:44:37.967417 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Apr 17 23:44:37.971283 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Apr 17 23:44:37.976177 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Apr 17 23:44:37.996228 systemd[1]: Reached target network-pre.target - Preparation for Network.
Apr 17 23:44:38.005653 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Apr 17 23:44:38.014627 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Apr 17 23:44:38.018255 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Apr 17 23:44:38.030733 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Apr 17 23:44:38.044690 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Apr 17 23:44:38.048244 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Apr 17 23:44:38.054701 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Apr 17 23:44:38.063673 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Apr 17 23:44:38.070678 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Apr 17 23:44:38.080650 systemd-journald[1316]: Time spent on flushing to /var/log/journal/ed549fbb01794e82bf13526849e02ee9 is 21.635ms for 944 entries.
Apr 17 23:44:38.080650 systemd-journald[1316]: System Journal (/var/log/journal/ed549fbb01794e82bf13526849e02ee9) is 8.0M, max 2.6G, 2.6G free.
Apr 17 23:44:38.125729 systemd-journald[1316]: Received client request to flush runtime journal.
Apr 17 23:44:38.076636 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Apr 17 23:44:38.088819 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Apr 17 23:44:38.093915 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Apr 17 23:44:38.099118 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Apr 17 23:44:38.106335 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Apr 17 23:44:38.113354 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Apr 17 23:44:38.128017 systemd[1]: Starting systemd-udev-settle.service - Wait for udev To Complete Device Initialization...
Apr 17 23:44:38.133677 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Apr 17 23:44:38.147335 udevadm[1370]: systemd-udev-settle.service is deprecated. Please fix lvm2-activation.service, lvm2-activation-early.service not to pull it in.
Apr 17 23:44:38.230147 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Apr 17 23:44:38.243797 systemd-tmpfiles[1362]: ACLs are not supported, ignoring.
Apr 17 23:44:38.243824 systemd-tmpfiles[1362]: ACLs are not supported, ignoring.
Apr 17 23:44:38.251414 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Apr 17 23:44:38.265766 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Apr 17 23:44:38.371209 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Apr 17 23:44:38.383663 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Apr 17 23:44:38.399608 systemd-tmpfiles[1383]: ACLs are not supported, ignoring.
Apr 17 23:44:38.399632 systemd-tmpfiles[1383]: ACLs are not supported, ignoring.
Apr 17 23:44:38.404546 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Apr 17 23:44:39.379547 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Apr 17 23:44:39.389665 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Apr 17 23:44:39.414644 systemd-udevd[1389]: Using default interface naming scheme 'v255'.
Apr 17 23:44:39.577413 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Apr 17 23:44:39.593669 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Apr 17 23:44:39.644372 systemd[1]: Found device dev-ttyS0.device - /dev/ttyS0.
Apr 17 23:44:39.665681 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Apr 17 23:44:39.767947 kernel: mousedev: PS/2 mouse device common for all mice
Apr 17 23:44:39.761856 systemd[1]: Started systemd-userdbd.service - User Database Manager.
Apr 17 23:44:39.817511 kernel: hv_vmbus: registering driver hv_balloon
Apr 17 23:44:39.833197 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#270 cmd 0x85 status: scsi 0x2 srb 0x6 hv 0xc0000001
Apr 17 23:44:39.833439 kernel: hv_balloon: Using Dynamic Memory protocol version 2.0
Apr 17 23:44:39.858544 kernel: hv_vmbus: registering driver hyperv_fb
Apr 17 23:44:39.874576 kernel: hyperv_fb: Synthvid Version major 3, minor 5
Apr 17 23:44:39.882514 kernel: hyperv_fb: Screen resolution: 1024x768, Color depth: 32, Frame buffer size: 8388608
Apr 17 23:44:39.886554 kernel: Console: switching to colour dummy device 80x25
Apr 17 23:44:39.892693 kernel: Console: switching to colour frame buffer device 128x48
Apr 17 23:44:39.897916 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Apr 17 23:44:39.905906 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Apr 17 23:44:39.906217 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Apr 17 23:44:39.934215 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Apr 17 23:44:40.065185 systemd-networkd[1395]: lo: Link UP
Apr 17 23:44:40.067705 systemd-networkd[1395]: lo: Gained carrier
Apr 17 23:44:40.070285 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Apr 17 23:44:40.070640 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Apr 17 23:44:40.071150 systemd-networkd[1395]: Enumeration completed
Apr 17 23:44:40.071598 systemd-networkd[1395]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Apr 17 23:44:40.071689 systemd-networkd[1395]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Apr 17 23:44:40.075166 systemd[1]: Started systemd-networkd.service - Network Configuration.
Apr 17 23:44:40.086831 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
Apr 17 23:44:40.103594 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Apr 17 23:44:40.157524 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 32 scanned by (udev-worker) (1403)
Apr 17 23:44:40.161513 kernel: mlx5_core d954:00:02.0 enP55636s1: Link up
Apr 17 23:44:40.179517 kernel: hv_netvsc 000d3ada-bb1d-000d-3ada-bb1d000d3ada eth0: Data path switched to VF: enP55636s1
Apr 17 23:44:40.181434 systemd-networkd[1395]: enP55636s1: Link UP
Apr 17 23:44:40.181728 systemd-networkd[1395]: eth0: Link UP
Apr 17 23:44:40.181743 systemd-networkd[1395]: eth0: Gained carrier
Apr 17 23:44:40.181767 systemd-networkd[1395]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Apr 17 23:44:40.185820 systemd-networkd[1395]: enP55636s1: Gained carrier
Apr 17 23:44:40.223567 systemd-networkd[1395]: eth0: DHCPv4 address 10.0.0.10/24, gateway 10.0.0.1 acquired from 168.63.129.16
Apr 17 23:44:40.291972 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_Disk OEM.
Apr 17 23:44:40.296063 kernel: kvm_intel: Using Hyper-V Enlightened VMCS
Apr 17 23:44:40.340123 systemd[1]: Finished systemd-udev-settle.service - Wait for udev To Complete Device Initialization.
Apr 17 23:44:40.350721 systemd[1]: Starting lvm2-activation-early.service - Activation of LVM2 logical volumes...
Apr 17 23:44:40.441682 lvm[1483]: WARNING: Failed to connect to lvmetad. Falling back to device scanning.
Apr 17 23:44:40.481825 systemd[1]: Finished lvm2-activation-early.service - Activation of LVM2 logical volumes.
Apr 17 23:44:40.486914 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Apr 17 23:44:40.495745 systemd[1]: Starting lvm2-activation.service - Activation of LVM2 logical volumes...
Apr 17 23:44:40.503401 lvm[1486]: WARNING: Failed to connect to lvmetad. Falling back to device scanning.
Apr 17 23:44:40.529357 systemd[1]: Finished lvm2-activation.service - Activation of LVM2 logical volumes.
Apr 17 23:44:40.533737 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
Apr 17 23:44:40.538121 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Apr 17 23:44:40.538163 systemd[1]: Reached target local-fs.target - Local File Systems.
Apr 17 23:44:40.541808 systemd[1]: Reached target machines.target - Containers.
Apr 17 23:44:40.545909 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management (Varlink).
Apr 17 23:44:40.555665 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
Apr 17 23:44:40.562722 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Apr 17 23:44:40.566458 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Apr 17 23:44:40.569919 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM...
Apr 17 23:44:40.581720 systemd[1]: Starting systemd-machine-id-commit.service - Commit a transient machine-id on disk...
Apr 17 23:44:40.586699 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Apr 17 23:44:40.591130 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Apr 17 23:44:40.598675 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown.
Apr 17 23:44:40.695648 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Apr 17 23:44:40.696699 systemd[1]: Finished systemd-machine-id-commit.service - Commit a transient machine-id on disk.
Apr 17 23:44:40.719816 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM.
Apr 17 23:44:40.720499 kernel: loop0: detected capacity change from 0 to 31056
Apr 17 23:44:41.156612 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
Apr 17 23:44:41.216505 kernel: loop1: detected capacity change from 0 to 142488
Apr 17 23:44:41.427634 systemd-networkd[1395]: eth0: Gained IPv6LL
Apr 17 23:44:41.430879 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured.
Apr 17 23:44:41.658614 kernel: loop2: detected capacity change from 0 to 140768
Apr 17 23:44:42.109507 kernel: loop3: detected capacity change from 0 to 228704
Apr 17 23:44:42.142506 kernel: loop4: detected capacity change from 0 to 31056
Apr 17 23:44:42.154504 kernel: loop5: detected capacity change from 0 to 142488
Apr 17 23:44:42.174535 kernel: loop6: detected capacity change from 0 to 140768
Apr 17 23:44:42.192503 kernel: loop7: detected capacity change from 0 to 228704
Apr 17 23:44:42.204246 (sd-merge)[1514]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-azure'.
Apr 17 23:44:42.204886 (sd-merge)[1514]: Merged extensions into '/usr'.
Apr 17 23:44:42.208516 systemd[1]: Reloading requested from client PID 1496 ('systemd-sysext') (unit systemd-sysext.service)...
Apr 17 23:44:42.208534 systemd[1]: Reloading...
Apr 17 23:44:42.267552 zram_generator::config[1538]: No configuration found.
Apr 17 23:44:42.434718 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Apr 17 23:44:42.506638 systemd[1]: Reloading finished in 297 ms.
Apr 17 23:44:42.520939 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Apr 17 23:44:42.534651 systemd[1]: Starting ensure-sysext.service...
Apr 17 23:44:42.541655 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Apr 17 23:44:42.549453 systemd[1]: Reloading requested from client PID 1605 ('systemctl') (unit ensure-sysext.service)...
Apr 17 23:44:42.549633 systemd[1]: Reloading...
Apr 17 23:44:42.569657 systemd-tmpfiles[1606]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Apr 17 23:44:42.570161 systemd-tmpfiles[1606]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Apr 17 23:44:42.571598 systemd-tmpfiles[1606]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Apr 17 23:44:42.572043 systemd-tmpfiles[1606]: ACLs are not supported, ignoring.
Apr 17 23:44:42.572137 systemd-tmpfiles[1606]: ACLs are not supported, ignoring.
Apr 17 23:44:42.577339 systemd-tmpfiles[1606]: Detected autofs mount point /boot during canonicalization of boot.
Apr 17 23:44:42.577353 systemd-tmpfiles[1606]: Skipping /boot
Apr 17 23:44:42.589638 systemd-tmpfiles[1606]: Detected autofs mount point /boot during canonicalization of boot.
Apr 17 23:44:42.589655 systemd-tmpfiles[1606]: Skipping /boot
Apr 17 23:44:42.679542 zram_generator::config[1636]: No configuration found.
Apr 17 23:44:42.833810 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Apr 17 23:44:42.911619 systemd[1]: Reloading finished in 361 ms.
Apr 17 23:44:42.935194 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Apr 17 23:44:42.949021 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Apr 17 23:44:42.950572 systemd[1]: Starting audit-rules.service - Load Security Auditing Rules...
Apr 17 23:44:42.963139 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Apr 17 23:44:42.967364 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Apr 17 23:44:42.970764 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Apr 17 23:44:42.982831 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Apr 17 23:44:42.999897 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Apr 17 23:44:43.003421 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Apr 17 23:44:43.006359 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Apr 17 23:44:43.016799 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Apr 17 23:44:43.024630 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Apr 17 23:44:43.028821 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Apr 17 23:44:43.033307 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Apr 17 23:44:43.033573 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Apr 17 23:44:43.042826 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Apr 17 23:44:43.043068 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Apr 17 23:44:43.048367 systemd[1]: modprobe@loop.service: Deactivated successfully.
Apr 17 23:44:43.051744 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Apr 17 23:44:43.070000 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Apr 17 23:44:43.070371 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Apr 17 23:44:43.075641 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Apr 17 23:44:43.085669 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Apr 17 23:44:43.103229 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Apr 17 23:44:43.108069 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Apr 17 23:44:43.108265 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Apr 17 23:44:43.112163 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
Apr 17 23:44:43.118442 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Apr 17 23:44:43.118682 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Apr 17 23:44:43.128037 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Apr 17 23:44:43.128268 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Apr 17 23:44:43.146143 systemd[1]: modprobe@loop.service: Deactivated successfully.
Apr 17 23:44:43.146380 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Apr 17 23:44:43.165110 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Apr 17 23:44:43.166554 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Apr 17 23:44:43.177668 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Apr 17 23:44:43.194682 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Apr 17 23:44:43.203974 augenrules[1754]: No rules
Apr 17 23:44:43.204642 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Apr 17 23:44:43.208938 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Apr 17 23:44:43.209040 systemd[1]: Reached target time-set.target - System Time Set.
Apr 17 23:44:43.217172 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Apr 17 23:44:43.219393 systemd[1]: Finished ensure-sysext.service.
Apr 17 23:44:43.227248 systemd[1]: Finished audit-rules.service - Load Security Auditing Rules.
Apr 17 23:44:43.227549 systemd-resolved[1722]: Positive Trust Anchors:
Apr 17 23:44:43.227558 systemd-resolved[1722]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Apr 17 23:44:43.227596 systemd-resolved[1722]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Apr 17 23:44:43.231376 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
Apr 17 23:44:43.235875 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Apr 17 23:44:43.236077 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Apr 17 23:44:43.240073 systemd[1]: modprobe@drm.service: Deactivated successfully.
Apr 17 23:44:43.240273 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Apr 17 23:44:43.244186 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Apr 17 23:44:43.246861 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Apr 17 23:44:43.248364 systemd-resolved[1722]: Using system hostname 'ci-4081.3.6-n-7b570e9a3c'.
Apr 17 23:44:43.251504 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Apr 17 23:44:43.260897 systemd[1]: Reached target network.target - Network.
Apr 17 23:44:43.264105 systemd[1]: Reached target network-online.target - Network is Online.
Apr 17 23:44:43.267658 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Apr 17 23:44:43.271473 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Apr 17 23:44:43.271579 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Apr 17 23:44:43.562119 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
Apr 17 23:44:43.566710 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Apr 17 23:44:47.190445 ldconfig[1492]: /sbin/ldconfig: /lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
Apr 17 23:44:47.202242 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
Apr 17 23:44:47.214692 systemd[1]: Starting systemd-update-done.service - Update is Completed...
Apr 17 23:44:47.227803 systemd[1]: Finished systemd-update-done.service - Update is Completed.
Apr 17 23:44:47.232071 systemd[1]: Reached target sysinit.target - System Initialization.
Apr 17 23:44:47.236585 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
Apr 17 23:44:47.240647 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
Apr 17 23:44:47.244850 systemd[1]: Started logrotate.timer - Daily rotation of log files.
Apr 17 23:44:47.248387 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
Apr 17 23:44:47.252356 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
Apr 17 23:44:47.256502 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
Apr 17 23:44:47.256558 systemd[1]: Reached target paths.target - Path Units.
Apr 17 23:44:47.260229 systemd[1]: Reached target timers.target - Timer Units.
Apr 17 23:44:47.263661 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket.
Apr 17 23:44:47.268468 systemd[1]: Starting docker.socket - Docker Socket for the API...
Apr 17 23:44:47.272887 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket.
Apr 17 23:44:47.277442 systemd[1]: Listening on docker.socket - Docker Socket for the API.
Apr 17 23:44:47.281018 systemd[1]: Reached target sockets.target - Socket Units.
Apr 17 23:44:47.284121 systemd[1]: Reached target basic.target - Basic System.
Apr 17 23:44:47.287352 systemd[1]: System is tainted: cgroupsv1
Apr 17 23:44:47.287425 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met.
Apr 17 23:44:47.287467 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met.
Apr 17 23:44:47.291607 systemd[1]: Starting chronyd.service - NTP client/server...
Apr 17 23:44:47.303642 systemd[1]: Starting containerd.service - containerd container runtime...
Apr 17 23:44:47.321662 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent...
Apr 17 23:44:47.328667 systemd[1]: Starting dbus.service - D-Bus System Message Bus...
Apr 17 23:44:47.344696 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit...
Apr 17 23:44:47.350935 systemd[1]: Starting extend-filesystems.service - Extend Filesystems...
Apr 17 23:44:47.354323 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment).
Apr 17 23:44:47.354388 systemd[1]: hv_fcopy_daemon.service - Hyper-V FCOPY daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/vmbus/hv_fcopy).
Apr 17 23:44:47.361715 systemd[1]: Started hv_kvp_daemon.service - Hyper-V KVP daemon.
Apr 17 23:44:47.365363 systemd[1]: hv_vss_daemon.service - Hyper-V VSS daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/vmbus/hv_vss).
Apr 17 23:44:47.367219 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Apr 17 23:44:47.369335 (chronyd)[1779]: chronyd.service: Referenced but unset environment variable evaluates to an empty string: OPTIONS
Apr 17 23:44:47.374642 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd...
Apr 17 23:44:47.395685 systemd[1]: Starting nvidia.service - NVIDIA Configure Service...
Apr 17 23:44:47.400449 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin...
Apr 17 23:44:47.411767 chronyd[1797]: chronyd version 4.5 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER -SIGND +ASYNCDNS +NTS +SECHASH +IPV6 -DEBUG)
Apr 17 23:44:47.412838 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline...
Apr 17 23:44:47.419729 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys...
Apr 17 23:44:47.432883 chronyd[1797]: Timezone right/UTC failed leap second check, ignoring
Apr 17 23:44:47.446829 jq[1786]: false
Apr 17 23:44:47.434355 systemd[1]: Starting systemd-logind.service - User Login Management...
Apr 17 23:44:47.433250 chronyd[1797]: Loaded seccomp filter (level 2)
Apr 17 23:44:47.442200 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0).
Apr 17 23:44:47.451742 systemd[1]: Starting update-engine.service - Update Engine...
Apr 17 23:44:47.461609 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition...
Apr 17 23:44:47.470137 jq[1810]: true
Apr 17 23:44:47.477094 systemd[1]: Started chronyd.service - NTP client/server.
Apr 17 23:44:47.490534 KVP[1788]: KVP starting; pid is:1788
Apr 17 23:44:47.497725 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'.
Apr 17 23:44:47.505835 extend-filesystems[1787]: Found loop4
Apr 17 23:44:47.505835 extend-filesystems[1787]: Found loop5
Apr 17 23:44:47.505835 extend-filesystems[1787]: Found loop6
Apr 17 23:44:47.505835 extend-filesystems[1787]: Found loop7
Apr 17 23:44:47.505835 extend-filesystems[1787]: Found sda
Apr 17 23:44:47.505835 extend-filesystems[1787]: Found sda1
Apr 17 23:44:47.505835 extend-filesystems[1787]: Found sda2
Apr 17 23:44:47.505835 extend-filesystems[1787]: Found sda3
Apr 17 23:44:47.505835 extend-filesystems[1787]: Found usr
Apr 17 23:44:47.505835 extend-filesystems[1787]: Found sda4
Apr 17 23:44:47.505835 extend-filesystems[1787]: Found sda6
Apr 17 23:44:47.505835 extend-filesystems[1787]: Found sda7
Apr 17 23:44:47.505835 extend-filesystems[1787]: Found sda9
Apr 17 23:44:47.505835 extend-filesystems[1787]: Checking size of /dev/sda9
Apr 17 23:44:47.700455 kernel: hv_utils: KVP IC version 4.0
Apr 17 23:44:47.498036 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped.
Apr 17 23:44:47.700997 update_engine[1805]: I20260417 23:44:47.629750 1805 main.cc:92] Flatcar Update Engine starting
Apr 17 23:44:47.700997 update_engine[1805]: I20260417 23:44:47.668790 1805 update_check_scheduler.cc:74] Next update check in 4m0s
Apr 17 23:44:47.554722 KVP[1788]: KVP LIC Version: 3.1
Apr 17 23:44:47.701436 extend-filesystems[1787]: Old size kept for /dev/sda9
Apr 17 23:44:47.701436 extend-filesystems[1787]: Found sr0
Apr 17 23:44:47.501426 systemd[1]: motdgen.service: Deactivated successfully.
Apr 17 23:44:47.642462 dbus-daemon[1783]: [system] SELinux support is enabled
Apr 17 23:44:47.501777 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd.
Apr 17 23:44:47.695923 dbus-daemon[1783]: [system] Successfully activated service 'org.freedesktop.systemd1'
Apr 17 23:44:47.513651 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully.
Apr 17 23:44:47.734191 jq[1824]: true
Apr 17 23:44:47.513870 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline.
Apr 17 23:44:47.735746 tar[1821]: linux-amd64/LICENSE
Apr 17 23:44:47.735746 tar[1821]: linux-amd64/helm
Apr 17 23:44:47.538613 systemd[1]: Finished nvidia.service - NVIDIA Configure Service.
Apr 17 23:44:47.585856 (ntainerd)[1834]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR
Apr 17 23:44:47.598664 systemd[1]: extend-filesystems.service: Deactivated successfully.
Apr 17 23:44:47.599006 systemd[1]: Finished extend-filesystems.service - Extend Filesystems.
Apr 17 23:44:47.643069 systemd[1]: Started dbus.service - D-Bus System Message Bus.
Apr 17 23:44:47.659338 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml).
Apr 17 23:44:47.659378 systemd[1]: Reached target system-config.target - Load system-provided cloud configs.
Apr 17 23:44:47.666458 systemd-logind[1800]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard)
Apr 17 23:44:47.667300 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url).
Apr 17 23:44:47.667323 systemd[1]: Reached target user-config.target - Load user-provided cloud configs.
Apr 17 23:44:47.668695 systemd-logind[1800]: New seat seat0.
Apr 17 23:44:47.682347 systemd[1]: Started systemd-logind.service - User Login Management.
Apr 17 23:44:47.695688 systemd[1]: Started update-engine.service - Update Engine.
Apr 17 23:44:47.712699 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details.
Apr 17 23:44:47.721353 systemd[1]: Started locksmithd.service - Cluster reboot manager.
Apr 17 23:44:47.799744 bash[1870]: Updated "/home/core/.ssh/authorized_keys"
Apr 17 23:44:47.780953 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition.
Apr 17 23:44:47.787211 systemd[1]: sshkeys.service was skipped because no trigger condition checks were met.
Apr 17 23:44:47.842863 coreos-metadata[1782]: Apr 17 23:44:47.842 INFO Fetching http://168.63.129.16/?comp=versions: Attempt #1
Apr 17 23:44:47.845880 coreos-metadata[1782]: Apr 17 23:44:47.845 INFO Fetch successful
Apr 17 23:44:47.847734 coreos-metadata[1782]: Apr 17 23:44:47.847 INFO Fetching http://168.63.129.16/machine/?comp=goalstate: Attempt #1
Apr 17 23:44:47.854021 coreos-metadata[1782]: Apr 17 23:44:47.853 INFO Fetch successful
Apr 17 23:44:47.854021 coreos-metadata[1782]: Apr 17 23:44:47.853 INFO Fetching http://168.63.129.16/machine/fc7ed9b6-9dee-45ab-9afc-0a81a3d18b48/10fa169a%2D867d%2D4567%2Da0c6%2Da0c8d871db37.%5Fci%2D4081.3.6%2Dn%2D7b570e9a3c?comp=config&type=sharedConfig&incarnation=1: Attempt #1
Apr 17 23:44:47.857612 coreos-metadata[1782]: Apr 17 23:44:47.857 INFO Fetch successful
Apr 17 23:44:47.858065 coreos-metadata[1782]: Apr 17 23:44:47.858 INFO Fetching http://169.254.169.254/metadata/instance/compute/vmSize?api-version=2017-08-01&format=text: Attempt #1
Apr 17 23:44:47.871958 coreos-metadata[1782]: Apr 17 23:44:47.869 INFO Fetch successful
Apr 17 23:44:47.946929 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent.
Apr 17 23:44:47.962539 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 32 scanned by (udev-worker) (1862)
Apr 17 23:44:47.955265 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met.
Apr 17 23:44:48.171536 sshd_keygen[1829]: ssh-keygen: generating new host keys: RSA ECDSA ED25519
Apr 17 23:44:48.195214 locksmithd[1872]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot"
Apr 17 23:44:48.231799 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys.
Apr 17 23:44:48.244877 systemd[1]: Starting issuegen.service - Generate /run/issue...
Apr 17 23:44:48.258680 systemd[1]: Starting waagent.service - Microsoft Azure Linux Agent...
Apr 17 23:44:48.290850 systemd[1]: issuegen.service: Deactivated successfully.
Apr 17 23:44:48.296034 systemd[1]: Finished issuegen.service - Generate /run/issue.
Apr 17 23:44:48.305055 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions...
Apr 17 23:44:48.319703 systemd[1]: Started waagent.service - Microsoft Azure Linux Agent.
Apr 17 23:44:48.328690 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions.
Apr 17 23:44:48.343856 systemd[1]: Started getty@tty1.service - Getty on tty1.
Apr 17 23:44:48.351986 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0.
Apr 17 23:44:48.359535 systemd[1]: Reached target getty.target - Login Prompts.
Apr 17 23:44:48.612913 tar[1821]: linux-amd64/README.md
Apr 17 23:44:48.631891 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin.
Apr 17 23:44:48.967790 containerd[1834]: time="2026-04-17T23:44:48.967655200Z" level=info msg="starting containerd" revision=174e0d1785eeda18dc2beba45e1d5a188771636b version=v1.7.21
Apr 17 23:44:49.002456 containerd[1834]: time="2026-04-17T23:44:49.002404500Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.aufs\"..." type=io.containerd.snapshotter.v1
Apr 17 23:44:49.008585 containerd[1834]: time="2026-04-17T23:44:49.008538600Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.aufs\"..." error="aufs is not supported (modprobe aufs failed: exit status 1 \"modprobe: FATAL: Module aufs not found in directory /lib/modules/6.6.127-flatcar\\n\"): skip plugin" type=io.containerd.snapshotter.v1
Apr 17 23:44:49.010143 containerd[1834]: time="2026-04-17T23:44:49.010118700Z" level=info msg="loading plugin \"io.containerd.event.v1.exchange\"..." type=io.containerd.event.v1
Apr 17 23:44:49.010249 containerd[1834]: time="2026-04-17T23:44:49.010234400Z" level=info msg="loading plugin \"io.containerd.internal.v1.opt\"..." type=io.containerd.internal.v1
Apr 17 23:44:49.010548 containerd[1834]: time="2026-04-17T23:44:49.010524600Z" level=info msg="loading plugin \"io.containerd.warning.v1.deprecations\"..." type=io.containerd.warning.v1
Apr 17 23:44:49.010636 containerd[1834]: time="2026-04-17T23:44:49.010622500Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." type=io.containerd.snapshotter.v1
Apr 17 23:44:49.012616 containerd[1834]: time="2026-04-17T23:44:49.012573500Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." error="no scratch file generator: skip plugin" type=io.containerd.snapshotter.v1
Apr 17 23:44:49.012616 containerd[1834]: time="2026-04-17T23:44:49.012604000Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." type=io.containerd.snapshotter.v1
Apr 17 23:44:49.012902 containerd[1834]: time="2026-04-17T23:44:49.012870800Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1
Apr 17 23:44:49.012902 containerd[1834]: time="2026-04-17T23:44:49.012895700Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." type=io.containerd.snapshotter.v1
Apr 17 23:44:49.012990 containerd[1834]: time="2026-04-17T23:44:49.012915000Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." error="devmapper not configured: skip plugin" type=io.containerd.snapshotter.v1
Apr 17 23:44:49.012990 containerd[1834]: time="2026-04-17T23:44:49.012937000Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.native\"..." type=io.containerd.snapshotter.v1
Apr 17 23:44:49.013077 containerd[1834]: time="2026-04-17T23:44:49.013056000Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.overlayfs\"..." type=io.containerd.snapshotter.v1
Apr 17 23:44:49.013313 containerd[1834]: time="2026-04-17T23:44:49.013279400Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.zfs\"..." type=io.containerd.snapshotter.v1
Apr 17 23:44:49.013700 containerd[1834]: time="2026-04-17T23:44:49.013539300Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.zfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.zfs must be a zfs filesystem to be used with the zfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1
Apr 17 23:44:49.013700 containerd[1834]: time="2026-04-17T23:44:49.013565700Z" level=info msg="loading plugin \"io.containerd.content.v1.content\"..." type=io.containerd.content.v1
Apr 17 23:44:49.013700 containerd[1834]: time="2026-04-17T23:44:49.013667000Z" level=info msg="loading plugin \"io.containerd.metadata.v1.bolt\"..." type=io.containerd.metadata.v1
Apr 17 23:44:49.013830 containerd[1834]: time="2026-04-17T23:44:49.013717500Z" level=info msg="metadata content store policy set" policy=shared
Apr 17 23:44:49.032629 containerd[1834]: time="2026-04-17T23:44:49.030139600Z" level=info msg="loading plugin \"io.containerd.gc.v1.scheduler\"..." type=io.containerd.gc.v1
Apr 17 23:44:49.032629 containerd[1834]: time="2026-04-17T23:44:49.030209000Z" level=info msg="loading plugin \"io.containerd.differ.v1.walking\"..." type=io.containerd.differ.v1
Apr 17 23:44:49.032629 containerd[1834]: time="2026-04-17T23:44:49.030253900Z" level=info msg="loading plugin \"io.containerd.lease.v1.manager\"..." type=io.containerd.lease.v1
Apr 17 23:44:49.032629 containerd[1834]: time="2026-04-17T23:44:49.030277400Z" level=info msg="loading plugin \"io.containerd.streaming.v1.manager\"..." type=io.containerd.streaming.v1
Apr 17 23:44:49.032629 containerd[1834]: time="2026-04-17T23:44:49.030297400Z" level=info msg="loading plugin \"io.containerd.runtime.v1.linux\"..." type=io.containerd.runtime.v1
Apr 17 23:44:49.032629 containerd[1834]: time="2026-04-17T23:44:49.030457900Z" level=info msg="loading plugin \"io.containerd.monitor.v1.cgroups\"..." type=io.containerd.monitor.v1
Apr 17 23:44:49.032629 containerd[1834]: time="2026-04-17T23:44:49.030934900Z" level=info msg="loading plugin \"io.containerd.runtime.v2.task\"..." type=io.containerd.runtime.v2
Apr 17 23:44:49.032629 containerd[1834]: time="2026-04-17T23:44:49.031085400Z" level=info msg="loading plugin \"io.containerd.runtime.v2.shim\"..." type=io.containerd.runtime.v2
Apr 17 23:44:49.032629 containerd[1834]: time="2026-04-17T23:44:49.031127400Z" level=info msg="loading plugin \"io.containerd.sandbox.store.v1.local\"..." type=io.containerd.sandbox.store.v1
Apr 17 23:44:49.032629 containerd[1834]: time="2026-04-17T23:44:49.031149400Z" level=info msg="loading plugin \"io.containerd.sandbox.controller.v1.local\"..." type=io.containerd.sandbox.controller.v1
Apr 17 23:44:49.032629 containerd[1834]: time="2026-04-17T23:44:49.031169100Z" level=info msg="loading plugin \"io.containerd.service.v1.containers-service\"..." type=io.containerd.service.v1
Apr 17 23:44:49.032629 containerd[1834]: time="2026-04-17T23:44:49.031187100Z" level=info msg="loading plugin \"io.containerd.service.v1.content-service\"..." type=io.containerd.service.v1
Apr 17 23:44:49.032629 containerd[1834]: time="2026-04-17T23:44:49.031211900Z" level=info msg="loading plugin \"io.containerd.service.v1.diff-service\"..." type=io.containerd.service.v1
Apr 17 23:44:49.032629 containerd[1834]: time="2026-04-17T23:44:49.031237400Z" level=info msg="loading plugin \"io.containerd.service.v1.images-service\"..." type=io.containerd.service.v1
Apr 17 23:44:49.033155 containerd[1834]: time="2026-04-17T23:44:49.031260300Z" level=info msg="loading plugin \"io.containerd.service.v1.introspection-service\"..." type=io.containerd.service.v1
Apr 17 23:44:49.033155 containerd[1834]: time="2026-04-17T23:44:49.031278200Z" level=info msg="loading plugin \"io.containerd.service.v1.namespaces-service\"..." type=io.containerd.service.v1
Apr 17 23:44:49.033155 containerd[1834]: time="2026-04-17T23:44:49.031295700Z" level=info msg="loading plugin \"io.containerd.service.v1.snapshots-service\"..." type=io.containerd.service.v1
Apr 17 23:44:49.033155 containerd[1834]: time="2026-04-17T23:44:49.031314900Z" level=info msg="loading plugin \"io.containerd.service.v1.tasks-service\"..." type=io.containerd.service.v1
Apr 17 23:44:49.033155 containerd[1834]: time="2026-04-17T23:44:49.031342900Z" level=info msg="loading plugin \"io.containerd.grpc.v1.containers\"..." type=io.containerd.grpc.v1
Apr 17 23:44:49.033155 containerd[1834]: time="2026-04-17T23:44:49.031365000Z" level=info msg="loading plugin \"io.containerd.grpc.v1.content\"..." type=io.containerd.grpc.v1
Apr 17 23:44:49.033155 containerd[1834]: time="2026-04-17T23:44:49.031383000Z" level=info msg="loading plugin \"io.containerd.grpc.v1.diff\"..." type=io.containerd.grpc.v1
Apr 17 23:44:49.033155 containerd[1834]: time="2026-04-17T23:44:49.031401800Z" level=info msg="loading plugin \"io.containerd.grpc.v1.events\"..." type=io.containerd.grpc.v1
Apr 17 23:44:49.033155 containerd[1834]: time="2026-04-17T23:44:49.031420000Z" level=info msg="loading plugin \"io.containerd.grpc.v1.images\"..." type=io.containerd.grpc.v1
Apr 17 23:44:49.033155 containerd[1834]: time="2026-04-17T23:44:49.031437000Z" level=info msg="loading plugin \"io.containerd.grpc.v1.introspection\"..." type=io.containerd.grpc.v1
Apr 17 23:44:49.033155 containerd[1834]: time="2026-04-17T23:44:49.031462600Z" level=info msg="loading plugin \"io.containerd.grpc.v1.leases\"..." type=io.containerd.grpc.v1
Apr 17 23:44:49.033155 containerd[1834]: time="2026-04-17T23:44:49.031505900Z" level=info msg="loading plugin \"io.containerd.grpc.v1.namespaces\"..." type=io.containerd.grpc.v1
Apr 17 23:44:49.033155 containerd[1834]: time="2026-04-17T23:44:49.031526500Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandbox-controllers\"..." type=io.containerd.grpc.v1
Apr 17 23:44:49.033155 containerd[1834]: time="2026-04-17T23:44:49.031547400Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandboxes\"..." type=io.containerd.grpc.v1
Apr 17 23:44:49.033716 containerd[1834]: time="2026-04-17T23:44:49.031570200Z" level=info msg="loading plugin \"io.containerd.grpc.v1.snapshots\"..." type=io.containerd.grpc.v1
Apr 17 23:44:49.033716 containerd[1834]: time="2026-04-17T23:44:49.031588300Z" level=info msg="loading plugin \"io.containerd.grpc.v1.streaming\"..." type=io.containerd.grpc.v1
Apr 17 23:44:49.033716 containerd[1834]: time="2026-04-17T23:44:49.031607800Z" level=info msg="loading plugin \"io.containerd.grpc.v1.tasks\"..." type=io.containerd.grpc.v1
Apr 17 23:44:49.033716 containerd[1834]: time="2026-04-17T23:44:49.031632300Z" level=info msg="loading plugin \"io.containerd.transfer.v1.local\"..." type=io.containerd.transfer.v1
Apr 17 23:44:49.033716 containerd[1834]: time="2026-04-17T23:44:49.031662900Z" level=info msg="loading plugin \"io.containerd.grpc.v1.transfer\"..." type=io.containerd.grpc.v1
Apr 17 23:44:49.033716 containerd[1834]: time="2026-04-17T23:44:49.031681600Z" level=info msg="loading plugin \"io.containerd.grpc.v1.version\"..." type=io.containerd.grpc.v1
Apr 17 23:44:49.033716 containerd[1834]: time="2026-04-17T23:44:49.031697700Z" level=info msg="loading plugin \"io.containerd.internal.v1.restart\"..." type=io.containerd.internal.v1
Apr 17 23:44:49.033716 containerd[1834]: time="2026-04-17T23:44:49.031747000Z" level=info msg="loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." type=io.containerd.tracing.processor.v1
Apr 17 23:44:49.033716 containerd[1834]: time="2026-04-17T23:44:49.031770300Z" level=info msg="skip loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.tracing.processor.v1
Apr 17 23:44:49.033716 containerd[1834]: time="2026-04-17T23:44:49.031786800Z" level=info msg="loading plugin \"io.containerd.internal.v1.tracing\"..." type=io.containerd.internal.v1
Apr 17 23:44:49.033716 containerd[1834]: time="2026-04-17T23:44:49.031803900Z" level=info msg="skip loading plugin \"io.containerd.internal.v1.tracing\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.internal.v1
Apr 17 23:44:49.033716 containerd[1834]: time="2026-04-17T23:44:49.031817400Z" level=info msg="loading plugin \"io.containerd.grpc.v1.healthcheck\"..." type=io.containerd.grpc.v1
Apr 17 23:44:49.033716 containerd[1834]: time="2026-04-17T23:44:49.031833600Z" level=info msg="loading plugin \"io.containerd.nri.v1.nri\"..." type=io.containerd.nri.v1
Apr 17 23:44:49.033716 containerd[1834]: time="2026-04-17T23:44:49.031846300Z" level=info msg="NRI interface is disabled by configuration."
Apr 17 23:44:49.034189 containerd[1834]: time="2026-04-17T23:44:49.031860200Z" level=info msg="loading plugin \"io.containerd.grpc.v1.cri\"..." type=io.containerd.grpc.v1
Apr 17 23:44:49.034233 containerd[1834]: time="2026-04-17T23:44:49.032239500Z" level=info msg="Start cri plugin with config {PluginConfig:{ContainerdConfig:{Snapshotter:overlayfs DefaultRuntimeName:runc DefaultRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} UntrustedWorkloadRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} Runtimes:map[runc:{Type:io.containerd.runc.v2 Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[SystemdCgroup:false] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:podsandbox}] NoPivot:false DisableSnapshotAnnotations:true DiscardUnpackedLayers:false IgnoreBlockIONotEnabledErrors:false IgnoreRdtNotEnabledErrors:false} CniConfig:{NetworkPluginBinDir:/opt/cni/bin NetworkPluginConfDir:/etc/cni/net.d NetworkPluginMaxConfNum:1 NetworkPluginSetupSerially:false NetworkPluginConfTemplate: IPPreference:} Registry:{ConfigPath: Mirrors:map[] Configs:map[] Auths:map[] Headers:map[]} ImageDecryption:{KeyModel:node} DisableTCPService:true StreamServerAddress:127.0.0.1 StreamServerPort:0 StreamIdleTimeout:4h0m0s EnableSelinux:false SelinuxCategoryRange:1024 SandboxImage:registry.k8s.io/pause:3.8 StatsCollectPeriod:10 SystemdCgroup:false EnableTLSStreaming:false X509KeyPairStreaming:{TLSCertFile: TLSKeyFile:} MaxContainerLogLineSize:16384 DisableCgroup:false DisableApparmor:false RestrictOOMScoreAdj:false MaxConcurrentDownloads:3 DisableProcMount:false UnsetSeccompProfile: TolerateMissingHugetlbController:true DisableHugetlbController:true DeviceOwnershipFromSecurityContext:false IgnoreImageDefinedVolumes:false NetNSMountsUnderStateDir:false EnableUnprivilegedPorts:false EnableUnprivilegedICMP:false EnableCDI:false CDISpecDirs:[/etc/cdi /var/run/cdi] ImagePullProgressTimeout:5m0s DrainExecSyncIOTimeout:0s ImagePullWithSyncFs:false IgnoreDeprecationWarnings:[]} ContainerdRootDir:/var/lib/containerd ContainerdEndpoint:/run/containerd/containerd.sock RootDir:/var/lib/containerd/io.containerd.grpc.v1.cri StateDir:/run/containerd/io.containerd.grpc.v1.cri}"
Apr 17 23:44:49.034233 containerd[1834]: time="2026-04-17T23:44:49.032317800Z" level=info msg="Connect containerd service"
Apr 17 23:44:49.034233 containerd[1834]: time="2026-04-17T23:44:49.032369500Z" level=info msg="using legacy CRI server"
Apr 17 23:44:49.034233 containerd[1834]: time="2026-04-17T23:44:49.032378700Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this"
Apr 17 23:44:49.037512 containerd[1834]: time="2026-04-17T23:44:49.037459300Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\""
Apr 17 23:44:49.038888 containerd[1834]: time="2026-04-17T23:44:49.038842200Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
Apr 17 23:44:49.039220 containerd[1834]: time="2026-04-17T23:44:49.039167900Z" level=info msg="Start subscribing containerd event"
Apr 17 23:44:49.039371 containerd[1834]: time="2026-04-17T23:44:49.039249700Z" level=info msg="Start recovering state"
Apr 17 23:44:49.039371 containerd[1834]: time="2026-04-17T23:44:49.039340600Z" level=info msg="Start event monitor"
Apr 17 23:44:49.039463 containerd[1834]: time="2026-04-17T23:44:49.039370400Z" level=info msg="Start snapshots syncer"
Apr 17 23:44:49.039463 containerd[1834]: time="2026-04-17T23:44:49.039384800Z" level=info msg="Start cni network conf syncer for default"
Apr 17 23:44:49.039463 containerd[1834]: time="2026-04-17T23:44:49.039397400Z" level=info msg="Start streaming server"
Apr 17 23:44:49.043516 containerd[1834]: time="2026-04-17T23:44:49.040760300Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc
Apr 17 23:44:49.043516 containerd[1834]: time="2026-04-17T23:44:49.040829100Z" level=info msg=serving... address=/run/containerd/containerd.sock
Apr 17 23:44:49.043516 containerd[1834]: time="2026-04-17T23:44:49.040905400Z" level=info msg="containerd successfully booted in 0.074890s"
Apr 17 23:44:49.041591 systemd[1]: Started containerd.service - containerd container runtime.
Apr 17 23:44:49.131591 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Apr 17 23:44:49.135771 (kubelet)[1965]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Apr 17 23:44:49.138271 systemd[1]: Reached target multi-user.target - Multi-User System.
Apr 17 23:44:49.142346 systemd[1]: Startup finished in 13.376s (kernel) + 14.506s (userspace) = 27.882s.
Apr 17 23:44:49.655562 login[1942]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0)
Apr 17 23:44:49.658601 login[1943]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0)
Apr 17 23:44:49.674351 systemd-logind[1800]: New session 2 of user core.
Apr 17 23:44:49.677031 systemd[1]: Created slice user-500.slice - User Slice of UID 500.
Apr 17 23:44:49.685784 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500...
Apr 17 23:44:49.692035 systemd-logind[1800]: New session 1 of user core.
Apr 17 23:44:49.719079 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500.
Apr 17 23:44:49.729425 systemd[1]: Starting user@500.service - User Manager for UID 500...
Apr 17 23:44:49.743839 (systemd)[1979]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0)
Apr 17 23:44:49.771164 kubelet[1965]: E0417 23:44:49.771103 1965 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Apr 17 23:44:49.774522 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Apr 17 23:44:49.775007 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Apr 17 23:44:49.879438 systemd[1979]: Queued start job for default target default.target.
Apr 17 23:44:49.880343 systemd[1979]: Created slice app.slice - User Application Slice.
Apr 17 23:44:49.880377 systemd[1979]: Reached target paths.target - Paths.
Apr 17 23:44:49.880395 systemd[1979]: Reached target timers.target - Timers.
Apr 17 23:44:49.890587 systemd[1979]: Starting dbus.socket - D-Bus User Message Bus Socket...
Apr 17 23:44:49.898183 systemd[1979]: Listening on dbus.socket - D-Bus User Message Bus Socket.
Apr 17 23:44:49.898253 systemd[1979]: Reached target sockets.target - Sockets.
Apr 17 23:44:49.898273 systemd[1979]: Reached target basic.target - Basic System.
Apr 17 23:44:49.898324 systemd[1979]: Reached target default.target - Main User Target.
Apr 17 23:44:49.898360 systemd[1979]: Startup finished in 144ms.
Apr 17 23:44:49.898869 systemd[1]: Started user@500.service - User Manager for UID 500.
Apr 17 23:44:49.900773 systemd[1]: Started session-1.scope - Session 1 of User core.
Apr 17 23:44:49.902393 systemd[1]: Started session-2.scope - Session 2 of User core.
Apr 17 23:44:50.285793 waagent[1938]: 2026-04-17T23:44:50.285686Z INFO Daemon Daemon Azure Linux Agent Version: 2.9.1.1
Apr 17 23:44:50.289573 waagent[1938]: 2026-04-17T23:44:50.289506Z INFO Daemon Daemon OS: flatcar 4081.3.6
Apr 17 23:44:50.292469 waagent[1938]: 2026-04-17T23:44:50.292398Z INFO Daemon Daemon Python: 3.11.9
Apr 17 23:44:50.295291 waagent[1938]: 2026-04-17T23:44:50.295221Z INFO Daemon Daemon Run daemon
Apr 17 23:44:50.297715 waagent[1938]: 2026-04-17T23:44:50.297667Z INFO Daemon Daemon No RDMA handler exists for distro='Flatcar Container Linux by Kinvolk' version='4081.3.6'
Apr 17 23:44:50.302975 waagent[1938]: 2026-04-17T23:44:50.302765Z INFO Daemon Daemon Using waagent for provisioning
Apr 17 23:44:50.306218 waagent[1938]: 2026-04-17T23:44:50.306167Z INFO Daemon Daemon Activate resource disk
Apr 17 23:44:50.309038 waagent[1938]: 2026-04-17T23:44:50.308988Z INFO Daemon Daemon Searching gen1 prefix 00000000-0001 or gen2 f8b3781a-1e82-4818-a1c3-63d806ec15bb
Apr 17 23:44:50.316908 waagent[1938]: 2026-04-17T23:44:50.316851Z INFO Daemon Daemon Found device: None
Apr 17 23:44:50.355420 waagent[1938]: 2026-04-17T23:44:50.317155Z ERROR Daemon Daemon Failed to mount resource disk [ResourceDiskError] unable to detect disk topology
Apr 17 23:44:50.355420 waagent[1938]: 2026-04-17T23:44:50.317270Z ERROR Daemon Daemon Event: name=WALinuxAgent, op=ActivateResourceDisk, message=[ResourceDiskError] unable to detect disk topology, duration=0
Apr 17 23:44:50.355420 waagent[1938]: 2026-04-17T23:44:50.319951Z INFO Daemon Daemon Clean protocol and wireserver endpoint
Apr 17 23:44:50.355420 waagent[1938]: 2026-04-17T23:44:50.320964Z INFO Daemon Daemon Running default provisioning handler
Apr 17 23:44:50.355420 waagent[1938]: 2026-04-17T23:44:50.331130Z INFO Daemon Daemon Unable to get cloud-init enabled status from systemctl: Command '['systemctl', 'is-enabled', 'cloud-init-local.service']' returned non-zero exit status 4.
Apr 17 23:44:50.355420 waagent[1938]: 2026-04-17T23:44:50.333561Z INFO Daemon Daemon Unable to get cloud-init enabled status from service: [Errno 2] No such file or directory: 'service'
Apr 17 23:44:50.355420 waagent[1938]: 2026-04-17T23:44:50.334125Z INFO Daemon Daemon cloud-init is enabled: False
Apr 17 23:44:50.355420 waagent[1938]: 2026-04-17T23:44:50.335147Z INFO Daemon Daemon Copying ovf-env.xml
Apr 17 23:44:50.416504 waagent[1938]: 2026-04-17T23:44:50.413517Z INFO Daemon Daemon Successfully mounted dvd
Apr 17 23:44:50.429455 systemd[1]: mnt-cdrom-secure.mount: Deactivated successfully.
Apr 17 23:44:50.431816 waagent[1938]: 2026-04-17T23:44:50.431743Z INFO Daemon Daemon Detect protocol endpoint
Apr 17 23:44:50.435014 waagent[1938]: 2026-04-17T23:44:50.434867Z INFO Daemon Daemon Clean protocol and wireserver endpoint
Apr 17 23:44:50.450169 waagent[1938]: 2026-04-17T23:44:50.435101Z INFO Daemon Daemon WireServer endpoint is not found. Rerun dhcp handler
Apr 17 23:44:50.450169 waagent[1938]: 2026-04-17T23:44:50.436077Z INFO Daemon Daemon Test for route to 168.63.129.16
Apr 17 23:44:50.450169 waagent[1938]: 2026-04-17T23:44:50.437237Z INFO Daemon Daemon Route to 168.63.129.16 exists
Apr 17 23:44:50.450169 waagent[1938]: 2026-04-17T23:44:50.438239Z INFO Daemon Daemon Wire server endpoint:168.63.129.16
Apr 17 23:44:50.463020 waagent[1938]: 2026-04-17T23:44:50.462967Z INFO Daemon Daemon Fabric preferred wire protocol version:2015-04-05
Apr 17 23:44:50.472465 waagent[1938]: 2026-04-17T23:44:50.463399Z INFO Daemon Daemon Wire protocol version:2012-11-30
Apr 17 23:44:50.472465 waagent[1938]: 2026-04-17T23:44:50.464413Z INFO Daemon Daemon Server preferred version:2015-04-05
Apr 17 23:44:50.583155 waagent[1938]: 2026-04-17T23:44:50.582995Z INFO Daemon Daemon Initializing goal state during protocol detection
Apr 17 23:44:50.587281 waagent[1938]: 2026-04-17T23:44:50.587131Z INFO Daemon Daemon Forcing an update of the goal state.
Apr 17 23:44:50.592504 waagent[1938]: 2026-04-17T23:44:50.592435Z INFO Daemon Fetched a new incarnation for the WireServer goal state [incarnation 1]
Apr 17 23:44:50.605590 waagent[1938]: 2026-04-17T23:44:50.605535Z INFO Daemon Daemon HostGAPlugin version: 1.0.8.181
Apr 17 23:44:50.624332 waagent[1938]: 2026-04-17T23:44:50.606286Z INFO Daemon
Apr 17 23:44:50.624332 waagent[1938]: 2026-04-17T23:44:50.607016Z INFO Daemon Fetched new vmSettings [HostGAPlugin correlation ID: 87ee8a14-31a8-433d-9333-9b41f7f50369 eTag: 17100985401297773409 source: Fabric]
Apr 17 23:44:50.624332 waagent[1938]: 2026-04-17T23:44:50.608448Z INFO Daemon The vmSettings originated via Fabric; will ignore them.
Apr 17 23:44:50.624332 waagent[1938]: 2026-04-17T23:44:50.609250Z INFO Daemon
Apr 17 23:44:50.624332 waagent[1938]: 2026-04-17T23:44:50.610347Z INFO Daemon Fetching full goal state from the WireServer [incarnation 1]
Apr 17 23:44:50.626951 waagent[1938]: 2026-04-17T23:44:50.626912Z INFO Daemon Daemon Downloading artifacts profile blob
Apr 17 23:44:50.686491 waagent[1938]: 2026-04-17T23:44:50.686403Z INFO Daemon Downloaded certificate {'thumbprint': 'A360B8FF4FB1808EE245880631EB7D508EBB337D', 'hasPrivateKey': True}
Apr 17 23:44:50.694312 waagent[1938]: 2026-04-17T23:44:50.687211Z INFO Daemon Fetch goal state completed
Apr 17 23:44:50.698889 waagent[1938]: 2026-04-17T23:44:50.698843Z INFO Daemon Daemon Starting provisioning
Apr 17 23:44:50.707011 waagent[1938]: 2026-04-17T23:44:50.699055Z INFO Daemon Daemon Handle ovf-env.xml.
Apr 17 23:44:50.707011 waagent[1938]: 2026-04-17T23:44:50.700151Z INFO Daemon Daemon Set hostname [ci-4081.3.6-n-7b570e9a3c]
Apr 17 23:44:50.708626 waagent[1938]: 2026-04-17T23:44:50.708566Z INFO Daemon Daemon Publish hostname [ci-4081.3.6-n-7b570e9a3c]
Apr 17 23:44:50.712496 waagent[1938]: 2026-04-17T23:44:50.712425Z INFO Daemon Daemon Examine /proc/net/route for primary interface
Apr 17 23:44:50.718420 waagent[1938]: 2026-04-17T23:44:50.712766Z INFO Daemon Daemon Primary interface is [eth0]
Apr 17 23:44:50.738792 systemd-networkd[1395]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Apr 17 23:44:50.738802 systemd-networkd[1395]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Apr 17 23:44:50.740149 waagent[1938]: 2026-04-17T23:44:50.739828Z INFO Daemon Daemon Create user account if not exists
Apr 17 23:44:50.738853 systemd-networkd[1395]: eth0: DHCP lease lost
Apr 17 23:44:50.743372 waagent[1938]: 2026-04-17T23:44:50.743287Z INFO Daemon Daemon User core already exists, skip useradd
Apr 17 23:44:50.760283 waagent[1938]: 2026-04-17T23:44:50.744041Z INFO Daemon Daemon Configure sudoer
Apr 17 23:44:50.760283 waagent[1938]: 2026-04-17T23:44:50.745389Z INFO Daemon Daemon Configure sshd
Apr 17 23:44:50.760283 waagent[1938]: 2026-04-17T23:44:50.746325Z INFO Daemon Daemon Added a configuration snippet disabling SSH password-based authentication methods. It also configures SSH client probing to keep connections alive.
Apr 17 23:44:50.760283 waagent[1938]: 2026-04-17T23:44:50.747288Z INFO Daemon Daemon Deploy ssh public key.
Apr 17 23:44:50.760590 systemd-networkd[1395]: eth0: DHCPv6 lease lost
Apr 17 23:44:50.791532 systemd-networkd[1395]: eth0: DHCPv4 address 10.0.0.10/24, gateway 10.0.0.1 acquired from 168.63.129.16
Apr 17 23:44:51.852763 waagent[1938]: 2026-04-17T23:44:51.852707Z INFO Daemon Daemon Provisioning complete
Apr 17 23:44:51.864009 waagent[1938]: 2026-04-17T23:44:51.863956Z INFO Daemon Daemon RDMA capabilities are not enabled, skipping
Apr 17 23:44:51.872016 waagent[1938]: 2026-04-17T23:44:51.864224Z INFO Daemon Daemon End of log to /dev/console. The agent will now check for updates and then will process extensions.
Apr 17 23:44:51.872016 waagent[1938]: 2026-04-17T23:44:51.865213Z INFO Daemon Daemon Installed Agent WALinuxAgent-2.9.1.1 is the most current agent
Apr 17 23:44:51.988548 waagent[2037]: 2026-04-17T23:44:51.988430Z INFO ExtHandler ExtHandler Azure Linux Agent (Goal State Agent version 2.9.1.1)
Apr 17 23:44:51.988985 waagent[2037]: 2026-04-17T23:44:51.988621Z INFO ExtHandler ExtHandler OS: flatcar 4081.3.6
Apr 17 23:44:51.988985 waagent[2037]: 2026-04-17T23:44:51.988710Z INFO ExtHandler ExtHandler Python: 3.11.9
Apr 17 23:44:52.027545 waagent[2037]: 2026-04-17T23:44:52.027432Z INFO ExtHandler ExtHandler Distro: flatcar-4081.3.6; OSUtil: FlatcarUtil; AgentService: waagent; Python: 3.11.9; systemd: True; LISDrivers: Absent; logrotate: logrotate 3.20.1;
Apr 17 23:44:52.027785 waagent[2037]: 2026-04-17T23:44:52.027734Z INFO ExtHandler ExtHandler WireServer endpoint 168.63.129.16 read from file
Apr 17 23:44:52.027883 waagent[2037]: 2026-04-17T23:44:52.027841Z INFO ExtHandler ExtHandler Wire server endpoint:168.63.129.16
Apr 17 23:44:52.034827 waagent[2037]: 2026-04-17T23:44:52.034758Z INFO ExtHandler Fetched a new incarnation for the WireServer goal state [incarnation 1]
Apr 17 23:44:52.039609 waagent[2037]: 2026-04-17T23:44:52.039553Z INFO ExtHandler ExtHandler HostGAPlugin version: 1.0.8.181
Apr 17 23:44:52.040055 waagent[2037]: 2026-04-17T23:44:52.039999Z INFO ExtHandler
Apr 17 23:44:52.040135 waagent[2037]: 2026-04-17T23:44:52.040092Z INFO ExtHandler Fetched new vmSettings [HostGAPlugin correlation ID: 9cb349d3-9216-49c7-ba20-007106799ed6 eTag: 17100985401297773409 source: Fabric]
Apr 17 23:44:52.040440 waagent[2037]: 2026-04-17T23:44:52.040388Z INFO ExtHandler The vmSettings originated via Fabric; will ignore them.
Apr 17 23:44:52.041050 waagent[2037]: 2026-04-17T23:44:52.040994Z INFO ExtHandler
Apr 17 23:44:52.041126 waagent[2037]: 2026-04-17T23:44:52.041083Z INFO ExtHandler Fetching full goal state from the WireServer [incarnation 1]
Apr 17 23:44:52.044185 waagent[2037]: 2026-04-17T23:44:52.044144Z INFO ExtHandler ExtHandler Downloading artifacts profile blob
Apr 17 23:44:52.102826 waagent[2037]: 2026-04-17T23:44:52.102686Z INFO ExtHandler Downloaded certificate {'thumbprint': 'A360B8FF4FB1808EE245880631EB7D508EBB337D', 'hasPrivateKey': True}
Apr 17 23:44:52.103281 waagent[2037]: 2026-04-17T23:44:52.103226Z INFO ExtHandler Fetch goal state completed
Apr 17 23:44:52.121052 waagent[2037]: 2026-04-17T23:44:52.120976Z INFO ExtHandler ExtHandler WALinuxAgent-2.9.1.1 running as process 2037
Apr 17 23:44:52.121237 waagent[2037]: 2026-04-17T23:44:52.121182Z INFO ExtHandler ExtHandler ******** AutoUpdate.Enabled is set to False, not processing the operation ********
Apr 17 23:44:52.122841 waagent[2037]: 2026-04-17T23:44:52.122781Z INFO ExtHandler ExtHandler Cgroup monitoring is not supported on ['flatcar', '4081.3.6', '', 'Flatcar Container Linux by Kinvolk']
Apr 17 23:44:52.123205 waagent[2037]: 2026-04-17T23:44:52.123154Z INFO ExtHandler ExtHandler Starting setup for Persistent firewall rules
Apr 17 23:44:52.158441 waagent[2037]: 2026-04-17T23:44:52.158391Z INFO ExtHandler ExtHandler Firewalld service not running/unavailable, trying to set up waagent-network-setup.service
Apr 17 23:44:52.158710 waagent[2037]: 2026-04-17T23:44:52.158647Z INFO ExtHandler ExtHandler Successfully updated the Binary file /var/lib/waagent/waagent-network-setup.py for firewall setup
Apr 17 23:44:52.165342 waagent[2037]: 2026-04-17T23:44:52.165297Z INFO ExtHandler ExtHandler Service: waagent-network-setup.service not enabled. Adding it now
Apr 17 23:44:52.172374 systemd[1]: Reloading requested from client PID 2050 ('systemctl') (unit waagent.service)...
Apr 17 23:44:52.172393 systemd[1]: Reloading...
Apr 17 23:44:52.250512 zram_generator::config[2084]: No configuration found.
Apr 17 23:44:52.381458 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Apr 17 23:44:52.461640 systemd[1]: Reloading finished in 288 ms.
Apr 17 23:44:52.486498 waagent[2037]: 2026-04-17T23:44:52.486287Z INFO ExtHandler ExtHandler Executing systemctl daemon-reload for setting up waagent-network-setup.service
Apr 17 23:44:52.494361 systemd[1]: Reloading requested from client PID 2146 ('systemctl') (unit waagent.service)...
Apr 17 23:44:52.494379 systemd[1]: Reloading...
Apr 17 23:44:52.575517 zram_generator::config[2180]: No configuration found.
Apr 17 23:44:52.701565 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Apr 17 23:44:52.781347 systemd[1]: Reloading finished in 286 ms.
Apr 17 23:44:52.806746 waagent[2037]: 2026-04-17T23:44:52.805651Z INFO ExtHandler ExtHandler Successfully added and enabled the waagent-network-setup.service
Apr 17 23:44:52.806746 waagent[2037]: 2026-04-17T23:44:52.805865Z INFO ExtHandler ExtHandler Persistent firewall rules setup successfully
Apr 17 23:44:53.109065 waagent[2037]: 2026-04-17T23:44:53.108975Z INFO ExtHandler ExtHandler DROP rule is not available which implies no firewall rules are set yet. Environment thread will set it up.
Apr 17 23:44:53.109745 waagent[2037]: 2026-04-17T23:44:53.109688Z INFO ExtHandler ExtHandler Checking if log collection is allowed at this time [False]. All three conditions must be met: configuration enabled [True], cgroups enabled [False], python supported: [True] Apr 17 23:44:53.110586 waagent[2037]: 2026-04-17T23:44:53.110417Z INFO ExtHandler ExtHandler Starting env monitor service. Apr 17 23:44:53.110831 waagent[2037]: 2026-04-17T23:44:53.110786Z INFO MonitorHandler ExtHandler WireServer endpoint 168.63.129.16 read from file Apr 17 23:44:53.111102 waagent[2037]: 2026-04-17T23:44:53.111040Z INFO ExtHandler ExtHandler Start SendTelemetryHandler service. Apr 17 23:44:53.111257 waagent[2037]: 2026-04-17T23:44:53.111209Z INFO EnvHandler ExtHandler WireServer endpoint 168.63.129.16 read from file Apr 17 23:44:53.111426 waagent[2037]: 2026-04-17T23:44:53.111336Z INFO MonitorHandler ExtHandler Wire server endpoint:168.63.129.16 Apr 17 23:44:53.111766 waagent[2037]: 2026-04-17T23:44:53.111708Z INFO SendTelemetryHandler ExtHandler Successfully started the SendTelemetryHandler thread Apr 17 23:44:53.111864 waagent[2037]: 2026-04-17T23:44:53.111818Z INFO ExtHandler ExtHandler Start Extension Telemetry service. Apr 17 23:44:53.112044 waagent[2037]: 2026-04-17T23:44:53.111951Z INFO EnvHandler ExtHandler Wire server endpoint:168.63.129.16 Apr 17 23:44:53.112307 waagent[2037]: 2026-04-17T23:44:53.112259Z INFO MonitorHandler ExtHandler Monitor.NetworkConfigurationChanges is disabled. Apr 17 23:44:53.112811 waagent[2037]: 2026-04-17T23:44:53.112751Z INFO TelemetryEventsCollector ExtHandler Extension Telemetry pipeline enabled: True Apr 17 23:44:53.112886 waagent[2037]: 2026-04-17T23:44:53.112841Z INFO ExtHandler ExtHandler Goal State Period: 6 sec. This indicates how often the agent checks for new goal states and reports status. 
Apr 17 23:44:53.113319 waagent[2037]: 2026-04-17T23:44:53.113275Z INFO TelemetryEventsCollector ExtHandler Successfully started the TelemetryEventsCollector thread Apr 17 23:44:53.114678 waagent[2037]: 2026-04-17T23:44:53.114624Z INFO EnvHandler ExtHandler Configure routes Apr 17 23:44:53.114812 waagent[2037]: 2026-04-17T23:44:53.114752Z INFO MonitorHandler ExtHandler Routing table from /proc/net/route: Apr 17 23:44:53.114812 waagent[2037]: Iface Destination Gateway Flags RefCnt Use Metric Mask MTU Window IRTT Apr 17 23:44:53.114812 waagent[2037]: eth0 00000000 0100000A 0003 0 0 1024 00000000 0 0 0 Apr 17 23:44:53.114812 waagent[2037]: eth0 0000000A 00000000 0001 0 0 1024 00FFFFFF 0 0 0 Apr 17 23:44:53.114812 waagent[2037]: eth0 0100000A 00000000 0005 0 0 1024 FFFFFFFF 0 0 0 Apr 17 23:44:53.114812 waagent[2037]: eth0 10813FA8 0100000A 0007 0 0 1024 FFFFFFFF 0 0 0 Apr 17 23:44:53.114812 waagent[2037]: eth0 FEA9FEA9 0100000A 0007 0 0 1024 FFFFFFFF 0 0 0 Apr 17 23:44:53.115415 waagent[2037]: 2026-04-17T23:44:53.115334Z INFO EnvHandler ExtHandler Gateway:None Apr 17 23:44:53.116987 waagent[2037]: 2026-04-17T23:44:53.116946Z INFO EnvHandler ExtHandler Routes:None Apr 17 23:44:53.118066 waagent[2037]: 2026-04-17T23:44:53.118022Z INFO ExtHandler ExtHandler Apr 17 23:44:53.118186 waagent[2037]: 2026-04-17T23:44:53.118151Z INFO ExtHandler ExtHandler ProcessExtensionsGoalState started [incarnation_1 channel: WireServer source: Fabric activity: 6a2dfd37-d8d8-449e-9591-1ad00918bde7 correlation c330664e-f858-475b-b4b5-355856258e85 created: 2026-04-17T23:43:55.445122Z] Apr 17 23:44:53.119428 waagent[2037]: 2026-04-17T23:44:53.119286Z INFO ExtHandler ExtHandler No extension handlers found, not processing anything. 
Apr 17 23:44:53.121100 waagent[2037]: 2026-04-17T23:44:53.121055Z INFO ExtHandler ExtHandler ProcessExtensionsGoalState completed [incarnation_1 3 ms] Apr 17 23:44:53.153392 waagent[2037]: 2026-04-17T23:44:53.153242Z INFO ExtHandler ExtHandler [HEARTBEAT] Agent WALinuxAgent-2.9.1.1 is running as the goal state agent [DEBUG HeartbeatCounter: 0;HeartbeatId: C8C74812-DE09-4628-91CC-9784D5C1E17C;DroppedPackets: 0;UpdateGSErrors: 0;AutoUpdate: 0] Apr 17 23:44:53.176603 waagent[2037]: 2026-04-17T23:44:53.176518Z INFO MonitorHandler ExtHandler Network interfaces: Apr 17 23:44:53.176603 waagent[2037]: Executing ['ip', '-a', '-o', 'link']: Apr 17 23:44:53.176603 waagent[2037]: 1: lo: mtu 65536 qdisc noqueue state UNKNOWN mode DEFAULT group default qlen 1000\ link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 Apr 17 23:44:53.176603 waagent[2037]: 2: eth0: mtu 1500 qdisc mq state UP mode DEFAULT group default qlen 1000\ link/ether 00:0d:3a:da:bb:1d brd ff:ff:ff:ff:ff:ff Apr 17 23:44:53.176603 waagent[2037]: 3: enP55636s1: mtu 1500 qdisc mq master eth0 state UP mode DEFAULT group default qlen 1000\ link/ether 00:0d:3a:da:bb:1d brd ff:ff:ff:ff:ff:ff\ altname enP55636p0s2 Apr 17 23:44:53.176603 waagent[2037]: Executing ['ip', '-4', '-a', '-o', 'address']: Apr 17 23:44:53.176603 waagent[2037]: 1: lo inet 127.0.0.1/8 scope host lo\ valid_lft forever preferred_lft forever Apr 17 23:44:53.176603 waagent[2037]: 2: eth0 inet 10.0.0.10/24 metric 1024 brd 10.0.0.255 scope global eth0\ valid_lft forever preferred_lft forever Apr 17 23:44:53.176603 waagent[2037]: Executing ['ip', '-6', '-a', '-o', 'address']: Apr 17 23:44:53.176603 waagent[2037]: 1: lo inet6 ::1/128 scope host noprefixroute \ valid_lft forever preferred_lft forever Apr 17 23:44:53.176603 waagent[2037]: 2: eth0 inet6 fe80::20d:3aff:feda:bb1d/64 scope link proto kernel_ll \ valid_lft forever preferred_lft forever Apr 17 23:44:53.237102 waagent[2037]: 2026-04-17T23:44:53.237022Z INFO EnvHandler ExtHandler Successfully 
added Azure fabric firewall rules. Current Firewall rules: Apr 17 23:44:53.237102 waagent[2037]: Chain INPUT (policy ACCEPT 0 packets, 0 bytes) Apr 17 23:44:53.237102 waagent[2037]: pkts bytes target prot opt in out source destination Apr 17 23:44:53.237102 waagent[2037]: Chain FORWARD (policy ACCEPT 0 packets, 0 bytes) Apr 17 23:44:53.237102 waagent[2037]: pkts bytes target prot opt in out source destination Apr 17 23:44:53.237102 waagent[2037]: Chain OUTPUT (policy ACCEPT 0 packets, 0 bytes) Apr 17 23:44:53.237102 waagent[2037]: pkts bytes target prot opt in out source destination Apr 17 23:44:53.237102 waagent[2037]: 0 0 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 tcp dpt:53 Apr 17 23:44:53.237102 waagent[2037]: 0 0 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 owner UID match 0 Apr 17 23:44:53.237102 waagent[2037]: 0 0 DROP tcp -- * * 0.0.0.0/0 168.63.129.16 ctstate INVALID,NEW Apr 17 23:44:53.240470 waagent[2037]: 2026-04-17T23:44:53.240410Z INFO EnvHandler ExtHandler Current Firewall rules: Apr 17 23:44:53.240470 waagent[2037]: Chain INPUT (policy ACCEPT 0 packets, 0 bytes) Apr 17 23:44:53.240470 waagent[2037]: pkts bytes target prot opt in out source destination Apr 17 23:44:53.240470 waagent[2037]: Chain FORWARD (policy ACCEPT 0 packets, 0 bytes) Apr 17 23:44:53.240470 waagent[2037]: pkts bytes target prot opt in out source destination Apr 17 23:44:53.240470 waagent[2037]: Chain OUTPUT (policy ACCEPT 0 packets, 0 bytes) Apr 17 23:44:53.240470 waagent[2037]: pkts bytes target prot opt in out source destination Apr 17 23:44:53.240470 waagent[2037]: 0 0 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 tcp dpt:53 Apr 17 23:44:53.240470 waagent[2037]: 0 0 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 owner UID match 0 Apr 17 23:44:53.240470 waagent[2037]: 0 0 DROP tcp -- * * 0.0.0.0/0 168.63.129.16 ctstate INVALID,NEW Apr 17 23:44:53.240933 waagent[2037]: 2026-04-17T23:44:53.240744Z INFO EnvHandler ExtHandler Set block dev timeout: sda with timeout: 300 Apr 17 23:44:59.783233 
systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Apr 17 23:44:59.788732 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Apr 17 23:44:59.902681 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Apr 17 23:44:59.906429 (kubelet)[2285]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Apr 17 23:45:00.601908 kubelet[2285]: E0417 23:45:00.601846 2285 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Apr 17 23:45:00.605832 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Apr 17 23:45:00.606177 systemd[1]: kubelet.service: Failed with result 'exit-code'. Apr 17 23:45:10.783371 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Apr 17 23:45:10.788701 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Apr 17 23:45:10.909675 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Apr 17 23:45:10.922867 (kubelet)[2305]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Apr 17 23:45:10.959435 kubelet[2305]: E0417 23:45:10.959352 2305 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Apr 17 23:45:10.962063 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Apr 17 23:45:10.962397 systemd[1]: kubelet.service: Failed with result 'exit-code'. Apr 17 23:45:11.223660 chronyd[1797]: Selected source PHC0 Apr 17 23:45:11.515113 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Apr 17 23:45:11.519773 systemd[1]: Started sshd@0-10.0.0.10:22-20.229.252.112:60422.service - OpenSSH per-connection server daemon (20.229.252.112:60422). Apr 17 23:45:11.823102 sshd[2314]: Accepted publickey for core from 20.229.252.112 port 60422 ssh2: RSA SHA256:sMtqA11TjzQIJRJw4PihEx7btYqNmsZRNCIArhmId48 Apr 17 23:45:11.824565 sshd[2314]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 17 23:45:11.828698 systemd-logind[1800]: New session 3 of user core. Apr 17 23:45:11.833809 systemd[1]: Started session-3.scope - Session 3 of User core. Apr 17 23:45:11.952166 systemd[1]: Started sshd@1-10.0.0.10:22-20.229.252.112:60436.service - OpenSSH per-connection server daemon (20.229.252.112:60436). Apr 17 23:45:12.070459 sshd[2319]: Accepted publickey for core from 20.229.252.112 port 60436 ssh2: RSA SHA256:sMtqA11TjzQIJRJw4PihEx7btYqNmsZRNCIArhmId48 Apr 17 23:45:12.071924 sshd[2319]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 17 23:45:12.076731 systemd-logind[1800]: New session 4 of user core. 
Apr 17 23:45:12.085736 systemd[1]: Started session-4.scope - Session 4 of User core. Apr 17 23:45:12.186924 sshd[2319]: pam_unix(sshd:session): session closed for user core Apr 17 23:45:12.189911 systemd[1]: sshd@1-10.0.0.10:22-20.229.252.112:60436.service: Deactivated successfully. Apr 17 23:45:12.194014 systemd-logind[1800]: Session 4 logged out. Waiting for processes to exit. Apr 17 23:45:12.195213 systemd[1]: session-4.scope: Deactivated successfully. Apr 17 23:45:12.196228 systemd-logind[1800]: Removed session 4. Apr 17 23:45:12.209035 systemd[1]: Started sshd@2-10.0.0.10:22-20.229.252.112:60444.service - OpenSSH per-connection server daemon (20.229.252.112:60444). Apr 17 23:45:12.323493 sshd[2327]: Accepted publickey for core from 20.229.252.112 port 60444 ssh2: RSA SHA256:sMtqA11TjzQIJRJw4PihEx7btYqNmsZRNCIArhmId48 Apr 17 23:45:12.324099 sshd[2327]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 17 23:45:12.328537 systemd-logind[1800]: New session 5 of user core. Apr 17 23:45:12.335728 systemd[1]: Started session-5.scope - Session 5 of User core. Apr 17 23:45:12.432585 sshd[2327]: pam_unix(sshd:session): session closed for user core Apr 17 23:45:12.435378 systemd[1]: sshd@2-10.0.0.10:22-20.229.252.112:60444.service: Deactivated successfully. Apr 17 23:45:12.439690 systemd-logind[1800]: Session 5 logged out. Waiting for processes to exit. Apr 17 23:45:12.440857 systemd[1]: session-5.scope: Deactivated successfully. Apr 17 23:45:12.441908 systemd-logind[1800]: Removed session 5. Apr 17 23:45:12.454028 systemd[1]: Started sshd@3-10.0.0.10:22-20.229.252.112:60458.service - OpenSSH per-connection server daemon (20.229.252.112:60458). 
Apr 17 23:45:12.566448 sshd[2335]: Accepted publickey for core from 20.229.252.112 port 60458 ssh2: RSA SHA256:sMtqA11TjzQIJRJw4PihEx7btYqNmsZRNCIArhmId48 Apr 17 23:45:12.567920 sshd[2335]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 17 23:45:12.572921 systemd-logind[1800]: New session 6 of user core. Apr 17 23:45:12.580753 systemd[1]: Started session-6.scope - Session 6 of User core. Apr 17 23:45:12.681687 sshd[2335]: pam_unix(sshd:session): session closed for user core Apr 17 23:45:12.686089 systemd[1]: sshd@3-10.0.0.10:22-20.229.252.112:60458.service: Deactivated successfully. Apr 17 23:45:12.687680 systemd-logind[1800]: Session 6 logged out. Waiting for processes to exit. Apr 17 23:45:12.690104 systemd[1]: session-6.scope: Deactivated successfully. Apr 17 23:45:12.691034 systemd-logind[1800]: Removed session 6. Apr 17 23:45:12.703016 systemd[1]: Started sshd@4-10.0.0.10:22-20.229.252.112:60472.service - OpenSSH per-connection server daemon (20.229.252.112:60472). Apr 17 23:45:12.817129 sshd[2343]: Accepted publickey for core from 20.229.252.112 port 60472 ssh2: RSA SHA256:sMtqA11TjzQIJRJw4PihEx7btYqNmsZRNCIArhmId48 Apr 17 23:45:12.817744 sshd[2343]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 17 23:45:12.822458 systemd-logind[1800]: New session 7 of user core. Apr 17 23:45:12.828746 systemd[1]: Started session-7.scope - Session 7 of User core. Apr 17 23:45:13.043635 sudo[2347]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Apr 17 23:45:13.044018 sudo[2347]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Apr 17 23:45:13.077019 sudo[2347]: pam_unix(sudo:session): session closed for user root Apr 17 23:45:13.093700 sshd[2343]: pam_unix(sshd:session): session closed for user core Apr 17 23:45:13.097204 systemd[1]: sshd@4-10.0.0.10:22-20.229.252.112:60472.service: Deactivated successfully. 
Apr 17 23:45:13.101678 systemd-logind[1800]: Session 7 logged out. Waiting for processes to exit. Apr 17 23:45:13.102150 systemd[1]: session-7.scope: Deactivated successfully. Apr 17 23:45:13.104008 systemd-logind[1800]: Removed session 7. Apr 17 23:45:13.115748 systemd[1]: Started sshd@5-10.0.0.10:22-20.229.252.112:60486.service - OpenSSH per-connection server daemon (20.229.252.112:60486). Apr 17 23:45:13.231083 sshd[2352]: Accepted publickey for core from 20.229.252.112 port 60486 ssh2: RSA SHA256:sMtqA11TjzQIJRJw4PihEx7btYqNmsZRNCIArhmId48 Apr 17 23:45:13.232686 sshd[2352]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 17 23:45:13.237798 systemd-logind[1800]: New session 8 of user core. Apr 17 23:45:13.243721 systemd[1]: Started session-8.scope - Session 8 of User core. Apr 17 23:45:13.328867 sudo[2357]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/bin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Apr 17 23:45:13.329242 sudo[2357]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Apr 17 23:45:13.332945 sudo[2357]: pam_unix(sudo:session): session closed for user root Apr 17 23:45:13.338112 sudo[2356]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/bin/systemctl restart audit-rules Apr 17 23:45:13.338471 sudo[2356]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Apr 17 23:45:13.349793 systemd[1]: Stopping audit-rules.service - Load Security Auditing Rules... Apr 17 23:45:13.353877 auditctl[2360]: No rules Apr 17 23:45:13.354268 systemd[1]: audit-rules.service: Deactivated successfully. Apr 17 23:45:13.354542 systemd[1]: Stopped audit-rules.service - Load Security Auditing Rules. Apr 17 23:45:13.366861 systemd[1]: Starting audit-rules.service - Load Security Auditing Rules... Apr 17 23:45:13.389760 augenrules[2379]: No rules Apr 17 23:45:13.391531 systemd[1]: Finished audit-rules.service - Load Security Auditing Rules. 
Apr 17 23:45:13.394690 sudo[2356]: pam_unix(sudo:session): session closed for user root Apr 17 23:45:13.411664 sshd[2352]: pam_unix(sshd:session): session closed for user core Apr 17 23:45:13.415113 systemd-logind[1800]: Session 8 logged out. Waiting for processes to exit. Apr 17 23:45:13.415907 systemd[1]: sshd@5-10.0.0.10:22-20.229.252.112:60486.service: Deactivated successfully. Apr 17 23:45:13.418629 systemd[1]: session-8.scope: Deactivated successfully. Apr 17 23:45:13.420142 systemd-logind[1800]: Removed session 8. Apr 17 23:45:13.438919 systemd[1]: Started sshd@6-10.0.0.10:22-20.229.252.112:60490.service - OpenSSH per-connection server daemon (20.229.252.112:60490). Apr 17 23:45:13.551084 sshd[2388]: Accepted publickey for core from 20.229.252.112 port 60490 ssh2: RSA SHA256:sMtqA11TjzQIJRJw4PihEx7btYqNmsZRNCIArhmId48 Apr 17 23:45:13.552562 sshd[2388]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 17 23:45:13.557416 systemd-logind[1800]: New session 9 of user core. Apr 17 23:45:13.564817 systemd[1]: Started session-9.scope - Session 9 of User core. Apr 17 23:45:13.651339 sudo[2392]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Apr 17 23:45:13.651735 sudo[2392]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Apr 17 23:45:15.380811 systemd[1]: Starting docker.service - Docker Application Container Engine... Apr 17 23:45:15.390951 (dockerd)[2408]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Apr 17 23:45:17.358235 dockerd[2408]: time="2026-04-17T23:45:17.358172000Z" level=info msg="Starting up" Apr 17 23:45:18.371572 dockerd[2408]: time="2026-04-17T23:45:18.371521100Z" level=info msg="Loading containers: start." 
Apr 17 23:45:18.564501 kernel: Initializing XFRM netlink socket Apr 17 23:45:18.674953 systemd-networkd[1395]: docker0: Link UP Apr 17 23:45:18.701736 dockerd[2408]: time="2026-04-17T23:45:18.701693400Z" level=info msg="Loading containers: done." Apr 17 23:45:18.763803 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck4182149434-merged.mount: Deactivated successfully. Apr 17 23:45:18.768010 dockerd[2408]: time="2026-04-17T23:45:18.767968200Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Apr 17 23:45:18.768111 dockerd[2408]: time="2026-04-17T23:45:18.768089500Z" level=info msg="Docker daemon" commit=061aa95809be396a6b5542618d8a34b02a21ff77 containerd-snapshotter=false storage-driver=overlay2 version=26.1.0 Apr 17 23:45:18.768227 dockerd[2408]: time="2026-04-17T23:45:18.768204500Z" level=info msg="Daemon has completed initialization" Apr 17 23:45:18.833271 dockerd[2408]: time="2026-04-17T23:45:18.833078800Z" level=info msg="API listen on /run/docker.sock" Apr 17 23:45:18.833415 systemd[1]: Started docker.service - Docker Application Container Engine. Apr 17 23:45:19.645682 containerd[1834]: time="2026-04-17T23:45:19.645642403Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.11\"" Apr 17 23:45:20.422028 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3444576848.mount: Deactivated successfully. Apr 17 23:45:21.033632 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. Apr 17 23:45:21.040565 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Apr 17 23:45:21.195430 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Apr 17 23:45:21.208842 (kubelet)[2602]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Apr 17 23:45:21.803569 kubelet[2602]: E0417 23:45:21.803512 2602 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Apr 17 23:45:21.806137 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Apr 17 23:45:21.806467 systemd[1]: kubelet.service: Failed with result 'exit-code'. Apr 17 23:45:22.915849 containerd[1834]: time="2026-04-17T23:45:22.915787407Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.33.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 17 23:45:22.918573 containerd[1834]: time="2026-04-17T23:45:22.918364448Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.33.11: active requests=0, bytes read=30193997" Apr 17 23:45:22.922630 containerd[1834]: time="2026-04-17T23:45:22.922593115Z" level=info msg="ImageCreate event name:\"sha256:7ea99c30f23b106a042b6c46e565fddb42b20bbe58ba6852e562eed03477aec2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 17 23:45:22.930713 containerd[1834]: time="2026-04-17T23:45:22.929992532Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:18e9f2b6e4d67c24941e14b2d41ec0aa6e5f628e39f2ef2163e176de85bbe39e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 17 23:45:22.931105 containerd[1834]: time="2026-04-17T23:45:22.931061949Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.33.11\" with image id \"sha256:7ea99c30f23b106a042b6c46e565fddb42b20bbe58ba6852e562eed03477aec2\", repo tag \"registry.k8s.io/kube-apiserver:v1.33.11\", 
repo digest \"registry.k8s.io/kube-apiserver@sha256:18e9f2b6e4d67c24941e14b2d41ec0aa6e5f628e39f2ef2163e176de85bbe39e\", size \"30190588\" in 3.285377445s" Apr 17 23:45:22.931175 containerd[1834]: time="2026-04-17T23:45:22.931116649Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.11\" returns image reference \"sha256:7ea99c30f23b106a042b6c46e565fddb42b20bbe58ba6852e562eed03477aec2\"" Apr 17 23:45:22.931711 containerd[1834]: time="2026-04-17T23:45:22.931684458Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.11\"" Apr 17 23:45:24.788404 containerd[1834]: time="2026-04-17T23:45:24.788345514Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.33.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 17 23:45:24.791216 containerd[1834]: time="2026-04-17T23:45:24.791153258Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.33.11: active requests=0, bytes read=26171455" Apr 17 23:45:24.795022 containerd[1834]: time="2026-04-17T23:45:24.794966219Z" level=info msg="ImageCreate event name:\"sha256:c75dc8a6c47e2f7491fa2e367879f53c6f46053066e6b7135df4b154ddd94a1f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 17 23:45:24.800466 containerd[1834]: time="2026-04-17T23:45:24.800411305Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:7579451c5b3c2715da4a263c5d80a3367a24fdc12e86fde6851674d567d1dfb2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 17 23:45:24.801833 containerd[1834]: time="2026-04-17T23:45:24.801530123Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.33.11\" with image id \"sha256:c75dc8a6c47e2f7491fa2e367879f53c6f46053066e6b7135df4b154ddd94a1f\", repo tag \"registry.k8s.io/kube-controller-manager:v1.33.11\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:7579451c5b3c2715da4a263c5d80a3367a24fdc12e86fde6851674d567d1dfb2\", size 
\"27737794\" in 1.869809063s" Apr 17 23:45:24.801833 containerd[1834]: time="2026-04-17T23:45:24.801586323Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.11\" returns image reference \"sha256:c75dc8a6c47e2f7491fa2e367879f53c6f46053066e6b7135df4b154ddd94a1f\"" Apr 17 23:45:24.802439 containerd[1834]: time="2026-04-17T23:45:24.802406136Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.11\"" Apr 17 23:45:26.392851 containerd[1834]: time="2026-04-17T23:45:26.392787682Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.33.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 17 23:45:26.395676 containerd[1834]: time="2026-04-17T23:45:26.395462924Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.33.11: active requests=0, bytes read=20289764" Apr 17 23:45:26.399316 containerd[1834]: time="2026-04-17T23:45:26.399256784Z" level=info msg="ImageCreate event name:\"sha256:3febad3451e2d599688a8ad13d19d03c48c9054be209342c748fac2bb6c56f97\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 17 23:45:26.403881 containerd[1834]: time="2026-04-17T23:45:26.403818056Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:5506f0f94c4d9aeb071664893aabc12166bcb7f775008a6fff02d004e6091d28\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 17 23:45:26.405276 containerd[1834]: time="2026-04-17T23:45:26.404838072Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.33.11\" with image id \"sha256:3febad3451e2d599688a8ad13d19d03c48c9054be209342c748fac2bb6c56f97\", repo tag \"registry.k8s.io/kube-scheduler:v1.33.11\", repo digest \"registry.k8s.io/kube-scheduler@sha256:5506f0f94c4d9aeb071664893aabc12166bcb7f775008a6fff02d004e6091d28\", size \"21856121\" in 1.602392035s" Apr 17 23:45:26.405276 containerd[1834]: time="2026-04-17T23:45:26.404881373Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.11\" 
returns image reference \"sha256:3febad3451e2d599688a8ad13d19d03c48c9054be209342c748fac2bb6c56f97\"" Apr 17 23:45:26.405475 containerd[1834]: time="2026-04-17T23:45:26.405452482Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.11\"" Apr 17 23:45:27.752347 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2431919413.mount: Deactivated successfully. Apr 17 23:45:27.943504 kernel: hv_balloon: Max. dynamic memory size: 8192 MB Apr 17 23:45:28.293497 containerd[1834]: time="2026-04-17T23:45:28.293440874Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.33.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 17 23:45:28.296269 containerd[1834]: time="2026-04-17T23:45:28.296211627Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.33.11: active requests=0, bytes read=32010719" Apr 17 23:45:28.299226 containerd[1834]: time="2026-04-17T23:45:28.299173884Z" level=info msg="ImageCreate event name:\"sha256:4ce1332df15d2a0b1c2d3b18292afb4ff670070401211daebb00b7293b26f6d0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 17 23:45:28.305635 containerd[1834]: time="2026-04-17T23:45:28.305583806Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:8d18637b5c5f58a4ca0163d3cf184e53d4c522963c242860562be7cb25e9303e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 17 23:45:28.306225 containerd[1834]: time="2026-04-17T23:45:28.306042215Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.33.11\" with image id \"sha256:4ce1332df15d2a0b1c2d3b18292afb4ff670070401211daebb00b7293b26f6d0\", repo tag \"registry.k8s.io/kube-proxy:v1.33.11\", repo digest \"registry.k8s.io/kube-proxy@sha256:8d18637b5c5f58a4ca0163d3cf184e53d4c522963c242860562be7cb25e9303e\", size \"32009730\" in 1.900533232s" Apr 17 23:45:28.306225 containerd[1834]: time="2026-04-17T23:45:28.306087016Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.11\" returns image 
reference \"sha256:4ce1332df15d2a0b1c2d3b18292afb4ff670070401211daebb00b7293b26f6d0\"" Apr 17 23:45:28.306754 containerd[1834]: time="2026-04-17T23:45:28.306709328Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\"" Apr 17 23:45:28.946143 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount897438700.mount: Deactivated successfully. Apr 17 23:45:30.298595 containerd[1834]: time="2026-04-17T23:45:30.298540933Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.12.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 17 23:45:30.301096 containerd[1834]: time="2026-04-17T23:45:30.301040781Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.12.0: active requests=0, bytes read=20942246" Apr 17 23:45:30.303881 containerd[1834]: time="2026-04-17T23:45:30.303826334Z" level=info msg="ImageCreate event name:\"sha256:1cf5f116067c67da67f97bff78c4bbc76913f59057c18627b96facaced73ea0b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 17 23:45:30.308596 containerd[1834]: time="2026-04-17T23:45:30.308548325Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 17 23:45:30.309683 containerd[1834]: time="2026-04-17T23:45:30.309646346Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.12.0\" with image id \"sha256:1cf5f116067c67da67f97bff78c4bbc76913f59057c18627b96facaced73ea0b\", repo tag \"registry.k8s.io/coredns/coredns:v1.12.0\", repo digest \"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\", size \"20939036\" in 2.002899017s" Apr 17 23:45:30.309770 containerd[1834]: time="2026-04-17T23:45:30.309691147Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\" returns image reference 
\"sha256:1cf5f116067c67da67f97bff78c4bbc76913f59057c18627b96facaced73ea0b\"" Apr 17 23:45:30.310431 containerd[1834]: time="2026-04-17T23:45:30.310144855Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Apr 17 23:45:30.911677 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2621672494.mount: Deactivated successfully. Apr 17 23:45:30.929757 containerd[1834]: time="2026-04-17T23:45:30.929681538Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 17 23:45:30.932300 containerd[1834]: time="2026-04-17T23:45:30.932245188Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321146" Apr 17 23:45:30.935586 containerd[1834]: time="2026-04-17T23:45:30.935528551Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 17 23:45:30.939836 containerd[1834]: time="2026-04-17T23:45:30.939789132Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 17 23:45:30.940843 containerd[1834]: time="2026-04-17T23:45:30.940457045Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 630.282189ms" Apr 17 23:45:30.940843 containerd[1834]: time="2026-04-17T23:45:30.940504246Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\"" Apr 17 23:45:30.941323 containerd[1834]: time="2026-04-17T23:45:30.941295961Z" 
level=info msg="PullImage \"registry.k8s.io/etcd:3.5.24-0\"" Apr 17 23:45:31.516329 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4291336855.mount: Deactivated successfully. Apr 17 23:45:32.033320 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4. Apr 17 23:45:32.042953 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Apr 17 23:45:32.225675 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Apr 17 23:45:32.225918 (kubelet)[2719]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Apr 17 23:45:32.937822 kubelet[2719]: E0417 23:45:32.937764 2719 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Apr 17 23:45:32.940326 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Apr 17 23:45:32.940677 systemd[1]: kubelet.service: Failed with result 'exit-code'. Apr 17 23:45:33.274645 update_engine[1805]: I20260417 23:45:33.274521 1805 update_attempter.cc:509] Updating boot flags... 
Apr 17 23:45:33.348503 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 32 scanned by (udev-worker) (2759) Apr 17 23:45:33.458791 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 32 scanned by (udev-worker) (2758) Apr 17 23:45:34.147938 containerd[1834]: time="2026-04-17T23:45:34.147881581Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.24-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 17 23:45:34.151212 containerd[1834]: time="2026-04-17T23:45:34.151141708Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.24-0: active requests=0, bytes read=23719434" Apr 17 23:45:34.155788 containerd[1834]: time="2026-04-17T23:45:34.155731647Z" level=info msg="ImageCreate event name:\"sha256:8cb12dd0c3e42c6d0175d09a060358cbb68a3ecc2ba4dbb00327c7d760e1425d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 17 23:45:34.161232 containerd[1834]: time="2026-04-17T23:45:34.161078992Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:251e7e490f64859d329cd963bc879dc04acf3d7195bb52c4c50b4a07bedf37d6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 17 23:45:34.162344 containerd[1834]: time="2026-04-17T23:45:34.162194601Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.24-0\" with image id \"sha256:8cb12dd0c3e42c6d0175d09a060358cbb68a3ecc2ba4dbb00327c7d760e1425d\", repo tag \"registry.k8s.io/etcd:3.5.24-0\", repo digest \"registry.k8s.io/etcd@sha256:251e7e490f64859d329cd963bc879dc04acf3d7195bb52c4c50b4a07bedf37d6\", size \"23716032\" in 3.220861339s" Apr 17 23:45:34.162344 containerd[1834]: time="2026-04-17T23:45:34.162233402Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.24-0\" returns image reference \"sha256:8cb12dd0c3e42c6d0175d09a060358cbb68a3ecc2ba4dbb00327c7d760e1425d\"" Apr 17 23:45:37.618998 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. 
Apr 17 23:45:37.625998 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Apr 17 23:45:37.667141 systemd[1]: Reloading requested from client PID 2873 ('systemctl') (unit session-9.scope)... Apr 17 23:45:37.667160 systemd[1]: Reloading... Apr 17 23:45:37.794526 zram_generator::config[2912]: No configuration found. Apr 17 23:45:37.926305 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Apr 17 23:45:38.005656 systemd[1]: Reloading finished in 337 ms. Apr 17 23:45:38.057891 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Apr 17 23:45:38.058187 systemd[1]: kubelet.service: Failed with result 'signal'. Apr 17 23:45:38.058685 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Apr 17 23:45:38.061849 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Apr 17 23:45:38.446703 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Apr 17 23:45:38.458841 (kubelet)[2995]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Apr 17 23:45:39.239924 kubelet[2995]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Apr 17 23:45:39.239924 kubelet[2995]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Apr 17 23:45:39.239924 kubelet[2995]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. 
See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Apr 17 23:45:39.239924 kubelet[2995]: I0417 23:45:39.239745 2995 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Apr 17 23:45:39.461216 kubelet[2995]: I0417 23:45:39.461171 2995 server.go:530] "Kubelet version" kubeletVersion="v1.33.8" Apr 17 23:45:39.461216 kubelet[2995]: I0417 23:45:39.461215 2995 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 17 23:45:39.462496 kubelet[2995]: I0417 23:45:39.461908 2995 server.go:956] "Client rotation is on, will bootstrap in background" Apr 17 23:45:39.491610 kubelet[2995]: E0417 23:45:39.491514 2995 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://10.0.0.10:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.0.0.10:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError" Apr 17 23:45:39.494336 kubelet[2995]: I0417 23:45:39.493978 2995 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Apr 17 23:45:39.499026 kubelet[2995]: E0417 23:45:39.498994 2995 log.go:32] "RuntimeConfig from runtime service failed" err="rpc error: code = Unimplemented desc = unknown method RuntimeConfig for service runtime.v1.RuntimeService" Apr 17 23:45:39.499026 kubelet[2995]: I0417 23:45:39.499022 2995 server.go:1423] "CRI implementation should be updated to support RuntimeConfig when KubeletCgroupDriverFromCRI feature gate has been enabled. Falling back to using cgroupDriver from kubelet config." Apr 17 23:45:39.503512 kubelet[2995]: I0417 23:45:39.503489 2995 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Apr 17 23:45:39.503901 kubelet[2995]: I0417 23:45:39.503857 2995 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 17 23:45:39.504070 kubelet[2995]: I0417 23:45:39.503890 2995 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4081.3.6-n-7b570e9a3c","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"cgroupfs","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":1} Apr 17 23:45:39.504210 kubelet[2995]: I0417 23:45:39.504073 2995 topology_manager.go:138] "Creating topology manager with none policy" Apr 17 
23:45:39.504210 kubelet[2995]: I0417 23:45:39.504088 2995 container_manager_linux.go:303] "Creating device plugin manager" Apr 17 23:45:39.504288 kubelet[2995]: I0417 23:45:39.504225 2995 state_mem.go:36] "Initialized new in-memory state store" Apr 17 23:45:39.507920 kubelet[2995]: I0417 23:45:39.507899 2995 kubelet.go:480] "Attempting to sync node with API server" Apr 17 23:45:39.508011 kubelet[2995]: I0417 23:45:39.507924 2995 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 17 23:45:39.508011 kubelet[2995]: I0417 23:45:39.507957 2995 kubelet.go:386] "Adding apiserver pod source" Apr 17 23:45:39.509896 kubelet[2995]: I0417 23:45:39.509511 2995 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 17 23:45:39.515584 kubelet[2995]: E0417 23:45:39.515262 2995 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.0.0.10:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4081.3.6-n-7b570e9a3c&limit=500&resourceVersion=0\": dial tcp 10.0.0.10:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 17 23:45:39.515934 kubelet[2995]: E0417 23:45:39.515903 2995 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: Get \"https://10.0.0.10:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.0.10:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 17 23:45:39.516035 kubelet[2995]: I0417 23:45:39.515992 2995 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v1.7.21" apiVersion="v1" Apr 17 23:45:39.516980 kubelet[2995]: I0417 23:45:39.516788 2995 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 17 23:45:39.518498 
kubelet[2995]: W0417 23:45:39.517594 2995 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Apr 17 23:45:39.522795 kubelet[2995]: I0417 23:45:39.522773 2995 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 17 23:45:39.522876 kubelet[2995]: I0417 23:45:39.522850 2995 server.go:1289] "Started kubelet" Apr 17 23:45:39.523069 kubelet[2995]: I0417 23:45:39.523020 2995 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 17 23:45:39.524069 kubelet[2995]: I0417 23:45:39.524025 2995 server.go:317] "Adding debug handlers to kubelet server" Apr 17 23:45:39.526918 kubelet[2995]: I0417 23:45:39.526302 2995 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 17 23:45:39.526993 kubelet[2995]: I0417 23:45:39.526845 2995 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 17 23:45:39.532707 kubelet[2995]: E0417 23:45:39.530601 2995 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.0.0.10:6443/api/v1/namespaces/default/events\": dial tcp 10.0.0.10:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4081.3.6-n-7b570e9a3c.18a749a2518f1090 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4081.3.6-n-7b570e9a3c,UID:ci-4081.3.6-n-7b570e9a3c,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4081.3.6-n-7b570e9a3c,},FirstTimestamp:2026-04-17 23:45:39.52281 +0000 UTC m=+1.059941825,LastTimestamp:2026-04-17 23:45:39.52281 +0000 UTC m=+1.059941825,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4081.3.6-n-7b570e9a3c,}" Apr 17 23:45:39.535267 kubelet[2995]: I0417 23:45:39.535124 2995 
fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 17 23:45:39.535513 kubelet[2995]: I0417 23:45:39.535471 2995 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Apr 17 23:45:39.541178 kubelet[2995]: I0417 23:45:39.541156 2995 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 17 23:45:39.541588 kubelet[2995]: E0417 23:45:39.541432 2995 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4081.3.6-n-7b570e9a3c\" not found" Apr 17 23:45:39.541662 kubelet[2995]: I0417 23:45:39.541633 2995 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 17 23:45:39.541713 kubelet[2995]: I0417 23:45:39.541681 2995 reconciler.go:26] "Reconciler: start to sync state" Apr 17 23:45:39.542639 kubelet[2995]: E0417 23:45:39.542462 2995 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.10:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4081.3.6-n-7b570e9a3c?timeout=10s\": dial tcp 10.0.0.10:6443: connect: connection refused" interval="200ms" Apr 17 23:45:39.542754 kubelet[2995]: E0417 23:45:39.542717 2995 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://10.0.0.10:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.0.10:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 17 23:45:39.542965 kubelet[2995]: I0417 23:45:39.542940 2995 factory.go:223] Registration of the systemd container factory successfully Apr 17 23:45:39.543044 kubelet[2995]: I0417 23:45:39.543023 2995 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Apr 17 23:45:39.543454 
kubelet[2995]: E0417 23:45:39.543430 2995 kubelet.go:1600] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Apr 17 23:45:39.545708 kubelet[2995]: I0417 23:45:39.544612 2995 factory.go:223] Registration of the containerd container factory successfully Apr 17 23:45:39.579547 kubelet[2995]: I0417 23:45:39.579493 2995 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 17 23:45:39.582060 kubelet[2995]: I0417 23:45:39.582034 2995 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 17 23:45:39.582060 kubelet[2995]: I0417 23:45:39.582061 2995 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 17 23:45:39.582546 kubelet[2995]: I0417 23:45:39.582304 2995 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 17 23:45:39.582546 kubelet[2995]: I0417 23:45:39.582321 2995 kubelet.go:2436] "Starting kubelet main sync loop" Apr 17 23:45:39.582546 kubelet[2995]: E0417 23:45:39.582372 2995 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Apr 17 23:45:39.585168 kubelet[2995]: E0417 23:45:39.584849 2995 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://10.0.0.10:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.0.10:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Apr 17 23:45:39.637452 kubelet[2995]: I0417 23:45:39.637422 2995 cpu_manager.go:221] "Starting CPU manager" policy="none" Apr 17 23:45:39.637452 kubelet[2995]: I0417 23:45:39.637443 2995 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Apr 17 23:45:39.637643 kubelet[2995]: I0417 23:45:39.637466 2995 state_mem.go:36] 
"Initialized new in-memory state store" Apr 17 23:45:39.642034 kubelet[2995]: E0417 23:45:39.641952 2995 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4081.3.6-n-7b570e9a3c\" not found" Apr 17 23:45:39.647603 kubelet[2995]: I0417 23:45:39.647579 2995 policy_none.go:49] "None policy: Start" Apr 17 23:45:39.647603 kubelet[2995]: I0417 23:45:39.647602 2995 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 17 23:45:39.647720 kubelet[2995]: I0417 23:45:39.647615 2995 state_mem.go:35] "Initializing new in-memory state store" Apr 17 23:45:39.656335 kubelet[2995]: E0417 23:45:39.656297 2995 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 17 23:45:39.656655 kubelet[2995]: I0417 23:45:39.656516 2995 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 17 23:45:39.656655 kubelet[2995]: I0417 23:45:39.656533 2995 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 17 23:45:39.658373 kubelet[2995]: I0417 23:45:39.658263 2995 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 17 23:45:39.659304 kubelet[2995]: E0417 23:45:39.659274 2995 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" Apr 17 23:45:39.659385 kubelet[2995]: E0417 23:45:39.659324 2995 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4081.3.6-n-7b570e9a3c\" not found" Apr 17 23:45:39.694464 kubelet[2995]: E0417 23:45:39.694232 2995 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081.3.6-n-7b570e9a3c\" not found" node="ci-4081.3.6-n-7b570e9a3c" Apr 17 23:45:39.697996 kubelet[2995]: E0417 23:45:39.697969 2995 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081.3.6-n-7b570e9a3c\" not found" node="ci-4081.3.6-n-7b570e9a3c" Apr 17 23:45:39.704327 kubelet[2995]: E0417 23:45:39.704305 2995 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081.3.6-n-7b570e9a3c\" not found" node="ci-4081.3.6-n-7b570e9a3c" Apr 17 23:45:39.743067 kubelet[2995]: E0417 23:45:39.742933 2995 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.10:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4081.3.6-n-7b570e9a3c?timeout=10s\": dial tcp 10.0.0.10:6443: connect: connection refused" interval="400ms" Apr 17 23:45:39.759572 kubelet[2995]: I0417 23:45:39.759539 2995 kubelet_node_status.go:75] "Attempting to register node" node="ci-4081.3.6-n-7b570e9a3c" Apr 17 23:45:39.759887 kubelet[2995]: E0417 23:45:39.759847 2995 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.0.10:6443/api/v1/nodes\": dial tcp 10.0.0.10:6443: connect: connection refused" node="ci-4081.3.6-n-7b570e9a3c" Apr 17 23:45:39.843549 kubelet[2995]: I0417 23:45:39.843500 2995 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: 
\"kubernetes.io/host-path/098510ca48ac58c65fcc7a2941a2bb22-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4081.3.6-n-7b570e9a3c\" (UID: \"098510ca48ac58c65fcc7a2941a2bb22\") " pod="kube-system/kube-controller-manager-ci-4081.3.6-n-7b570e9a3c" Apr 17 23:45:39.843549 kubelet[2995]: I0417 23:45:39.843557 2995 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/912e86c9583d1a0f7c1dbf046a31aaa7-ca-certs\") pod \"kube-apiserver-ci-4081.3.6-n-7b570e9a3c\" (UID: \"912e86c9583d1a0f7c1dbf046a31aaa7\") " pod="kube-system/kube-apiserver-ci-4081.3.6-n-7b570e9a3c" Apr 17 23:45:39.843834 kubelet[2995]: I0417 23:45:39.843585 2995 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/912e86c9583d1a0f7c1dbf046a31aaa7-k8s-certs\") pod \"kube-apiserver-ci-4081.3.6-n-7b570e9a3c\" (UID: \"912e86c9583d1a0f7c1dbf046a31aaa7\") " pod="kube-system/kube-apiserver-ci-4081.3.6-n-7b570e9a3c" Apr 17 23:45:39.843834 kubelet[2995]: I0417 23:45:39.843627 2995 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/912e86c9583d1a0f7c1dbf046a31aaa7-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4081.3.6-n-7b570e9a3c\" (UID: \"912e86c9583d1a0f7c1dbf046a31aaa7\") " pod="kube-system/kube-apiserver-ci-4081.3.6-n-7b570e9a3c" Apr 17 23:45:39.843834 kubelet[2995]: I0417 23:45:39.843653 2995 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/098510ca48ac58c65fcc7a2941a2bb22-ca-certs\") pod \"kube-controller-manager-ci-4081.3.6-n-7b570e9a3c\" (UID: \"098510ca48ac58c65fcc7a2941a2bb22\") " pod="kube-system/kube-controller-manager-ci-4081.3.6-n-7b570e9a3c" Apr 17 23:45:39.843834 kubelet[2995]: I0417 
23:45:39.843673 2995 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/098510ca48ac58c65fcc7a2941a2bb22-flexvolume-dir\") pod \"kube-controller-manager-ci-4081.3.6-n-7b570e9a3c\" (UID: \"098510ca48ac58c65fcc7a2941a2bb22\") " pod="kube-system/kube-controller-manager-ci-4081.3.6-n-7b570e9a3c" Apr 17 23:45:39.843834 kubelet[2995]: I0417 23:45:39.843692 2995 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/098510ca48ac58c65fcc7a2941a2bb22-k8s-certs\") pod \"kube-controller-manager-ci-4081.3.6-n-7b570e9a3c\" (UID: \"098510ca48ac58c65fcc7a2941a2bb22\") " pod="kube-system/kube-controller-manager-ci-4081.3.6-n-7b570e9a3c" Apr 17 23:45:39.843999 kubelet[2995]: I0417 23:45:39.843713 2995 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/098510ca48ac58c65fcc7a2941a2bb22-kubeconfig\") pod \"kube-controller-manager-ci-4081.3.6-n-7b570e9a3c\" (UID: \"098510ca48ac58c65fcc7a2941a2bb22\") " pod="kube-system/kube-controller-manager-ci-4081.3.6-n-7b570e9a3c" Apr 17 23:45:39.843999 kubelet[2995]: I0417 23:45:39.843731 2995 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/47204f90011e134f9e01dc8033820d59-kubeconfig\") pod \"kube-scheduler-ci-4081.3.6-n-7b570e9a3c\" (UID: \"47204f90011e134f9e01dc8033820d59\") " pod="kube-system/kube-scheduler-ci-4081.3.6-n-7b570e9a3c" Apr 17 23:45:39.961828 kubelet[2995]: I0417 23:45:39.961797 2995 kubelet_node_status.go:75] "Attempting to register node" node="ci-4081.3.6-n-7b570e9a3c" Apr 17 23:45:39.962195 kubelet[2995]: E0417 23:45:39.962161 2995 kubelet_node_status.go:107] "Unable to register node with API server" err="Post 
\"https://10.0.0.10:6443/api/v1/nodes\": dial tcp 10.0.0.10:6443: connect: connection refused" node="ci-4081.3.6-n-7b570e9a3c" Apr 17 23:45:39.995501 containerd[1834]: time="2026-04-17T23:45:39.995354809Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4081.3.6-n-7b570e9a3c,Uid:912e86c9583d1a0f7c1dbf046a31aaa7,Namespace:kube-system,Attempt:0,}" Apr 17 23:45:40.001021 containerd[1834]: time="2026-04-17T23:45:40.000984093Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4081.3.6-n-7b570e9a3c,Uid:098510ca48ac58c65fcc7a2941a2bb22,Namespace:kube-system,Attempt:0,}" Apr 17 23:45:40.005829 containerd[1834]: time="2026-04-17T23:45:40.005504060Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4081.3.6-n-7b570e9a3c,Uid:47204f90011e134f9e01dc8033820d59,Namespace:kube-system,Attempt:0,}" Apr 17 23:45:40.144021 kubelet[2995]: E0417 23:45:40.143981 2995 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.10:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4081.3.6-n-7b570e9a3c?timeout=10s\": dial tcp 10.0.0.10:6443: connect: connection refused" interval="800ms" Apr 17 23:45:40.364158 kubelet[2995]: I0417 23:45:40.364126 2995 kubelet_node_status.go:75] "Attempting to register node" node="ci-4081.3.6-n-7b570e9a3c" Apr 17 23:45:40.364645 kubelet[2995]: E0417 23:45:40.364499 2995 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.0.10:6443/api/v1/nodes\": dial tcp 10.0.0.10:6443: connect: connection refused" node="ci-4081.3.6-n-7b570e9a3c" Apr 17 23:45:40.607391 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3820564045.mount: Deactivated successfully. 
Apr 17 23:45:40.663239 containerd[1834]: time="2026-04-17T23:45:40.663105215Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Apr 17 23:45:40.666147 containerd[1834]: time="2026-04-17T23:45:40.666038458Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=312064" Apr 17 23:45:40.669627 containerd[1834]: time="2026-04-17T23:45:40.669587311Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Apr 17 23:45:40.672521 containerd[1834]: time="2026-04-17T23:45:40.672314851Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Apr 17 23:45:40.676572 containerd[1834]: time="2026-04-17T23:45:40.676525814Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0" Apr 17 23:45:40.679873 containerd[1834]: time="2026-04-17T23:45:40.679833663Z" level=info msg="ImageCreate event name:\"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Apr 17 23:45:40.682858 containerd[1834]: time="2026-04-17T23:45:40.682589904Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0" Apr 17 23:45:40.687804 containerd[1834]: time="2026-04-17T23:45:40.687774981Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Apr 17 23:45:40.688570 
containerd[1834]: time="2026-04-17T23:45:40.688531392Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 687.475198ms" Apr 17 23:45:40.690041 containerd[1834]: time="2026-04-17T23:45:40.690003614Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 694.559203ms" Apr 17 23:45:40.692728 containerd[1834]: time="2026-04-17T23:45:40.692696254Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 687.133893ms" Apr 17 23:45:40.748911 kubelet[2995]: E0417 23:45:40.748859 2995 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://10.0.0.10:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.0.10:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 17 23:45:40.759620 kubelet[2995]: E0417 23:45:40.759577 2995 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.0.0.10:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4081.3.6-n-7b570e9a3c&limit=500&resourceVersion=0\": dial tcp 10.0.0.10:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" 
type="*v1.Node" Apr 17 23:45:40.945555 kubelet[2995]: E0417 23:45:40.945406 2995 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.10:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4081.3.6-n-7b570e9a3c?timeout=10s\": dial tcp 10.0.0.10:6443: connect: connection refused" interval="1.6s" Apr 17 23:45:40.984444 kubelet[2995]: E0417 23:45:40.984400 2995 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://10.0.0.10:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.0.10:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Apr 17 23:45:41.075322 kubelet[2995]: E0417 23:45:41.075274 2995 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: Get \"https://10.0.0.10:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.0.10:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 17 23:45:41.168501 kubelet[2995]: I0417 23:45:41.168421 2995 kubelet_node_status.go:75] "Attempting to register node" node="ci-4081.3.6-n-7b570e9a3c" Apr 17 23:45:41.169984 kubelet[2995]: E0417 23:45:41.168829 2995 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.0.10:6443/api/v1/nodes\": dial tcp 10.0.0.10:6443: connect: connection refused" node="ci-4081.3.6-n-7b570e9a3c" Apr 17 23:45:41.390546 containerd[1834]: time="2026-04-17T23:45:41.390353803Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Apr 17 23:45:41.390546 containerd[1834]: time="2026-04-17T23:45:41.390422304Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Apr 17 23:45:41.390546 containerd[1834]: time="2026-04-17T23:45:41.390457205Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 17 23:45:41.391268 containerd[1834]: time="2026-04-17T23:45:41.390677508Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 17 23:45:41.401870 containerd[1834]: time="2026-04-17T23:45:41.401670571Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Apr 17 23:45:41.401870 containerd[1834]: time="2026-04-17T23:45:41.401812673Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Apr 17 23:45:41.403365 containerd[1834]: time="2026-04-17T23:45:41.402476383Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 17 23:45:41.403365 containerd[1834]: time="2026-04-17T23:45:41.402609485Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 17 23:45:41.404100 containerd[1834]: time="2026-04-17T23:45:41.403842203Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Apr 17 23:45:41.404100 containerd[1834]: time="2026-04-17T23:45:41.403898804Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Apr 17 23:45:41.404100 containerd[1834]: time="2026-04-17T23:45:41.403920804Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 17 23:45:41.404100 containerd[1834]: time="2026-04-17T23:45:41.404014906Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 17 23:45:41.499505 kubelet[2995]: E0417 23:45:41.499159 2995 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://10.0.0.10:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.0.0.10:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError" Apr 17 23:45:41.511279 containerd[1834]: time="2026-04-17T23:45:41.510930692Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4081.3.6-n-7b570e9a3c,Uid:912e86c9583d1a0f7c1dbf046a31aaa7,Namespace:kube-system,Attempt:0,} returns sandbox id \"db497f7610c10c6966ad49fb5272a95f2f1eb8f7ba291fedc87dda069e5ffc5f\"" Apr 17 23:45:41.520908 containerd[1834]: time="2026-04-17T23:45:41.520597635Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4081.3.6-n-7b570e9a3c,Uid:47204f90011e134f9e01dc8033820d59,Namespace:kube-system,Attempt:0,} returns sandbox id \"7a7e8199da3c89682d81af57688dbfb6b846b0b1a14909050eedeb1300037598\"" Apr 17 23:45:41.524625 containerd[1834]: time="2026-04-17T23:45:41.524587694Z" level=info msg="CreateContainer within sandbox \"db497f7610c10c6966ad49fb5272a95f2f1eb8f7ba291fedc87dda069e5ffc5f\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Apr 17 23:45:41.524910 containerd[1834]: time="2026-04-17T23:45:41.524689296Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4081.3.6-n-7b570e9a3c,Uid:098510ca48ac58c65fcc7a2941a2bb22,Namespace:kube-system,Attempt:0,} returns sandbox id \"dff92f14260cece060bc574e5b8219d5e615045ce6254dad6e332e911d437d5e\"" Apr 17 23:45:41.532396 
containerd[1834]: time="2026-04-17T23:45:41.532362110Z" level=info msg="CreateContainer within sandbox \"7a7e8199da3c89682d81af57688dbfb6b846b0b1a14909050eedeb1300037598\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Apr 17 23:45:41.537462 containerd[1834]: time="2026-04-17T23:45:41.537166381Z" level=info msg="CreateContainer within sandbox \"dff92f14260cece060bc574e5b8219d5e615045ce6254dad6e332e911d437d5e\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Apr 17 23:45:41.565852 containerd[1834]: time="2026-04-17T23:45:41.565812706Z" level=info msg="CreateContainer within sandbox \"db497f7610c10c6966ad49fb5272a95f2f1eb8f7ba291fedc87dda069e5ffc5f\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"5e2cee0583fce5596559566739ad924689e737a259720792cefc48f3518839fa\"" Apr 17 23:45:41.566684 containerd[1834]: time="2026-04-17T23:45:41.566654718Z" level=info msg="StartContainer for \"5e2cee0583fce5596559566739ad924689e737a259720792cefc48f3518839fa\"" Apr 17 23:45:41.622359 containerd[1834]: time="2026-04-17T23:45:41.622310244Z" level=info msg="CreateContainer within sandbox \"7a7e8199da3c89682d81af57688dbfb6b846b0b1a14909050eedeb1300037598\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"43afc3614120551a2748e5f4ee5f35f6fd9f55e1c7f917080f73725ceb4a12b6\"" Apr 17 23:45:41.623158 containerd[1834]: time="2026-04-17T23:45:41.623129256Z" level=info msg="StartContainer for \"43afc3614120551a2748e5f4ee5f35f6fd9f55e1c7f917080f73725ceb4a12b6\"" Apr 17 23:45:41.656193 containerd[1834]: time="2026-04-17T23:45:41.655915942Z" level=info msg="CreateContainer within sandbox \"dff92f14260cece060bc574e5b8219d5e615045ce6254dad6e332e911d437d5e\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"2f0bbdcf093c1d51aeec2680be03e31bef36f553f06862211d4759ce41bfd103\"" Apr 17 23:45:41.659500 containerd[1834]: time="2026-04-17T23:45:41.658068374Z" level=info 
msg="StartContainer for \"2f0bbdcf093c1d51aeec2680be03e31bef36f553f06862211d4759ce41bfd103\"" Apr 17 23:45:41.685695 containerd[1834]: time="2026-04-17T23:45:41.684560667Z" level=info msg="StartContainer for \"5e2cee0583fce5596559566739ad924689e737a259720792cefc48f3518839fa\" returns successfully" Apr 17 23:45:41.790046 containerd[1834]: time="2026-04-17T23:45:41.789931230Z" level=info msg="StartContainer for \"43afc3614120551a2748e5f4ee5f35f6fd9f55e1c7f917080f73725ceb4a12b6\" returns successfully" Apr 17 23:45:41.823462 containerd[1834]: time="2026-04-17T23:45:41.823406427Z" level=info msg="StartContainer for \"2f0bbdcf093c1d51aeec2680be03e31bef36f553f06862211d4759ce41bfd103\" returns successfully" Apr 17 23:45:42.612141 kubelet[2995]: E0417 23:45:42.612104 2995 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081.3.6-n-7b570e9a3c\" not found" node="ci-4081.3.6-n-7b570e9a3c" Apr 17 23:45:42.622240 kubelet[2995]: E0417 23:45:42.621113 2995 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081.3.6-n-7b570e9a3c\" not found" node="ci-4081.3.6-n-7b570e9a3c" Apr 17 23:45:42.622240 kubelet[2995]: E0417 23:45:42.621575 2995 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081.3.6-n-7b570e9a3c\" not found" node="ci-4081.3.6-n-7b570e9a3c" Apr 17 23:45:42.773695 kubelet[2995]: I0417 23:45:42.773665 2995 kubelet_node_status.go:75] "Attempting to register node" node="ci-4081.3.6-n-7b570e9a3c" Apr 17 23:45:43.328015 kubelet[2995]: E0417 23:45:43.327968 2995 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4081.3.6-n-7b570e9a3c\" not found" node="ci-4081.3.6-n-7b570e9a3c" Apr 17 23:45:43.441981 kubelet[2995]: I0417 23:45:43.441379 2995 kubelet_node_status.go:78] "Successfully registered node" node="ci-4081.3.6-n-7b570e9a3c" Apr 17 
23:45:43.441981 kubelet[2995]: E0417 23:45:43.441419 2995 kubelet_node_status.go:548] "Error updating node status, will retry" err="error getting node \"ci-4081.3.6-n-7b570e9a3c\": node \"ci-4081.3.6-n-7b570e9a3c\" not found" Apr 17 23:45:43.444524 kubelet[2995]: I0417 23:45:43.442815 2995 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4081.3.6-n-7b570e9a3c" Apr 17 23:45:43.454743 kubelet[2995]: E0417 23:45:43.454544 2995 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4081.3.6-n-7b570e9a3c\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4081.3.6-n-7b570e9a3c" Apr 17 23:45:43.454743 kubelet[2995]: I0417 23:45:43.454669 2995 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4081.3.6-n-7b570e9a3c" Apr 17 23:45:43.459927 kubelet[2995]: E0417 23:45:43.459689 2995 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4081.3.6-n-7b570e9a3c\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4081.3.6-n-7b570e9a3c" Apr 17 23:45:43.459927 kubelet[2995]: I0417 23:45:43.459726 2995 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4081.3.6-n-7b570e9a3c" Apr 17 23:45:43.461937 kubelet[2995]: E0417 23:45:43.461885 2995 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4081.3.6-n-7b570e9a3c\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-ci-4081.3.6-n-7b570e9a3c" Apr 17 23:45:43.517351 kubelet[2995]: I0417 23:45:43.517070 2995 apiserver.go:52] "Watching apiserver" Apr 17 23:45:43.542660 kubelet[2995]: I0417 23:45:43.542617 2995 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 17 23:45:43.620077 kubelet[2995]: I0417 
23:45:43.619682 2995 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4081.3.6-n-7b570e9a3c" Apr 17 23:45:43.620077 kubelet[2995]: I0417 23:45:43.619794 2995 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4081.3.6-n-7b570e9a3c" Apr 17 23:45:43.622249 kubelet[2995]: E0417 23:45:43.622149 2995 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4081.3.6-n-7b570e9a3c\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4081.3.6-n-7b570e9a3c" Apr 17 23:45:43.622494 kubelet[2995]: I0417 23:45:43.622386 2995 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4081.3.6-n-7b570e9a3c" Apr 17 23:45:43.624382 kubelet[2995]: E0417 23:45:43.624348 2995 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4081.3.6-n-7b570e9a3c\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-ci-4081.3.6-n-7b570e9a3c" Apr 17 23:45:43.624714 kubelet[2995]: E0417 23:45:43.624677 2995 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4081.3.6-n-7b570e9a3c\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4081.3.6-n-7b570e9a3c" Apr 17 23:45:44.622988 kubelet[2995]: I0417 23:45:44.622541 2995 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4081.3.6-n-7b570e9a3c" Apr 17 23:45:44.622988 kubelet[2995]: I0417 23:45:44.622798 2995 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4081.3.6-n-7b570e9a3c" Apr 17 23:45:44.630018 kubelet[2995]: I0417 23:45:44.629937 2995 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not 
contain dots]" Apr 17 23:45:44.634434 kubelet[2995]: I0417 23:45:44.634412 2995 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 17 23:45:45.721845 systemd[1]: Reloading requested from client PID 3274 ('systemctl') (unit session-9.scope)... Apr 17 23:45:45.721864 systemd[1]: Reloading... Apr 17 23:45:45.812093 zram_generator::config[3314]: No configuration found. Apr 17 23:45:45.962009 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Apr 17 23:45:46.051105 systemd[1]: Reloading finished in 328 ms. Apr 17 23:45:46.085456 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Apr 17 23:45:46.107528 systemd[1]: kubelet.service: Deactivated successfully. Apr 17 23:45:46.108140 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Apr 17 23:45:46.115814 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Apr 17 23:45:46.254912 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Apr 17 23:45:46.268944 (kubelet)[3392]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Apr 17 23:45:46.310855 kubelet[3392]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Apr 17 23:45:46.311523 kubelet[3392]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. 
Apr 17 23:45:46.311523 kubelet[3392]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Apr 17 23:45:46.311725 kubelet[3392]: I0417 23:45:46.311636 3392 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Apr 17 23:45:46.317902 kubelet[3392]: I0417 23:45:46.317869 3392 server.go:530] "Kubelet version" kubeletVersion="v1.33.8" Apr 17 23:45:46.317902 kubelet[3392]: I0417 23:45:46.317894 3392 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 17 23:45:46.318162 kubelet[3392]: I0417 23:45:46.318141 3392 server.go:956] "Client rotation is on, will bootstrap in background" Apr 17 23:45:46.319304 kubelet[3392]: I0417 23:45:46.319279 3392 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-client-current.pem" Apr 17 23:45:46.322545 kubelet[3392]: I0417 23:45:46.321922 3392 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Apr 17 23:45:46.325360 kubelet[3392]: E0417 23:45:46.325320 3392 log.go:32] "RuntimeConfig from runtime service failed" err="rpc error: code = Unimplemented desc = unknown method RuntimeConfig for service runtime.v1.RuntimeService" Apr 17 23:45:46.325452 kubelet[3392]: I0417 23:45:46.325364 3392 server.go:1423] "CRI implementation should be updated to support RuntimeConfig when KubeletCgroupDriverFromCRI feature gate has been enabled. Falling back to using cgroupDriver from kubelet config." Apr 17 23:45:46.331224 kubelet[3392]: I0417 23:45:46.330825 3392 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Apr 17 23:45:46.331408 kubelet[3392]: I0417 23:45:46.331343 3392 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 17 23:45:46.331614 kubelet[3392]: I0417 23:45:46.331384 3392 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4081.3.6-n-7b570e9a3c","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"cgroupfs","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":1} Apr 17 23:45:46.331767 kubelet[3392]: I0417 23:45:46.331623 3392 topology_manager.go:138] "Creating topology manager with none policy" Apr 17 
23:45:46.331767 kubelet[3392]: I0417 23:45:46.331638 3392 container_manager_linux.go:303] "Creating device plugin manager" Apr 17 23:45:46.331767 kubelet[3392]: I0417 23:45:46.331694 3392 state_mem.go:36] "Initialized new in-memory state store" Apr 17 23:45:46.331890 kubelet[3392]: I0417 23:45:46.331880 3392 kubelet.go:480] "Attempting to sync node with API server" Apr 17 23:45:46.332546 kubelet[3392]: I0417 23:45:46.331896 3392 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 17 23:45:46.332546 kubelet[3392]: I0417 23:45:46.331926 3392 kubelet.go:386] "Adding apiserver pod source" Apr 17 23:45:46.332546 kubelet[3392]: I0417 23:45:46.331946 3392 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 17 23:45:46.334999 kubelet[3392]: I0417 23:45:46.334977 3392 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v1.7.21" apiVersion="v1" Apr 17 23:45:46.335672 kubelet[3392]: I0417 23:45:46.335650 3392 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 17 23:45:46.345679 kubelet[3392]: I0417 23:45:46.345657 3392 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 17 23:45:46.345771 kubelet[3392]: I0417 23:45:46.345697 3392 server.go:1289] "Started kubelet" Apr 17 23:45:46.348102 kubelet[3392]: I0417 23:45:46.348082 3392 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 17 23:45:46.354956 kubelet[3392]: I0417 23:45:46.354927 3392 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 17 23:45:46.356247 kubelet[3392]: I0417 23:45:46.356053 3392 server.go:317] "Adding debug handlers to kubelet server" Apr 17 23:45:46.359702 kubelet[3392]: I0417 23:45:46.359685 3392 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 17 23:45:46.360170 kubelet[3392]: I0417 23:45:46.360107 3392 ratelimit.go:55] "Setting rate limiting for 
endpoint" service="podresources" qps=100 burstTokens=10 Apr 17 23:45:46.360413 kubelet[3392]: I0417 23:45:46.360392 3392 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 17 23:45:46.360745 kubelet[3392]: I0417 23:45:46.360722 3392 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Apr 17 23:45:46.365588 kubelet[3392]: I0417 23:45:46.365569 3392 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 17 23:45:46.365806 kubelet[3392]: I0417 23:45:46.365794 3392 reconciler.go:26] "Reconciler: start to sync state" Apr 17 23:45:46.367703 kubelet[3392]: I0417 23:45:46.367678 3392 factory.go:223] Registration of the systemd container factory successfully Apr 17 23:45:46.368003 kubelet[3392]: I0417 23:45:46.367791 3392 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Apr 17 23:45:46.368780 kubelet[3392]: I0417 23:45:46.368758 3392 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 17 23:45:46.370591 kubelet[3392]: I0417 23:45:46.370573 3392 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 17 23:45:46.370771 kubelet[3392]: I0417 23:45:46.370679 3392 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 17 23:45:46.370771 kubelet[3392]: I0417 23:45:46.370707 3392 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 17 23:45:46.370771 kubelet[3392]: I0417 23:45:46.370716 3392 kubelet.go:2436] "Starting kubelet main sync loop" Apr 17 23:45:46.371131 kubelet[3392]: E0417 23:45:46.370761 3392 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Apr 17 23:45:46.373525 kubelet[3392]: I0417 23:45:46.373506 3392 factory.go:223] Registration of the containerd container factory successfully Apr 17 23:45:46.904835 kubelet[3392]: E0417 23:45:46.904589 3392 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Apr 17 23:45:46.982810 kubelet[3392]: E0417 23:45:46.910280 3392 kubelet.go:1600] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Apr 17 23:45:46.982810 kubelet[3392]: I0417 23:45:46.962125 3392 cpu_manager.go:221] "Starting CPU manager" policy="none" Apr 17 23:45:46.982810 kubelet[3392]: I0417 23:45:46.962147 3392 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Apr 17 23:45:46.982810 kubelet[3392]: I0417 23:45:46.962176 3392 state_mem.go:36] "Initialized new in-memory state store" Apr 17 23:45:46.982810 kubelet[3392]: I0417 23:45:46.981688 3392 state_mem.go:88] "Updated default CPUSet" cpuSet="" Apr 17 23:45:46.982810 kubelet[3392]: I0417 23:45:46.981711 3392 state_mem.go:96] "Updated CPUSet assignments" assignments={} Apr 17 23:45:46.982810 kubelet[3392]: I0417 23:45:46.981734 3392 policy_none.go:49] "None policy: Start" Apr 17 23:45:46.982810 kubelet[3392]: I0417 23:45:46.981749 3392 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 17 23:45:46.982810 kubelet[3392]: I0417 23:45:46.981764 3392 state_mem.go:35] "Initializing new in-memory state store" Apr 17 23:45:46.982810 kubelet[3392]: I0417 23:45:46.981873 3392 state_mem.go:75] "Updated machine 
memory state" Apr 17 23:45:46.983460 kubelet[3392]: E0417 23:45:46.983432 3392 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 17 23:45:46.983673 kubelet[3392]: I0417 23:45:46.983656 3392 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 17 23:45:46.983738 kubelet[3392]: I0417 23:45:46.983677 3392 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 17 23:45:46.986426 kubelet[3392]: I0417 23:45:46.984779 3392 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 17 23:45:46.989020 kubelet[3392]: E0417 23:45:46.987144 3392 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Apr 17 23:45:47.094010 kubelet[3392]: I0417 23:45:47.093048 3392 kubelet_node_status.go:75] "Attempting to register node" node="ci-4081.3.6-n-7b570e9a3c" Apr 17 23:45:47.106322 kubelet[3392]: I0417 23:45:47.106013 3392 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4081.3.6-n-7b570e9a3c" Apr 17 23:45:47.106610 kubelet[3392]: I0417 23:45:47.106570 3392 kubelet_node_status.go:124] "Node was previously registered" node="ci-4081.3.6-n-7b570e9a3c" Apr 17 23:45:47.106662 kubelet[3392]: I0417 23:45:47.106642 3392 kubelet_node_status.go:78] "Successfully registered node" node="ci-4081.3.6-n-7b570e9a3c" Apr 17 23:45:47.109497 kubelet[3392]: I0417 23:45:47.108743 3392 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4081.3.6-n-7b570e9a3c" Apr 17 23:45:47.112418 kubelet[3392]: I0417 23:45:47.112297 3392 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4081.3.6-n-7b570e9a3c" Apr 17 23:45:47.120212 kubelet[3392]: I0417 23:45:47.119952 3392 warnings.go:110] "Warning: metadata.name: this is used in the Pod's 
hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 17 23:45:47.127364 kubelet[3392]: I0417 23:45:47.126783 3392 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 17 23:45:47.127364 kubelet[3392]: I0417 23:45:47.126878 3392 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 17 23:45:47.127364 kubelet[3392]: E0417 23:45:47.126924 3392 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4081.3.6-n-7b570e9a3c\" already exists" pod="kube-system/kube-scheduler-ci-4081.3.6-n-7b570e9a3c" Apr 17 23:45:47.127364 kubelet[3392]: E0417 23:45:47.127244 3392 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4081.3.6-n-7b570e9a3c\" already exists" pod="kube-system/kube-apiserver-ci-4081.3.6-n-7b570e9a3c" Apr 17 23:45:47.207429 kubelet[3392]: I0417 23:45:47.206869 3392 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/912e86c9583d1a0f7c1dbf046a31aaa7-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4081.3.6-n-7b570e9a3c\" (UID: \"912e86c9583d1a0f7c1dbf046a31aaa7\") " pod="kube-system/kube-apiserver-ci-4081.3.6-n-7b570e9a3c" Apr 17 23:45:47.207429 kubelet[3392]: I0417 23:45:47.206922 3392 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/098510ca48ac58c65fcc7a2941a2bb22-ca-certs\") pod \"kube-controller-manager-ci-4081.3.6-n-7b570e9a3c\" (UID: \"098510ca48ac58c65fcc7a2941a2bb22\") " pod="kube-system/kube-controller-manager-ci-4081.3.6-n-7b570e9a3c" Apr 17 23:45:47.207429 kubelet[3392]: I0417 23:45:47.206954 3392 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/098510ca48ac58c65fcc7a2941a2bb22-flexvolume-dir\") pod \"kube-controller-manager-ci-4081.3.6-n-7b570e9a3c\" (UID: \"098510ca48ac58c65fcc7a2941a2bb22\") " pod="kube-system/kube-controller-manager-ci-4081.3.6-n-7b570e9a3c" Apr 17 23:45:47.207429 kubelet[3392]: I0417 23:45:47.206978 3392 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/098510ca48ac58c65fcc7a2941a2bb22-kubeconfig\") pod \"kube-controller-manager-ci-4081.3.6-n-7b570e9a3c\" (UID: \"098510ca48ac58c65fcc7a2941a2bb22\") " pod="kube-system/kube-controller-manager-ci-4081.3.6-n-7b570e9a3c" Apr 17 23:45:47.207429 kubelet[3392]: I0417 23:45:47.207008 3392 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/098510ca48ac58c65fcc7a2941a2bb22-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4081.3.6-n-7b570e9a3c\" (UID: \"098510ca48ac58c65fcc7a2941a2bb22\") " pod="kube-system/kube-controller-manager-ci-4081.3.6-n-7b570e9a3c" Apr 17 23:45:47.207758 kubelet[3392]: I0417 23:45:47.207032 3392 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/912e86c9583d1a0f7c1dbf046a31aaa7-ca-certs\") pod \"kube-apiserver-ci-4081.3.6-n-7b570e9a3c\" (UID: \"912e86c9583d1a0f7c1dbf046a31aaa7\") " pod="kube-system/kube-apiserver-ci-4081.3.6-n-7b570e9a3c" Apr 17 23:45:47.207758 kubelet[3392]: I0417 23:45:47.207055 3392 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/912e86c9583d1a0f7c1dbf046a31aaa7-k8s-certs\") pod \"kube-apiserver-ci-4081.3.6-n-7b570e9a3c\" (UID: 
\"912e86c9583d1a0f7c1dbf046a31aaa7\") " pod="kube-system/kube-apiserver-ci-4081.3.6-n-7b570e9a3c" Apr 17 23:45:47.207758 kubelet[3392]: I0417 23:45:47.207083 3392 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/098510ca48ac58c65fcc7a2941a2bb22-k8s-certs\") pod \"kube-controller-manager-ci-4081.3.6-n-7b570e9a3c\" (UID: \"098510ca48ac58c65fcc7a2941a2bb22\") " pod="kube-system/kube-controller-manager-ci-4081.3.6-n-7b570e9a3c" Apr 17 23:45:47.207758 kubelet[3392]: I0417 23:45:47.207108 3392 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/47204f90011e134f9e01dc8033820d59-kubeconfig\") pod \"kube-scheduler-ci-4081.3.6-n-7b570e9a3c\" (UID: \"47204f90011e134f9e01dc8033820d59\") " pod="kube-system/kube-scheduler-ci-4081.3.6-n-7b570e9a3c" Apr 17 23:45:47.340036 kubelet[3392]: I0417 23:45:47.339275 3392 apiserver.go:52] "Watching apiserver" Apr 17 23:45:47.366507 kubelet[3392]: I0417 23:45:47.366461 3392 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 17 23:45:47.468696 kubelet[3392]: I0417 23:45:47.468546 3392 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4081.3.6-n-7b570e9a3c" podStartSLOduration=0.468515551 podStartE2EDuration="468.515551ms" podCreationTimestamp="2026-04-17 23:45:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 23:45:47.467869745 +0000 UTC m=+1.193835687" watchObservedRunningTime="2026-04-17 23:45:47.468515551 +0000 UTC m=+1.194481593" Apr 17 23:45:47.490124 kubelet[3392]: I0417 23:45:47.490062 3392 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4081.3.6-n-7b570e9a3c" 
podStartSLOduration=3.490048255 podStartE2EDuration="3.490048255s" podCreationTimestamp="2026-04-17 23:45:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 23:45:47.480846768 +0000 UTC m=+1.206812810" watchObservedRunningTime="2026-04-17 23:45:47.490048255 +0000 UTC m=+1.216014197" Apr 17 23:45:47.490427 kubelet[3392]: I0417 23:45:47.490199 3392 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4081.3.6-n-7b570e9a3c" podStartSLOduration=3.490193156 podStartE2EDuration="3.490193156s" podCreationTimestamp="2026-04-17 23:45:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 23:45:47.489893353 +0000 UTC m=+1.215859295" watchObservedRunningTime="2026-04-17 23:45:47.490193156 +0000 UTC m=+1.216159098" Apr 17 23:45:50.237616 kubelet[3392]: I0417 23:45:50.237572 3392 kuberuntime_manager.go:1746] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Apr 17 23:45:50.238517 containerd[1834]: time="2026-04-17T23:45:50.238441337Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." 
Apr 17 23:45:50.238945 kubelet[3392]: I0417 23:45:50.238714 3392 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Apr 17 23:45:50.729132 kubelet[3392]: I0417 23:45:50.728974 3392 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/8688509a-cf76-42e9-ae2e-3636e116c28b-kube-proxy\") pod \"kube-proxy-cvsgh\" (UID: \"8688509a-cf76-42e9-ae2e-3636e116c28b\") " pod="kube-system/kube-proxy-cvsgh" Apr 17 23:45:50.729132 kubelet[3392]: I0417 23:45:50.729023 3392 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/8688509a-cf76-42e9-ae2e-3636e116c28b-xtables-lock\") pod \"kube-proxy-cvsgh\" (UID: \"8688509a-cf76-42e9-ae2e-3636e116c28b\") " pod="kube-system/kube-proxy-cvsgh" Apr 17 23:45:50.729132 kubelet[3392]: I0417 23:45:50.729048 3392 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8688509a-cf76-42e9-ae2e-3636e116c28b-lib-modules\") pod \"kube-proxy-cvsgh\" (UID: \"8688509a-cf76-42e9-ae2e-3636e116c28b\") " pod="kube-system/kube-proxy-cvsgh" Apr 17 23:45:50.729132 kubelet[3392]: I0417 23:45:50.729078 3392 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mb57s\" (UniqueName: \"kubernetes.io/projected/8688509a-cf76-42e9-ae2e-3636e116c28b-kube-api-access-mb57s\") pod \"kube-proxy-cvsgh\" (UID: \"8688509a-cf76-42e9-ae2e-3636e116c28b\") " pod="kube-system/kube-proxy-cvsgh" Apr 17 23:45:50.835022 kubelet[3392]: E0417 23:45:50.834986 3392 projected.go:289] Couldn't get configMap kube-system/kube-root-ca.crt: configmap "kube-root-ca.crt" not found Apr 17 23:45:50.835022 kubelet[3392]: E0417 23:45:50.835019 3392 projected.go:194] Error preparing data for projected volume 
kube-api-access-mb57s for pod kube-system/kube-proxy-cvsgh: configmap "kube-root-ca.crt" not found Apr 17 23:45:50.835235 kubelet[3392]: E0417 23:45:50.835095 3392 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8688509a-cf76-42e9-ae2e-3636e116c28b-kube-api-access-mb57s podName:8688509a-cf76-42e9-ae2e-3636e116c28b nodeName:}" failed. No retries permitted until 2026-04-17 23:45:51.335071377 +0000 UTC m=+5.061037319 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-mb57s" (UniqueName: "kubernetes.io/projected/8688509a-cf76-42e9-ae2e-3636e116c28b-kube-api-access-mb57s") pod "kube-proxy-cvsgh" (UID: "8688509a-cf76-42e9-ae2e-3636e116c28b") : configmap "kube-root-ca.crt" not found Apr 17 23:45:51.535096 kubelet[3392]: I0417 23:45:51.535032 3392 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/dd889ac1-fb09-430c-8fca-446212ea9645-var-lib-calico\") pod \"tigera-operator-6bf85f8dd-7cp82\" (UID: \"dd889ac1-fb09-430c-8fca-446212ea9645\") " pod="tigera-operator/tigera-operator-6bf85f8dd-7cp82" Apr 17 23:45:51.535096 kubelet[3392]: I0417 23:45:51.535093 3392 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brbdc\" (UniqueName: \"kubernetes.io/projected/dd889ac1-fb09-430c-8fca-446212ea9645-kube-api-access-brbdc\") pod \"tigera-operator-6bf85f8dd-7cp82\" (UID: \"dd889ac1-fb09-430c-8fca-446212ea9645\") " pod="tigera-operator/tigera-operator-6bf85f8dd-7cp82" Apr 17 23:45:51.587906 containerd[1834]: time="2026-04-17T23:45:51.587863023Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-cvsgh,Uid:8688509a-cf76-42e9-ae2e-3636e116c28b,Namespace:kube-system,Attempt:0,}" Apr 17 23:45:51.631096 containerd[1834]: time="2026-04-17T23:45:51.630579058Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Apr 17 23:45:51.631096 containerd[1834]: time="2026-04-17T23:45:51.630640659Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Apr 17 23:45:51.631096 containerd[1834]: time="2026-04-17T23:45:51.630677659Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 17 23:45:51.631096 containerd[1834]: time="2026-04-17T23:45:51.630775260Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 17 23:45:51.680630 containerd[1834]: time="2026-04-17T23:45:51.680581068Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-cvsgh,Uid:8688509a-cf76-42e9-ae2e-3636e116c28b,Namespace:kube-system,Attempt:0,} returns sandbox id \"b201e2c78efb46f6edf3e17cb4ec7895c2d5aa9ba49e358a5a18c0bf07ccfbe4\"" Apr 17 23:45:51.690112 containerd[1834]: time="2026-04-17T23:45:51.690070165Z" level=info msg="CreateContainer within sandbox \"b201e2c78efb46f6edf3e17cb4ec7895c2d5aa9ba49e358a5a18c0bf07ccfbe4\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Apr 17 23:45:51.719929 containerd[1834]: time="2026-04-17T23:45:51.719892369Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-6bf85f8dd-7cp82,Uid:dd889ac1-fb09-430c-8fca-446212ea9645,Namespace:tigera-operator,Attempt:0,}" Apr 17 23:45:51.721052 containerd[1834]: time="2026-04-17T23:45:51.721017280Z" level=info msg="CreateContainer within sandbox \"b201e2c78efb46f6edf3e17cb4ec7895c2d5aa9ba49e358a5a18c0bf07ccfbe4\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"b917b0e31f08a6ebaac17685d964703be95f18794b1d902f094c1ffeb7be0140\"" Apr 17 23:45:51.721642 containerd[1834]: time="2026-04-17T23:45:51.721541386Z" level=info msg="StartContainer for 
\"b917b0e31f08a6ebaac17685d964703be95f18794b1d902f094c1ffeb7be0140\"" Apr 17 23:45:51.783113 containerd[1834]: time="2026-04-17T23:45:51.782821310Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Apr 17 23:45:51.783113 containerd[1834]: time="2026-04-17T23:45:51.782909011Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Apr 17 23:45:51.783113 containerd[1834]: time="2026-04-17T23:45:51.782938012Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 17 23:45:51.783113 containerd[1834]: time="2026-04-17T23:45:51.783030513Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 17 23:45:51.835894 containerd[1834]: time="2026-04-17T23:45:51.835759650Z" level=info msg="StartContainer for \"b917b0e31f08a6ebaac17685d964703be95f18794b1d902f094c1ffeb7be0140\" returns successfully" Apr 17 23:45:51.861061 containerd[1834]: time="2026-04-17T23:45:51.861021308Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-6bf85f8dd-7cp82,Uid:dd889ac1-fb09-430c-8fca-446212ea9645,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"f44f37c50ff94b1608f9e2e25c3f5a853ac1c0676ee45a619364bf2ef82b8292\"" Apr 17 23:45:51.864723 containerd[1834]: time="2026-04-17T23:45:51.864692745Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.40.7\"" Apr 17 23:45:51.969501 kubelet[3392]: I0417 23:45:51.969386 3392 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-cvsgh" podStartSLOduration=1.969363612 podStartE2EDuration="1.969363612s" podCreationTimestamp="2026-04-17 23:45:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-04-17 23:45:51.968627105 +0000 UTC m=+5.694593147" watchObservedRunningTime="2026-04-17 23:45:51.969363612 +0000 UTC m=+5.695329554" Apr 17 23:45:53.427202 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1049680512.mount: Deactivated successfully. Apr 17 23:45:54.729730 containerd[1834]: time="2026-04-17T23:45:54.729663355Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.40.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 17 23:45:54.732733 containerd[1834]: time="2026-04-17T23:45:54.732663986Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.40.7: active requests=0, bytes read=40846156" Apr 17 23:45:54.736951 containerd[1834]: time="2026-04-17T23:45:54.736892129Z" level=info msg="ImageCreate event name:\"sha256:de04da31b5feb10fd313c39b7ac72d47ce9b5b8eb06161142e2e2283059a52c2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 17 23:45:54.741356 containerd[1834]: time="2026-04-17T23:45:54.741300374Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:53260704fc6e638633b243729411222e01e1898647352a6e1a09cc046887973a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 17 23:45:54.742165 containerd[1834]: time="2026-04-17T23:45:54.741995181Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.40.7\" with image id \"sha256:de04da31b5feb10fd313c39b7ac72d47ce9b5b8eb06161142e2e2283059a52c2\", repo tag \"quay.io/tigera/operator:v1.40.7\", repo digest \"quay.io/tigera/operator@sha256:53260704fc6e638633b243729411222e01e1898647352a6e1a09cc046887973a\", size \"40842151\" in 2.877025633s" Apr 17 23:45:54.742165 containerd[1834]: time="2026-04-17T23:45:54.742034181Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.40.7\" returns image reference \"sha256:de04da31b5feb10fd313c39b7ac72d47ce9b5b8eb06161142e2e2283059a52c2\"" Apr 17 23:45:54.749677 containerd[1834]: time="2026-04-17T23:45:54.749637459Z" level=info 
msg="CreateContainer within sandbox \"f44f37c50ff94b1608f9e2e25c3f5a853ac1c0676ee45a619364bf2ef82b8292\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Apr 17 23:45:54.781770 containerd[1834]: time="2026-04-17T23:45:54.781727986Z" level=info msg="CreateContainer within sandbox \"f44f37c50ff94b1608f9e2e25c3f5a853ac1c0676ee45a619364bf2ef82b8292\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"e0b849b245d51a1899d3dac60d90093bcbbfd1b1289bea8a63183b6c5aedb63b\"" Apr 17 23:45:54.783161 containerd[1834]: time="2026-04-17T23:45:54.782248591Z" level=info msg="StartContainer for \"e0b849b245d51a1899d3dac60d90093bcbbfd1b1289bea8a63183b6c5aedb63b\"" Apr 17 23:45:54.844076 containerd[1834]: time="2026-04-17T23:45:54.843969321Z" level=info msg="StartContainer for \"e0b849b245d51a1899d3dac60d90093bcbbfd1b1289bea8a63183b6c5aedb63b\" returns successfully" Apr 17 23:45:54.964979 kubelet[3392]: I0417 23:45:54.964612 3392 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-6bf85f8dd-7cp82" podStartSLOduration=1.084020981 podStartE2EDuration="3.96459075s" podCreationTimestamp="2026-04-17 23:45:51 +0000 UTC" firstStartedPulling="2026-04-17 23:45:51.862570424 +0000 UTC m=+5.588536366" lastFinishedPulling="2026-04-17 23:45:54.743140193 +0000 UTC m=+8.469106135" observedRunningTime="2026-04-17 23:45:54.964308547 +0000 UTC m=+8.690274489" watchObservedRunningTime="2026-04-17 23:45:54.96459075 +0000 UTC m=+8.690556692" Apr 17 23:46:01.405944 sudo[2392]: pam_unix(sudo:session): session closed for user root Apr 17 23:46:01.424940 sshd[2388]: pam_unix(sshd:session): session closed for user core Apr 17 23:46:01.436203 systemd[1]: sshd@6-10.0.0.10:22-20.229.252.112:60490.service: Deactivated successfully. Apr 17 23:46:01.448987 systemd-logind[1800]: Session 9 logged out. Waiting for processes to exit. Apr 17 23:46:01.449187 systemd[1]: session-9.scope: Deactivated successfully. 
Apr 17 23:46:01.453549 systemd-logind[1800]: Removed session 9. Apr 17 23:46:04.418612 kubelet[3392]: I0417 23:46:04.418541 3392 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/b605ae21-564c-48fc-9c81-ef139844d6f8-node-certs\") pod \"calico-node-n24kb\" (UID: \"b605ae21-564c-48fc-9c81-ef139844d6f8\") " pod="calico-system/calico-node-n24kb" Apr 17 23:46:04.418612 kubelet[3392]: I0417 23:46:04.418618 3392 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/34ecdd83-e0ab-4599-a5d5-5cfa4fb300cd-typha-certs\") pod \"calico-typha-8645c4df75-j2m2k\" (UID: \"34ecdd83-e0ab-4599-a5d5-5cfa4fb300cd\") " pod="calico-system/calico-typha-8645c4df75-j2m2k" Apr 17 23:46:04.419203 kubelet[3392]: I0417 23:46:04.418652 3392 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nodeproc\" (UniqueName: \"kubernetes.io/host-path/b605ae21-564c-48fc-9c81-ef139844d6f8-nodeproc\") pod \"calico-node-n24kb\" (UID: \"b605ae21-564c-48fc-9c81-ef139844d6f8\") " pod="calico-system/calico-node-n24kb" Apr 17 23:46:04.419203 kubelet[3392]: I0417 23:46:04.418697 3392 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/b605ae21-564c-48fc-9c81-ef139844d6f8-policysync\") pod \"calico-node-n24kb\" (UID: \"b605ae21-564c-48fc-9c81-ef139844d6f8\") " pod="calico-system/calico-node-n24kb" Apr 17 23:46:04.419203 kubelet[3392]: I0417 23:46:04.418725 3392 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/b605ae21-564c-48fc-9c81-ef139844d6f8-sys-fs\") pod \"calico-node-n24kb\" (UID: \"b605ae21-564c-48fc-9c81-ef139844d6f8\") " pod="calico-system/calico-node-n24kb" Apr 17 23:46:04.419203 
kubelet[3392]: I0417 23:46:04.418757 3392 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b605ae21-564c-48fc-9c81-ef139844d6f8-tigera-ca-bundle\") pod \"calico-node-n24kb\" (UID: \"b605ae21-564c-48fc-9c81-ef139844d6f8\") " pod="calico-system/calico-node-n24kb" Apr 17 23:46:04.419203 kubelet[3392]: I0417 23:46:04.418796 3392 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7krf\" (UniqueName: \"kubernetes.io/projected/34ecdd83-e0ab-4599-a5d5-5cfa4fb300cd-kube-api-access-b7krf\") pod \"calico-typha-8645c4df75-j2m2k\" (UID: \"34ecdd83-e0ab-4599-a5d5-5cfa4fb300cd\") " pod="calico-system/calico-typha-8645c4df75-j2m2k" Apr 17 23:46:04.419353 kubelet[3392]: I0417 23:46:04.418847 3392 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/b605ae21-564c-48fc-9c81-ef139844d6f8-cni-log-dir\") pod \"calico-node-n24kb\" (UID: \"b605ae21-564c-48fc-9c81-ef139844d6f8\") " pod="calico-system/calico-node-n24kb" Apr 17 23:46:04.419353 kubelet[3392]: I0417 23:46:04.418878 3392 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/b605ae21-564c-48fc-9c81-ef139844d6f8-var-run-calico\") pod \"calico-node-n24kb\" (UID: \"b605ae21-564c-48fc-9c81-ef139844d6f8\") " pod="calico-system/calico-node-n24kb" Apr 17 23:46:04.419353 kubelet[3392]: I0417 23:46:04.418913 3392 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/34ecdd83-e0ab-4599-a5d5-5cfa4fb300cd-tigera-ca-bundle\") pod \"calico-typha-8645c4df75-j2m2k\" (UID: \"34ecdd83-e0ab-4599-a5d5-5cfa4fb300cd\") " pod="calico-system/calico-typha-8645c4df75-j2m2k" Apr 17 23:46:04.419353 
kubelet[3392]: I0417 23:46:04.418942 3392 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b605ae21-564c-48fc-9c81-ef139844d6f8-lib-modules\") pod \"calico-node-n24kb\" (UID: \"b605ae21-564c-48fc-9c81-ef139844d6f8\") " pod="calico-system/calico-node-n24kb" Apr 17 23:46:04.419353 kubelet[3392]: I0417 23:46:04.418970 3392 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcz7q\" (UniqueName: \"kubernetes.io/projected/b605ae21-564c-48fc-9c81-ef139844d6f8-kube-api-access-hcz7q\") pod \"calico-node-n24kb\" (UID: \"b605ae21-564c-48fc-9c81-ef139844d6f8\") " pod="calico-system/calico-node-n24kb" Apr 17 23:46:04.419476 kubelet[3392]: I0417 23:46:04.418990 3392 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/b605ae21-564c-48fc-9c81-ef139844d6f8-var-lib-calico\") pod \"calico-node-n24kb\" (UID: \"b605ae21-564c-48fc-9c81-ef139844d6f8\") " pod="calico-system/calico-node-n24kb" Apr 17 23:46:04.419476 kubelet[3392]: I0417 23:46:04.419010 3392 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/b605ae21-564c-48fc-9c81-ef139844d6f8-xtables-lock\") pod \"calico-node-n24kb\" (UID: \"b605ae21-564c-48fc-9c81-ef139844d6f8\") " pod="calico-system/calico-node-n24kb" Apr 17 23:46:04.419476 kubelet[3392]: I0417 23:46:04.419035 3392 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bpffs\" (UniqueName: \"kubernetes.io/host-path/b605ae21-564c-48fc-9c81-ef139844d6f8-bpffs\") pod \"calico-node-n24kb\" (UID: \"b605ae21-564c-48fc-9c81-ef139844d6f8\") " pod="calico-system/calico-node-n24kb" Apr 17 23:46:04.419476 kubelet[3392]: I0417 23:46:04.419057 3392 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/b605ae21-564c-48fc-9c81-ef139844d6f8-cni-bin-dir\") pod \"calico-node-n24kb\" (UID: \"b605ae21-564c-48fc-9c81-ef139844d6f8\") " pod="calico-system/calico-node-n24kb" Apr 17 23:46:04.419476 kubelet[3392]: I0417 23:46:04.419075 3392 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/b605ae21-564c-48fc-9c81-ef139844d6f8-cni-net-dir\") pod \"calico-node-n24kb\" (UID: \"b605ae21-564c-48fc-9c81-ef139844d6f8\") " pod="calico-system/calico-node-n24kb" Apr 17 23:46:04.419663 kubelet[3392]: I0417 23:46:04.419100 3392 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/b605ae21-564c-48fc-9c81-ef139844d6f8-flexvol-driver-host\") pod \"calico-node-n24kb\" (UID: \"b605ae21-564c-48fc-9c81-ef139844d6f8\") " pod="calico-system/calico-node-n24kb" Apr 17 23:46:04.521915 kubelet[3392]: E0417 23:46:04.521868 3392 secret.go:189] Couldn't get secret calico-system/node-certs: object "calico-system"/"node-certs" not registered Apr 17 23:46:04.522394 kubelet[3392]: E0417 23:46:04.521963 3392 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b605ae21-564c-48fc-9c81-ef139844d6f8-node-certs podName:b605ae21-564c-48fc-9c81-ef139844d6f8 nodeName:}" failed. No retries permitted until 2026-04-17 23:46:05.021944366 +0000 UTC m=+18.747910308 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-certs" (UniqueName: "kubernetes.io/secret/b605ae21-564c-48fc-9c81-ef139844d6f8-node-certs") pod "calico-node-n24kb" (UID: "b605ae21-564c-48fc-9c81-ef139844d6f8") : object "calico-system"/"node-certs" not registered Apr 17 23:46:04.523026 kubelet[3392]: E0417 23:46:04.522869 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.523026 kubelet[3392]: W0417 23:46:04.522890 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.523026 kubelet[3392]: E0417 23:46:04.522912 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 23:46:04.523533 kubelet[3392]: E0417 23:46:04.523370 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.523533 kubelet[3392]: W0417 23:46:04.523384 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.523533 kubelet[3392]: E0417 23:46:04.523400 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 23:46:04.523950 kubelet[3392]: E0417 23:46:04.523823 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.523950 kubelet[3392]: W0417 23:46:04.523836 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.523950 kubelet[3392]: E0417 23:46:04.523851 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 23:46:04.524300 kubelet[3392]: E0417 23:46:04.524198 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.524300 kubelet[3392]: W0417 23:46:04.524208 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.524300 kubelet[3392]: E0417 23:46:04.524220 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 23:46:04.524574 kubelet[3392]: E0417 23:46:04.524561 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.524762 kubelet[3392]: W0417 23:46:04.524648 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.524762 kubelet[3392]: E0417 23:46:04.524667 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 23:46:04.525067 kubelet[3392]: E0417 23:46:04.525053 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.525213 kubelet[3392]: W0417 23:46:04.525147 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.525213 kubelet[3392]: E0417 23:46:04.525170 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 23:46:04.525605 kubelet[3392]: E0417 23:46:04.525459 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.525605 kubelet[3392]: W0417 23:46:04.525475 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.525605 kubelet[3392]: E0417 23:46:04.525505 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 23:46:04.526052 kubelet[3392]: E0417 23:46:04.526003 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.526052 kubelet[3392]: W0417 23:46:04.526024 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.526052 kubelet[3392]: E0417 23:46:04.526037 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 23:46:04.526705 kubelet[3392]: E0417 23:46:04.526593 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.526705 kubelet[3392]: W0417 23:46:04.526607 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.526705 kubelet[3392]: E0417 23:46:04.526623 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 23:46:04.527492 kubelet[3392]: E0417 23:46:04.527387 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.527492 kubelet[3392]: W0417 23:46:04.527400 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.527492 kubelet[3392]: E0417 23:46:04.527414 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 23:46:04.527947 kubelet[3392]: E0417 23:46:04.527845 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.527947 kubelet[3392]: W0417 23:46:04.527860 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.527947 kubelet[3392]: E0417 23:46:04.527889 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 23:46:04.528387 kubelet[3392]: E0417 23:46:04.528281 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.528387 kubelet[3392]: W0417 23:46:04.528295 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.528387 kubelet[3392]: E0417 23:46:04.528307 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 23:46:04.528928 kubelet[3392]: E0417 23:46:04.528780 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.528928 kubelet[3392]: W0417 23:46:04.528794 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.528928 kubelet[3392]: E0417 23:46:04.528813 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 23:46:04.529312 kubelet[3392]: E0417 23:46:04.529226 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.529312 kubelet[3392]: W0417 23:46:04.529241 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.529312 kubelet[3392]: E0417 23:46:04.529253 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 23:46:04.529841 kubelet[3392]: E0417 23:46:04.529682 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.529841 kubelet[3392]: W0417 23:46:04.529695 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.529841 kubelet[3392]: E0417 23:46:04.529710 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 23:46:04.530220 kubelet[3392]: E0417 23:46:04.530144 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.530220 kubelet[3392]: W0417 23:46:04.530157 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.530220 kubelet[3392]: E0417 23:46:04.530170 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 23:46:04.530787 kubelet[3392]: E0417 23:46:04.530673 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.530787 kubelet[3392]: W0417 23:46:04.530687 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.530787 kubelet[3392]: E0417 23:46:04.530703 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 23:46:04.531198 kubelet[3392]: E0417 23:46:04.531082 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.531198 kubelet[3392]: W0417 23:46:04.531094 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.531198 kubelet[3392]: E0417 23:46:04.531107 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 23:46:04.531628 kubelet[3392]: E0417 23:46:04.531517 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.531628 kubelet[3392]: W0417 23:46:04.531530 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.531628 kubelet[3392]: E0417 23:46:04.531543 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 23:46:04.532354 kubelet[3392]: E0417 23:46:04.532240 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.532354 kubelet[3392]: W0417 23:46:04.532255 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.532354 kubelet[3392]: E0417 23:46:04.532268 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 23:46:04.534722 kubelet[3392]: E0417 23:46:04.534605 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.534722 kubelet[3392]: W0417 23:46:04.534622 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.534722 kubelet[3392]: E0417 23:46:04.534636 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 23:46:04.535138 kubelet[3392]: E0417 23:46:04.535037 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.535138 kubelet[3392]: W0417 23:46:04.535051 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.535138 kubelet[3392]: E0417 23:46:04.535064 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 23:46:04.536068 kubelet[3392]: E0417 23:46:04.535415 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.536068 kubelet[3392]: W0417 23:46:04.535431 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.536068 kubelet[3392]: E0417 23:46:04.535444 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 23:46:04.536506 kubelet[3392]: E0417 23:46:04.536381 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.536506 kubelet[3392]: W0417 23:46:04.536396 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.536506 kubelet[3392]: E0417 23:46:04.536409 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 23:46:04.536960 kubelet[3392]: E0417 23:46:04.536803 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.536960 kubelet[3392]: W0417 23:46:04.536817 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.536960 kubelet[3392]: E0417 23:46:04.536830 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 23:46:04.537427 kubelet[3392]: E0417 23:46:04.537249 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.537427 kubelet[3392]: W0417 23:46:04.537262 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.537427 kubelet[3392]: E0417 23:46:04.537275 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 23:46:04.539433 kubelet[3392]: E0417 23:46:04.539321 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.539433 kubelet[3392]: W0417 23:46:04.539336 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.539433 kubelet[3392]: E0417 23:46:04.539351 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 23:46:04.539931 kubelet[3392]: E0417 23:46:04.539831 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.539931 kubelet[3392]: W0417 23:46:04.539844 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.539931 kubelet[3392]: E0417 23:46:04.539857 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 23:46:04.541717 kubelet[3392]: E0417 23:46:04.541569 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.541717 kubelet[3392]: W0417 23:46:04.541584 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.541717 kubelet[3392]: E0417 23:46:04.541599 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 23:46:04.543361 kubelet[3392]: E0417 23:46:04.543148 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.543361 kubelet[3392]: W0417 23:46:04.543164 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.543361 kubelet[3392]: E0417 23:46:04.543179 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 23:46:04.543733 kubelet[3392]: E0417 23:46:04.543548 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.543733 kubelet[3392]: W0417 23:46:04.543560 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.543733 kubelet[3392]: E0417 23:46:04.543573 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 23:46:04.544298 kubelet[3392]: E0417 23:46:04.544156 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.544298 kubelet[3392]: W0417 23:46:04.544170 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.544298 kubelet[3392]: E0417 23:46:04.544185 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 23:46:04.544756 kubelet[3392]: E0417 23:46:04.544648 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.544756 kubelet[3392]: W0417 23:46:04.544663 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.544756 kubelet[3392]: E0417 23:46:04.544676 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 23:46:04.545509 kubelet[3392]: E0417 23:46:04.545138 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.545509 kubelet[3392]: W0417 23:46:04.545153 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.545509 kubelet[3392]: E0417 23:46:04.545169 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 23:46:04.545888 kubelet[3392]: E0417 23:46:04.545874 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.545976 kubelet[3392]: W0417 23:46:04.545963 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.546062 kubelet[3392]: E0417 23:46:04.546038 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 23:46:04.548002 kubelet[3392]: E0417 23:46:04.547567 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.548002 kubelet[3392]: W0417 23:46:04.547585 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.548002 kubelet[3392]: E0417 23:46:04.547601 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 23:46:04.549333 kubelet[3392]: E0417 23:46:04.548971 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.549333 kubelet[3392]: W0417 23:46:04.548989 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.549333 kubelet[3392]: E0417 23:46:04.549004 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 23:46:04.550007 kubelet[3392]: E0417 23:46:04.549872 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.550007 kubelet[3392]: W0417 23:46:04.549885 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.550007 kubelet[3392]: E0417 23:46:04.549898 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 23:46:04.553846 kubelet[3392]: E0417 23:46:04.552590 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.553846 kubelet[3392]: W0417 23:46:04.552604 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.553846 kubelet[3392]: E0417 23:46:04.552619 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 23:46:04.553846 kubelet[3392]: E0417 23:46:04.552865 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.553846 kubelet[3392]: W0417 23:46:04.552875 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.553846 kubelet[3392]: E0417 23:46:04.552887 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 23:46:04.558623 kubelet[3392]: E0417 23:46:04.557742 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.558623 kubelet[3392]: W0417 23:46:04.557758 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.558623 kubelet[3392]: E0417 23:46:04.557796 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 23:46:04.558623 kubelet[3392]: E0417 23:46:04.558092 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.558623 kubelet[3392]: W0417 23:46:04.558119 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.558623 kubelet[3392]: E0417 23:46:04.558134 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 23:46:04.558623 kubelet[3392]: E0417 23:46:04.558415 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.558623 kubelet[3392]: W0417 23:46:04.558434 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.558623 kubelet[3392]: E0417 23:46:04.558447 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 23:46:04.559375 kubelet[3392]: E0417 23:46:04.559072 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.559375 kubelet[3392]: W0417 23:46:04.559087 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.559375 kubelet[3392]: E0417 23:46:04.559118 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 23:46:04.560162 kubelet[3392]: E0417 23:46:04.559908 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.560162 kubelet[3392]: W0417 23:46:04.559923 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.560162 kubelet[3392]: E0417 23:46:04.559952 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 23:46:04.562135 kubelet[3392]: E0417 23:46:04.560862 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.562135 kubelet[3392]: W0417 23:46:04.560879 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.562135 kubelet[3392]: E0417 23:46:04.560894 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 23:46:04.565365 kubelet[3392]: E0417 23:46:04.565351 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.566871 kubelet[3392]: W0417 23:46:04.566442 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.566871 kubelet[3392]: E0417 23:46:04.566468 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 23:46:04.569137 kubelet[3392]: E0417 23:46:04.567549 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.569137 kubelet[3392]: W0417 23:46:04.567565 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.569137 kubelet[3392]: E0417 23:46:04.567579 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 23:46:04.569751 kubelet[3392]: E0417 23:46:04.569547 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.569751 kubelet[3392]: W0417 23:46:04.569559 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.569751 kubelet[3392]: E0417 23:46:04.569571 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 23:46:04.570041 kubelet[3392]: E0417 23:46:04.570028 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.570122 kubelet[3392]: W0417 23:46:04.570111 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.570204 kubelet[3392]: E0417 23:46:04.570192 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 23:46:04.570596 kubelet[3392]: E0417 23:46:04.570474 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.570596 kubelet[3392]: W0417 23:46:04.570527 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.570596 kubelet[3392]: E0417 23:46:04.570542 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 23:46:04.571375 kubelet[3392]: E0417 23:46:04.571261 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.571375 kubelet[3392]: W0417 23:46:04.571274 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.571585 kubelet[3392]: E0417 23:46:04.571511 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 23:46:04.572278 kubelet[3392]: E0417 23:46:04.572227 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.572278 kubelet[3392]: W0417 23:46:04.572241 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.572278 kubelet[3392]: E0417 23:46:04.572255 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 23:46:04.574203 kubelet[3392]: E0417 23:46:04.574187 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.574575 kubelet[3392]: W0417 23:46:04.574557 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.574930 kubelet[3392]: E0417 23:46:04.574680 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 23:46:04.634728 containerd[1834]: time="2026-04-17T23:46:04.634596777Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-8645c4df75-j2m2k,Uid:34ecdd83-e0ab-4599-a5d5-5cfa4fb300cd,Namespace:calico-system,Attempt:0,}" Apr 17 23:46:04.650268 kubelet[3392]: E0417 23:46:04.650235 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.650268 kubelet[3392]: W0417 23:46:04.650261 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.650458 kubelet[3392]: E0417 23:46:04.650286 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 23:46:04.681750 kubelet[3392]: E0417 23:46:04.681380 3392 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-kdzpd" podUID="4658a59c-dd8b-4cd9-bac7-09b6c58f7e83" Apr 17 23:46:04.692973 kubelet[3392]: E0417 23:46:04.692471 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.692973 kubelet[3392]: W0417 23:46:04.692838 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.694261 kubelet[3392]: E0417 23:46:04.692865 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 23:46:04.696525 kubelet[3392]: E0417 23:46:04.696297 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.696525 kubelet[3392]: W0417 23:46:04.696316 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.699615 kubelet[3392]: E0417 23:46:04.697077 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 23:46:04.700611 kubelet[3392]: E0417 23:46:04.700591 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.700704 kubelet[3392]: W0417 23:46:04.700611 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.700704 kubelet[3392]: E0417 23:46:04.700632 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 23:46:04.704660 kubelet[3392]: E0417 23:46:04.703932 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.704660 kubelet[3392]: W0417 23:46:04.703955 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.704660 kubelet[3392]: E0417 23:46:04.703978 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 23:46:04.706220 kubelet[3392]: E0417 23:46:04.705044 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.706220 kubelet[3392]: W0417 23:46:04.705075 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.706220 kubelet[3392]: E0417 23:46:04.705093 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 23:46:04.706943 kubelet[3392]: E0417 23:46:04.706684 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.706943 kubelet[3392]: W0417 23:46:04.706700 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.706943 kubelet[3392]: E0417 23:46:04.706737 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 23:46:04.707619 kubelet[3392]: E0417 23:46:04.707597 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.707619 kubelet[3392]: W0417 23:46:04.707618 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.707619 kubelet[3392]: E0417 23:46:04.707633 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 23:46:04.710573 kubelet[3392]: E0417 23:46:04.710553 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.710573 kubelet[3392]: W0417 23:46:04.710588 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.710573 kubelet[3392]: E0417 23:46:04.710606 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 23:46:04.710573 kubelet[3392]: E0417 23:46:04.711606 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.710573 kubelet[3392]: W0417 23:46:04.711619 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.718524 kubelet[3392]: E0417 23:46:04.714518 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 23:46:04.718524 kubelet[3392]: E0417 23:46:04.716653 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.718524 kubelet[3392]: W0417 23:46:04.716665 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.718524 kubelet[3392]: E0417 23:46:04.716680 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 23:46:04.718524 kubelet[3392]: E0417 23:46:04.716891 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.718524 kubelet[3392]: W0417 23:46:04.716901 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.718524 kubelet[3392]: E0417 23:46:04.716913 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 23:46:04.718524 kubelet[3392]: E0417 23:46:04.717381 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.718524 kubelet[3392]: W0417 23:46:04.717394 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.718524 kubelet[3392]: E0417 23:46:04.717408 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 23:46:04.719939 kubelet[3392]: E0417 23:46:04.719922 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.719939 kubelet[3392]: W0417 23:46:04.719941 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.720130 kubelet[3392]: E0417 23:46:04.719955 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 23:46:04.720724 kubelet[3392]: E0417 23:46:04.720704 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.720724 kubelet[3392]: W0417 23:46:04.720724 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.720902 kubelet[3392]: E0417 23:46:04.720739 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 23:46:04.722355 kubelet[3392]: E0417 23:46:04.722338 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.722355 kubelet[3392]: W0417 23:46:04.722355 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.722530 kubelet[3392]: E0417 23:46:04.722370 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 23:46:04.725789 kubelet[3392]: E0417 23:46:04.725620 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.725789 kubelet[3392]: W0417 23:46:04.725637 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.725789 kubelet[3392]: E0417 23:46:04.725656 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 23:46:04.727152 kubelet[3392]: E0417 23:46:04.727134 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.728425 kubelet[3392]: W0417 23:46:04.728343 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.728425 kubelet[3392]: E0417 23:46:04.728369 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 23:46:04.729687 kubelet[3392]: E0417 23:46:04.729464 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.729687 kubelet[3392]: W0417 23:46:04.729492 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.729687 kubelet[3392]: E0417 23:46:04.729509 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 23:46:04.739182 kubelet[3392]: E0417 23:46:04.739159 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.739592 kubelet[3392]: W0417 23:46:04.739380 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.739592 kubelet[3392]: E0417 23:46:04.739411 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 23:46:04.739988 kubelet[3392]: E0417 23:46:04.739876 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.739988 kubelet[3392]: W0417 23:46:04.739891 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.739988 kubelet[3392]: E0417 23:46:04.739905 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 23:46:04.755251 kubelet[3392]: E0417 23:46:04.755218 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.755251 kubelet[3392]: W0417 23:46:04.755246 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.755954 kubelet[3392]: E0417 23:46:04.755271 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 23:46:04.755954 kubelet[3392]: I0417 23:46:04.755312 3392 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/4658a59c-dd8b-4cd9-bac7-09b6c58f7e83-socket-dir\") pod \"csi-node-driver-kdzpd\" (UID: \"4658a59c-dd8b-4cd9-bac7-09b6c58f7e83\") " pod="calico-system/csi-node-driver-kdzpd" Apr 17 23:46:04.761183 kubelet[3392]: E0417 23:46:04.759762 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.761183 kubelet[3392]: W0417 23:46:04.759786 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.761183 kubelet[3392]: E0417 23:46:04.759809 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 23:46:04.761183 kubelet[3392]: I0417 23:46:04.759845 3392 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4658a59c-dd8b-4cd9-bac7-09b6c58f7e83-kubelet-dir\") pod \"csi-node-driver-kdzpd\" (UID: \"4658a59c-dd8b-4cd9-bac7-09b6c58f7e83\") " pod="calico-system/csi-node-driver-kdzpd" Apr 17 23:46:04.762981 kubelet[3392]: E0417 23:46:04.762128 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.762981 kubelet[3392]: W0417 23:46:04.762150 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.762981 kubelet[3392]: E0417 23:46:04.762172 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 23:46:04.765725 kubelet[3392]: E0417 23:46:04.765542 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.765725 kubelet[3392]: W0417 23:46:04.765561 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.765725 kubelet[3392]: E0417 23:46:04.765579 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 23:46:04.766010 kubelet[3392]: E0417 23:46:04.765908 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.766372 kubelet[3392]: W0417 23:46:04.766022 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.766372 kubelet[3392]: E0417 23:46:04.766043 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 23:46:04.766372 kubelet[3392]: E0417 23:46:04.766275 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.766372 kubelet[3392]: W0417 23:46:04.766286 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.766912 kubelet[3392]: E0417 23:46:04.766392 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 23:46:04.767044 kubelet[3392]: I0417 23:46:04.767016 3392 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/4658a59c-dd8b-4cd9-bac7-09b6c58f7e83-registration-dir\") pod \"csi-node-driver-kdzpd\" (UID: \"4658a59c-dd8b-4cd9-bac7-09b6c58f7e83\") " pod="calico-system/csi-node-driver-kdzpd" Apr 17 23:46:04.767135 kubelet[3392]: E0417 23:46:04.767121 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.767382 kubelet[3392]: W0417 23:46:04.767353 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.767382 kubelet[3392]: E0417 23:46:04.767377 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 23:46:04.770332 kubelet[3392]: E0417 23:46:04.770167 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.770332 kubelet[3392]: W0417 23:46:04.770183 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.770332 kubelet[3392]: E0417 23:46:04.770200 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 23:46:04.771767 kubelet[3392]: E0417 23:46:04.771631 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.772129 kubelet[3392]: W0417 23:46:04.771650 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.772129 kubelet[3392]: E0417 23:46:04.771845 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 23:46:04.772129 kubelet[3392]: I0417 23:46:04.771888 3392 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/4658a59c-dd8b-4cd9-bac7-09b6c58f7e83-varrun\") pod \"csi-node-driver-kdzpd\" (UID: \"4658a59c-dd8b-4cd9-bac7-09b6c58f7e83\") " pod="calico-system/csi-node-driver-kdzpd" Apr 17 23:46:04.774138 kubelet[3392]: E0417 23:46:04.773564 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.774138 kubelet[3392]: W0417 23:46:04.773585 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.774138 kubelet[3392]: E0417 23:46:04.773601 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 23:46:04.774308 kubelet[3392]: E0417 23:46:04.774266 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.774308 kubelet[3392]: W0417 23:46:04.774278 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.774308 kubelet[3392]: E0417 23:46:04.774293 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 23:46:04.774434 kubelet[3392]: I0417 23:46:04.774324 3392 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nqqh\" (UniqueName: \"kubernetes.io/projected/4658a59c-dd8b-4cd9-bac7-09b6c58f7e83-kube-api-access-4nqqh\") pod \"csi-node-driver-kdzpd\" (UID: \"4658a59c-dd8b-4cd9-bac7-09b6c58f7e83\") " pod="calico-system/csi-node-driver-kdzpd" Apr 17 23:46:04.776410 kubelet[3392]: E0417 23:46:04.776300 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.776410 kubelet[3392]: W0417 23:46:04.776318 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.776410 kubelet[3392]: E0417 23:46:04.776333 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 23:46:04.777695 kubelet[3392]: E0417 23:46:04.777677 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.777695 kubelet[3392]: W0417 23:46:04.777694 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.778853 kubelet[3392]: E0417 23:46:04.777709 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 23:46:04.778853 kubelet[3392]: E0417 23:46:04.778632 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.778853 kubelet[3392]: W0417 23:46:04.778645 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.778853 kubelet[3392]: E0417 23:46:04.778659 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 23:46:04.782014 kubelet[3392]: E0417 23:46:04.781660 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.782014 kubelet[3392]: W0417 23:46:04.781675 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.782014 kubelet[3392]: E0417 23:46:04.781688 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 23:46:04.782014 kubelet[3392]: E0417 23:46:04.781895 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.782014 kubelet[3392]: W0417 23:46:04.781905 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.782014 kubelet[3392]: E0417 23:46:04.781917 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 23:46:04.789176 containerd[1834]: time="2026-04-17T23:46:04.785875270Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Apr 17 23:46:04.789176 containerd[1834]: time="2026-04-17T23:46:04.785934570Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Apr 17 23:46:04.789176 containerd[1834]: time="2026-04-17T23:46:04.785949870Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 17 23:46:04.789176 containerd[1834]: time="2026-04-17T23:46:04.786043571Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 17 23:46:04.842648 kubelet[3392]: E0417 23:46:04.842447 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.842648 kubelet[3392]: W0417 23:46:04.842497 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.842648 kubelet[3392]: E0417 23:46:04.842530 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 23:46:04.843269 kubelet[3392]: E0417 23:46:04.843116 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.843269 kubelet[3392]: W0417 23:46:04.843136 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.843269 kubelet[3392]: E0417 23:46:04.843158 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 23:46:04.843633 kubelet[3392]: E0417 23:46:04.843416 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.843633 kubelet[3392]: W0417 23:46:04.843428 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.843633 kubelet[3392]: E0417 23:46:04.843441 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 23:46:04.844147 kubelet[3392]: E0417 23:46:04.843986 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.844147 kubelet[3392]: W0417 23:46:04.844000 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.844147 kubelet[3392]: E0417 23:46:04.844017 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 23:46:04.844458 kubelet[3392]: E0417 23:46:04.844339 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.844458 kubelet[3392]: W0417 23:46:04.844353 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.844458 kubelet[3392]: E0417 23:46:04.844366 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 23:46:04.844794 kubelet[3392]: E0417 23:46:04.844778 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.844873 kubelet[3392]: W0417 23:46:04.844797 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.844873 kubelet[3392]: E0417 23:46:04.844812 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 23:46:04.845140 kubelet[3392]: E0417 23:46:04.845128 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.845320 kubelet[3392]: W0417 23:46:04.845228 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.845320 kubelet[3392]: E0417 23:46:04.845247 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 23:46:04.845715 kubelet[3392]: E0417 23:46:04.845603 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.845715 kubelet[3392]: W0417 23:46:04.845615 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.845715 kubelet[3392]: E0417 23:46:04.845628 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 23:46:04.846096 kubelet[3392]: E0417 23:46:04.845990 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.846096 kubelet[3392]: W0417 23:46:04.846002 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.846096 kubelet[3392]: E0417 23:46:04.846013 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 23:46:04.846672 kubelet[3392]: E0417 23:46:04.846319 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.846672 kubelet[3392]: W0417 23:46:04.846331 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.846672 kubelet[3392]: E0417 23:46:04.846342 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 23:46:04.847195 kubelet[3392]: E0417 23:46:04.847064 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.847195 kubelet[3392]: W0417 23:46:04.847086 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.847195 kubelet[3392]: E0417 23:46:04.847099 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 23:46:04.847671 kubelet[3392]: E0417 23:46:04.847592 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.847671 kubelet[3392]: W0417 23:46:04.847605 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.847671 kubelet[3392]: E0417 23:46:04.847618 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 23:46:04.848113 kubelet[3392]: E0417 23:46:04.847984 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.848113 kubelet[3392]: W0417 23:46:04.847997 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.848113 kubelet[3392]: E0417 23:46:04.848010 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 23:46:04.848948 kubelet[3392]: E0417 23:46:04.848569 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.848948 kubelet[3392]: W0417 23:46:04.848584 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.848948 kubelet[3392]: E0417 23:46:04.848597 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 23:46:04.849651 kubelet[3392]: E0417 23:46:04.849400 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.849651 kubelet[3392]: W0417 23:46:04.849413 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.849651 kubelet[3392]: E0417 23:46:04.849426 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 23:46:04.849922 kubelet[3392]: E0417 23:46:04.849847 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.849922 kubelet[3392]: W0417 23:46:04.849859 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.849922 kubelet[3392]: E0417 23:46:04.849871 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 23:46:04.874803 kubelet[3392]: E0417 23:46:04.873194 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.874803 kubelet[3392]: W0417 23:46:04.873206 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.874803 kubelet[3392]: E0417 23:46:04.873218 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 23:46:04.874803 kubelet[3392]: E0417 23:46:04.873731 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.874803 kubelet[3392]: W0417 23:46:04.873742 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.875094 kubelet[3392]: E0417 23:46:04.873755 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 23:46:04.875094 kubelet[3392]: E0417 23:46:04.874323 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.875094 kubelet[3392]: W0417 23:46:04.874339 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.875094 kubelet[3392]: E0417 23:46:04.874352 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 23:46:04.875094 kubelet[3392]: E0417 23:46:04.874617 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.875094 kubelet[3392]: W0417 23:46:04.874628 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.875094 kubelet[3392]: E0417 23:46:04.874641 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 23:46:04.875094 kubelet[3392]: E0417 23:46:04.874829 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.875094 kubelet[3392]: W0417 23:46:04.874841 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.875094 kubelet[3392]: E0417 23:46:04.874853 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 23:46:04.876404 kubelet[3392]: E0417 23:46:04.876041 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.876404 kubelet[3392]: W0417 23:46:04.876057 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.876404 kubelet[3392]: E0417 23:46:04.876088 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 23:46:04.878900 containerd[1834]: time="2026-04-17T23:46:04.878862487Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-8645c4df75-j2m2k,Uid:34ecdd83-e0ab-4599-a5d5-5cfa4fb300cd,Namespace:calico-system,Attempt:0,} returns sandbox id \"3681fcca9abb4631e1b82536840d9ede34ac2339ecdea9aebcf12eb2c4cf93cc\"" Apr 17 23:46:04.879622 kubelet[3392]: E0417 23:46:04.879474 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.879622 kubelet[3392]: W0417 23:46:04.879562 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.879622 kubelet[3392]: E0417 23:46:04.879578 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 23:46:04.879952 kubelet[3392]: E0417 23:46:04.879871 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.879952 kubelet[3392]: W0417 23:46:04.879882 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.879952 kubelet[3392]: E0417 23:46:04.879897 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 23:46:04.880296 kubelet[3392]: E0417 23:46:04.880142 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.880296 kubelet[3392]: W0417 23:46:04.880152 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.880296 kubelet[3392]: E0417 23:46:04.880198 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 23:46:04.880574 kubelet[3392]: E0417 23:46:04.880510 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.880574 kubelet[3392]: W0417 23:46:04.880525 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.880574 kubelet[3392]: E0417 23:46:04.880538 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 23:46:04.881076 kubelet[3392]: E0417 23:46:04.880958 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.881076 kubelet[3392]: W0417 23:46:04.880971 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.881076 kubelet[3392]: E0417 23:46:04.880984 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 23:46:04.881427 kubelet[3392]: E0417 23:46:04.881374 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.881427 kubelet[3392]: W0417 23:46:04.881387 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.881427 kubelet[3392]: E0417 23:46:04.881401 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 23:46:04.882539 containerd[1834]: time="2026-04-17T23:46:04.882249420Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.31.4\"" Apr 17 23:46:04.882776 kubelet[3392]: E0417 23:46:04.882763 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.882893 kubelet[3392]: W0417 23:46:04.882880 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.882975 kubelet[3392]: E0417 23:46:04.882964 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 23:46:04.883326 kubelet[3392]: E0417 23:46:04.883312 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.883394 kubelet[3392]: W0417 23:46:04.883339 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.883394 kubelet[3392]: E0417 23:46:04.883357 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 23:46:04.883724 kubelet[3392]: E0417 23:46:04.883707 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.883789 kubelet[3392]: W0417 23:46:04.883727 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.883789 kubelet[3392]: E0417 23:46:04.883741 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 23:46:04.884050 kubelet[3392]: E0417 23:46:04.884028 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.884050 kubelet[3392]: W0417 23:46:04.884048 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.884191 kubelet[3392]: E0417 23:46:04.884061 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 23:46:04.884379 kubelet[3392]: E0417 23:46:04.884364 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.884379 kubelet[3392]: W0417 23:46:04.884378 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.884565 kubelet[3392]: E0417 23:46:04.884400 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 23:46:04.884884 kubelet[3392]: E0417 23:46:04.884653 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.884884 kubelet[3392]: W0417 23:46:04.884680 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.884884 kubelet[3392]: E0417 23:46:04.884693 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 23:46:04.885034 kubelet[3392]: E0417 23:46:04.884949 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.885034 kubelet[3392]: W0417 23:46:04.884973 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.885034 kubelet[3392]: E0417 23:46:04.884986 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 23:46:04.885287 kubelet[3392]: E0417 23:46:04.885270 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.885287 kubelet[3392]: W0417 23:46:04.885287 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.885505 kubelet[3392]: E0417 23:46:04.885301 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 23:46:04.885598 kubelet[3392]: E0417 23:46:04.885583 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.885598 kubelet[3392]: W0417 23:46:04.885596 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.885821 kubelet[3392]: E0417 23:46:04.885611 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 23:46:04.885867 kubelet[3392]: E0417 23:46:04.885855 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.885925 kubelet[3392]: W0417 23:46:04.885865 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.885925 kubelet[3392]: E0417 23:46:04.885878 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 23:46:04.886148 kubelet[3392]: E0417 23:46:04.886130 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.886148 kubelet[3392]: W0417 23:46:04.886146 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.886247 kubelet[3392]: E0417 23:46:04.886161 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 23:46:04.886541 kubelet[3392]: E0417 23:46:04.886525 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.886605 kubelet[3392]: W0417 23:46:04.886546 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.886605 kubelet[3392]: E0417 23:46:04.886560 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 23:46:04.886840 kubelet[3392]: E0417 23:46:04.886822 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.886840 kubelet[3392]: W0417 23:46:04.886834 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.886962 kubelet[3392]: E0417 23:46:04.886847 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 23:46:04.887116 kubelet[3392]: E0417 23:46:04.887088 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.887116 kubelet[3392]: W0417 23:46:04.887105 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.887354 kubelet[3392]: E0417 23:46:04.887120 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 23:46:04.887403 kubelet[3392]: E0417 23:46:04.887367 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.887475 kubelet[3392]: W0417 23:46:04.887456 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.887475 kubelet[3392]: E0417 23:46:04.887476 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 23:46:04.887734 kubelet[3392]: E0417 23:46:04.887715 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.887734 kubelet[3392]: W0417 23:46:04.887731 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.887843 kubelet[3392]: E0417 23:46:04.887744 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 23:46:04.887984 kubelet[3392]: E0417 23:46:04.887969 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.887984 kubelet[3392]: W0417 23:46:04.887983 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.888083 kubelet[3392]: E0417 23:46:04.887997 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 23:46:04.888241 kubelet[3392]: E0417 23:46:04.888225 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.888241 kubelet[3392]: W0417 23:46:04.888239 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.888374 kubelet[3392]: E0417 23:46:04.888252 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 23:46:04.888568 kubelet[3392]: E0417 23:46:04.888550 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.888568 kubelet[3392]: W0417 23:46:04.888566 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.888696 kubelet[3392]: E0417 23:46:04.888580 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 23:46:04.888818 kubelet[3392]: E0417 23:46:04.888801 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.888818 kubelet[3392]: W0417 23:46:04.888815 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.889049 kubelet[3392]: E0417 23:46:04.888829 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 23:46:04.889404 kubelet[3392]: E0417 23:46:04.889356 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.889404 kubelet[3392]: W0417 23:46:04.889369 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.889404 kubelet[3392]: E0417 23:46:04.889382 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 23:46:04.890093 kubelet[3392]: E0417 23:46:04.889765 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.890093 kubelet[3392]: W0417 23:46:04.889779 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.890093 kubelet[3392]: E0417 23:46:04.889792 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 23:46:04.891434 kubelet[3392]: E0417 23:46:04.890653 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.891434 kubelet[3392]: W0417 23:46:04.890667 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.891434 kubelet[3392]: E0417 23:46:04.890679 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 23:46:04.891434 kubelet[3392]: E0417 23:46:04.890869 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.891434 kubelet[3392]: W0417 23:46:04.890877 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.891434 kubelet[3392]: E0417 23:46:04.890888 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 23:46:04.891434 kubelet[3392]: E0417 23:46:04.891048 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.891434 kubelet[3392]: W0417 23:46:04.891058 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.891434 kubelet[3392]: E0417 23:46:04.891068 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 23:46:04.891434 kubelet[3392]: E0417 23:46:04.891265 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.891957 kubelet[3392]: W0417 23:46:04.891274 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.891957 kubelet[3392]: E0417 23:46:04.891284 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 23:46:04.892128 kubelet[3392]: E0417 23:46:04.892018 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.892128 kubelet[3392]: W0417 23:46:04.892029 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.892128 kubelet[3392]: E0417 23:46:04.892042 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 23:46:04.892411 kubelet[3392]: E0417 23:46:04.892394 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.892541 kubelet[3392]: W0417 23:46:04.892409 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.892541 kubelet[3392]: E0417 23:46:04.892431 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 23:46:04.892746 kubelet[3392]: E0417 23:46:04.892734 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.892810 kubelet[3392]: W0417 23:46:04.892793 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.892908 kubelet[3392]: E0417 23:46:04.892811 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 23:46:04.893097 kubelet[3392]: E0417 23:46:04.893083 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.893180 kubelet[3392]: W0417 23:46:04.893159 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.893243 kubelet[3392]: E0417 23:46:04.893182 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 23:46:04.893452 kubelet[3392]: E0417 23:46:04.893436 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.893452 kubelet[3392]: W0417 23:46:04.893451 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.893577 kubelet[3392]: E0417 23:46:04.893472 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 23:46:04.893766 kubelet[3392]: E0417 23:46:04.893748 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.893766 kubelet[3392]: W0417 23:46:04.893763 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.893896 kubelet[3392]: E0417 23:46:04.893777 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 23:46:04.906841 kubelet[3392]: E0417 23:46:04.906821 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.906841 kubelet[3392]: W0417 23:46:04.906837 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.906972 kubelet[3392]: E0417 23:46:04.906862 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 23:46:04.989834 kubelet[3392]: E0417 23:46:04.989593 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:04.989834 kubelet[3392]: W0417 23:46:04.989651 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:04.989834 kubelet[3392]: E0417 23:46:04.989674 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 23:46:05.090668 kubelet[3392]: E0417 23:46:05.090633 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:05.090668 kubelet[3392]: W0417 23:46:05.090660 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:05.090889 kubelet[3392]: E0417 23:46:05.090685 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 23:46:05.091075 kubelet[3392]: E0417 23:46:05.091053 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:05.091075 kubelet[3392]: W0417 23:46:05.091071 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:05.091295 kubelet[3392]: E0417 23:46:05.091088 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 23:46:05.091361 kubelet[3392]: E0417 23:46:05.091345 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:05.091414 kubelet[3392]: W0417 23:46:05.091361 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:05.091414 kubelet[3392]: E0417 23:46:05.091377 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 23:46:05.091642 kubelet[3392]: E0417 23:46:05.091625 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:05.091642 kubelet[3392]: W0417 23:46:05.091641 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:05.091789 kubelet[3392]: E0417 23:46:05.091655 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Apr 17 23:46:05.091939 kubelet[3392]: E0417 23:46:05.091922 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Apr 17 23:46:05.092003 kubelet[3392]: W0417 23:46:05.091939 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Apr 17 23:46:05.092003 kubelet[3392]: E0417 23:46:05.091953 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Apr 17 23:46:05.095031 kubelet[3392]: E0417 23:46:05.095009 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Apr 17 23:46:05.095031 kubelet[3392]: W0417 23:46:05.095025 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Apr 17 23:46:05.095031 kubelet[3392]: E0417 23:46:05.095040 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Apr 17 23:46:05.292948 containerd[1834]: time="2026-04-17T23:46:05.292909672Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-n24kb,Uid:b605ae21-564c-48fc-9c81-ef139844d6f8,Namespace:calico-system,Attempt:0,}"
Apr 17 23:46:05.347429 containerd[1834]: time="2026-04-17T23:46:05.347294108Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Apr 17 23:46:05.347429 containerd[1834]: time="2026-04-17T23:46:05.347364009Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Apr 17 23:46:05.347429 containerd[1834]: time="2026-04-17T23:46:05.347421210Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Apr 17 23:46:05.347867 containerd[1834]: time="2026-04-17T23:46:05.347647412Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Apr 17 23:46:05.382673 containerd[1834]: time="2026-04-17T23:46:05.382566856Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-n24kb,Uid:b605ae21-564c-48fc-9c81-ef139844d6f8,Namespace:calico-system,Attempt:0,} returns sandbox id \"7f3041d7a249ecda35607655cdc6a51687e0e0d2b294ee6699b1b6e8d67db523\""
Apr 17 23:46:06.246693 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2956464920.mount: Deactivated successfully.
Apr 17 23:46:06.373764 kubelet[3392]: E0417 23:46:06.372473 3392 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-kdzpd" podUID="4658a59c-dd8b-4cd9-bac7-09b6c58f7e83"
Apr 17 23:46:08.372661 kubelet[3392]: E0417 23:46:08.372546 3392 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-kdzpd" podUID="4658a59c-dd8b-4cd9-bac7-09b6c58f7e83"
Apr 17 23:46:10.371580 kubelet[3392]: E0417 23:46:10.371163 3392 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-kdzpd" podUID="4658a59c-dd8b-4cd9-bac7-09b6c58f7e83"
Apr 17 23:46:11.202016 containerd[1834]: time="2026-04-17T23:46:11.201972680Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 17 23:46:11.206280 containerd[1834]: time="2026-04-17T23:46:11.206216021Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.31.4: active requests=0, bytes read=36107596"
Apr 17 23:46:11.209332 containerd[1834]: time="2026-04-17T23:46:11.209279251Z" level=info msg="ImageCreate event name:\"sha256:46766605472b59b9c16342b2cc74da11f598baa9ba6d1e8b07b3f8ab4f29c55b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 17 23:46:11.222670 containerd[1834]: time="2026-04-17T23:46:11.222610380Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:d9396cfcd63dfcf72a65903042e473bb0bafc0cceb56bd71cd84078498a87130\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 17 23:46:11.223710 containerd[1834]: time="2026-04-17T23:46:11.223372688Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.31.4\" with image id \"sha256:46766605472b59b9c16342b2cc74da11f598baa9ba6d1e8b07b3f8ab4f29c55b\", repo tag \"ghcr.io/flatcar/calico/typha:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:d9396cfcd63dfcf72a65903042e473bb0bafc0cceb56bd71cd84078498a87130\", size \"36107450\" in 6.341082767s"
Apr 17 23:46:11.223710 containerd[1834]: time="2026-04-17T23:46:11.223414388Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.31.4\" returns image reference \"sha256:46766605472b59b9c16342b2cc74da11f598baa9ba6d1e8b07b3f8ab4f29c55b\""
Apr 17 23:46:11.225052 containerd[1834]: time="2026-04-17T23:46:11.224839702Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\""
Apr 17 23:46:11.248609 containerd[1834]: time="2026-04-17T23:46:11.248564533Z" level=info msg="CreateContainer within sandbox \"3681fcca9abb4631e1b82536840d9ede34ac2339ecdea9aebcf12eb2c4cf93cc\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}"
Apr 17 23:46:11.442383 containerd[1834]: time="2026-04-17T23:46:11.442333015Z" level=info msg="CreateContainer within sandbox \"3681fcca9abb4631e1b82536840d9ede34ac2339ecdea9aebcf12eb2c4cf93cc\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"e5f8b0a3ae559669700fdfeeaef43bf1d0290ad928a4845dc03245068210bc72\""
Apr 17 23:46:11.443504 containerd[1834]: time="2026-04-17T23:46:11.443012922Z" level=info msg="StartContainer for \"e5f8b0a3ae559669700fdfeeaef43bf1d0290ad928a4845dc03245068210bc72\""
Apr 17 23:46:11.537406 containerd[1834]: time="2026-04-17T23:46:11.537280837Z" level=info msg="StartContainer for \"e5f8b0a3ae559669700fdfeeaef43bf1d0290ad928a4845dc03245068210bc72\" returns successfully"
Apr 17 23:46:12.023524 kubelet[3392]: E0417 23:46:12.023390 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Apr 17 23:46:12.023524 kubelet[3392]: W0417 23:46:12.023418 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Apr 17 23:46:12.023524 kubelet[3392]: E0417 23:46:12.023443 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 23:46:12.024318 kubelet[3392]: E0417 23:46:12.024296 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:12.024318 kubelet[3392]: W0417 23:46:12.024315 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:12.024477 kubelet[3392]: E0417 23:46:12.024331 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 23:46:12.024603 kubelet[3392]: E0417 23:46:12.024587 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:12.024677 kubelet[3392]: W0417 23:46:12.024605 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:12.024677 kubelet[3392]: E0417 23:46:12.024619 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 23:46:12.024896 kubelet[3392]: E0417 23:46:12.024878 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:12.024896 kubelet[3392]: W0417 23:46:12.024894 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:12.024989 kubelet[3392]: E0417 23:46:12.024908 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 23:46:12.025148 kubelet[3392]: E0417 23:46:12.025130 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:12.025148 kubelet[3392]: W0417 23:46:12.025145 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:12.025280 kubelet[3392]: E0417 23:46:12.025159 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 23:46:12.025385 kubelet[3392]: E0417 23:46:12.025370 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:12.025385 kubelet[3392]: W0417 23:46:12.025383 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:12.025494 kubelet[3392]: E0417 23:46:12.025396 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 23:46:12.025620 kubelet[3392]: E0417 23:46:12.025604 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:12.025681 kubelet[3392]: W0417 23:46:12.025618 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:12.025681 kubelet[3392]: E0417 23:46:12.025632 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 23:46:12.025842 kubelet[3392]: E0417 23:46:12.025827 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:12.025842 kubelet[3392]: W0417 23:46:12.025840 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:12.025927 kubelet[3392]: E0417 23:46:12.025854 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 23:46:12.026082 kubelet[3392]: E0417 23:46:12.026066 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:12.026082 kubelet[3392]: W0417 23:46:12.026080 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:12.026212 kubelet[3392]: E0417 23:46:12.026094 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 23:46:12.026314 kubelet[3392]: E0417 23:46:12.026299 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:12.026314 kubelet[3392]: W0417 23:46:12.026312 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:12.026427 kubelet[3392]: E0417 23:46:12.026325 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 23:46:12.026567 kubelet[3392]: E0417 23:46:12.026550 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:12.026567 kubelet[3392]: W0417 23:46:12.026564 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:12.026685 kubelet[3392]: E0417 23:46:12.026579 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 23:46:12.026843 kubelet[3392]: E0417 23:46:12.026827 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:12.026843 kubelet[3392]: W0417 23:46:12.026841 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:12.026935 kubelet[3392]: E0417 23:46:12.026854 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 23:46:12.027082 kubelet[3392]: E0417 23:46:12.027064 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:12.027082 kubelet[3392]: W0417 23:46:12.027080 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:12.027205 kubelet[3392]: E0417 23:46:12.027093 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 23:46:12.027310 kubelet[3392]: E0417 23:46:12.027295 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:12.027310 kubelet[3392]: W0417 23:46:12.027308 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:12.027428 kubelet[3392]: E0417 23:46:12.027321 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 23:46:12.027607 kubelet[3392]: E0417 23:46:12.027580 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:12.027607 kubelet[3392]: W0417 23:46:12.027594 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:12.027729 kubelet[3392]: E0417 23:46:12.027618 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 23:46:12.041324 kubelet[3392]: E0417 23:46:12.041295 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:12.041324 kubelet[3392]: W0417 23:46:12.041317 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:12.041571 kubelet[3392]: E0417 23:46:12.041339 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 23:46:12.041677 kubelet[3392]: E0417 23:46:12.041661 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:12.041739 kubelet[3392]: W0417 23:46:12.041677 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:12.041739 kubelet[3392]: E0417 23:46:12.041692 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 23:46:12.041984 kubelet[3392]: E0417 23:46:12.041967 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:12.041984 kubelet[3392]: W0417 23:46:12.041981 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:12.042097 kubelet[3392]: E0417 23:46:12.041995 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 23:46:12.042295 kubelet[3392]: E0417 23:46:12.042275 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:12.042295 kubelet[3392]: W0417 23:46:12.042293 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:12.042467 kubelet[3392]: E0417 23:46:12.042308 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 23:46:12.042565 kubelet[3392]: E0417 23:46:12.042546 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:12.042611 kubelet[3392]: W0417 23:46:12.042565 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:12.042611 kubelet[3392]: E0417 23:46:12.042580 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 23:46:12.042796 kubelet[3392]: E0417 23:46:12.042781 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:12.042796 kubelet[3392]: W0417 23:46:12.042795 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:12.042948 kubelet[3392]: E0417 23:46:12.042808 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 23:46:12.043046 kubelet[3392]: E0417 23:46:12.043030 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:12.043046 kubelet[3392]: W0417 23:46:12.043044 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:12.043178 kubelet[3392]: E0417 23:46:12.043058 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 23:46:12.043294 kubelet[3392]: E0417 23:46:12.043277 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:12.043294 kubelet[3392]: W0417 23:46:12.043292 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:12.043400 kubelet[3392]: E0417 23:46:12.043306 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 23:46:12.043603 kubelet[3392]: E0417 23:46:12.043585 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:12.043603 kubelet[3392]: W0417 23:46:12.043600 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:12.043695 kubelet[3392]: E0417 23:46:12.043614 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 23:46:12.043997 kubelet[3392]: E0417 23:46:12.043981 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:12.043997 kubelet[3392]: W0417 23:46:12.043994 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:12.044150 kubelet[3392]: E0417 23:46:12.044008 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 23:46:12.044239 kubelet[3392]: E0417 23:46:12.044224 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:12.044239 kubelet[3392]: W0417 23:46:12.044237 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:12.044380 kubelet[3392]: E0417 23:46:12.044251 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 23:46:12.044461 kubelet[3392]: E0417 23:46:12.044442 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:12.044461 kubelet[3392]: W0417 23:46:12.044458 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:12.044599 kubelet[3392]: E0417 23:46:12.044472 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 23:46:12.044728 kubelet[3392]: E0417 23:46:12.044714 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:12.044728 kubelet[3392]: W0417 23:46:12.044727 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:12.045099 kubelet[3392]: E0417 23:46:12.044740 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 23:46:12.045099 kubelet[3392]: E0417 23:46:12.044938 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:12.045099 kubelet[3392]: W0417 23:46:12.044948 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:12.045099 kubelet[3392]: E0417 23:46:12.044962 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 23:46:12.045288 kubelet[3392]: E0417 23:46:12.045134 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:12.045288 kubelet[3392]: W0417 23:46:12.045143 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:12.045288 kubelet[3392]: E0417 23:46:12.045155 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 23:46:12.045407 kubelet[3392]: E0417 23:46:12.045369 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:12.045407 kubelet[3392]: W0417 23:46:12.045378 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:12.045407 kubelet[3392]: E0417 23:46:12.045389 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 23:46:12.045923 kubelet[3392]: E0417 23:46:12.045863 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:12.045923 kubelet[3392]: W0417 23:46:12.045880 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:12.045923 kubelet[3392]: E0417 23:46:12.045894 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 23:46:12.046142 kubelet[3392]: E0417 23:46:12.046123 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:12.046142 kubelet[3392]: W0417 23:46:12.046139 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:12.046219 kubelet[3392]: E0417 23:46:12.046152 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 23:46:12.372940 kubelet[3392]: E0417 23:46:12.371993 3392 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-kdzpd" podUID="4658a59c-dd8b-4cd9-bac7-09b6c58f7e83" Apr 17 23:46:12.993976 kubelet[3392]: I0417 23:46:12.993943 3392 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 17 23:46:13.035173 kubelet[3392]: E0417 23:46:13.035130 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:13.035173 kubelet[3392]: W0417 23:46:13.035160 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:13.035955 kubelet[3392]: E0417 23:46:13.035189 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 23:46:13.035955 kubelet[3392]: E0417 23:46:13.035433 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:13.035955 kubelet[3392]: W0417 23:46:13.035446 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:13.035955 kubelet[3392]: E0417 23:46:13.035464 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 23:46:13.035955 kubelet[3392]: E0417 23:46:13.035709 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:13.035955 kubelet[3392]: W0417 23:46:13.035720 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:13.035955 kubelet[3392]: E0417 23:46:13.035733 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 23:46:13.035955 kubelet[3392]: E0417 23:46:13.035936 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:13.035955 kubelet[3392]: W0417 23:46:13.035947 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:13.036407 kubelet[3392]: E0417 23:46:13.035962 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 23:46:13.036407 kubelet[3392]: E0417 23:46:13.036163 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:13.036407 kubelet[3392]: W0417 23:46:13.036175 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:13.036407 kubelet[3392]: E0417 23:46:13.036187 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 23:46:13.036407 kubelet[3392]: E0417 23:46:13.036370 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:13.036407 kubelet[3392]: W0417 23:46:13.036380 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:13.036407 kubelet[3392]: E0417 23:46:13.036392 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 23:46:13.036778 kubelet[3392]: E0417 23:46:13.036622 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:13.036778 kubelet[3392]: W0417 23:46:13.036634 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:13.036778 kubelet[3392]: E0417 23:46:13.036646 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 23:46:13.036914 kubelet[3392]: E0417 23:46:13.036859 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:13.036914 kubelet[3392]: W0417 23:46:13.036869 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:13.036914 kubelet[3392]: E0417 23:46:13.036881 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 23:46:13.037094 kubelet[3392]: E0417 23:46:13.037077 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:13.037094 kubelet[3392]: W0417 23:46:13.037089 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:13.037215 kubelet[3392]: E0417 23:46:13.037102 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 23:46:13.037307 kubelet[3392]: E0417 23:46:13.037289 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:13.037307 kubelet[3392]: W0417 23:46:13.037303 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:13.037408 kubelet[3392]: E0417 23:46:13.037316 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 23:46:13.037540 kubelet[3392]: E0417 23:46:13.037524 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:13.037540 kubelet[3392]: W0417 23:46:13.037538 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:13.037646 kubelet[3392]: E0417 23:46:13.037552 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 23:46:13.037840 kubelet[3392]: E0417 23:46:13.037822 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:13.037840 kubelet[3392]: W0417 23:46:13.037837 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:13.037975 kubelet[3392]: E0417 23:46:13.037852 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 23:46:13.038083 kubelet[3392]: E0417 23:46:13.038068 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:13.038137 kubelet[3392]: W0417 23:46:13.038082 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:13.038137 kubelet[3392]: E0417 23:46:13.038095 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 23:46:13.038350 kubelet[3392]: E0417 23:46:13.038332 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:13.038350 kubelet[3392]: W0417 23:46:13.038347 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:13.038458 kubelet[3392]: E0417 23:46:13.038361 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 23:46:13.038651 kubelet[3392]: E0417 23:46:13.038633 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:13.038651 kubelet[3392]: W0417 23:46:13.038648 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:13.038753 kubelet[3392]: E0417 23:46:13.038663 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 23:46:13.049036 kubelet[3392]: E0417 23:46:13.049013 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:13.049036 kubelet[3392]: W0417 23:46:13.049032 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:13.049177 kubelet[3392]: E0417 23:46:13.049051 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 23:46:13.049363 kubelet[3392]: E0417 23:46:13.049343 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:13.049363 kubelet[3392]: W0417 23:46:13.049360 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:13.049493 kubelet[3392]: E0417 23:46:13.049376 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 23:46:13.049689 kubelet[3392]: E0417 23:46:13.049671 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:13.049689 kubelet[3392]: W0417 23:46:13.049686 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:13.049805 kubelet[3392]: E0417 23:46:13.049700 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 23:46:13.049990 kubelet[3392]: E0417 23:46:13.049974 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:13.049990 kubelet[3392]: W0417 23:46:13.049988 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:13.050173 kubelet[3392]: E0417 23:46:13.050002 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 23:46:13.050267 kubelet[3392]: E0417 23:46:13.050250 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:13.050267 kubelet[3392]: W0417 23:46:13.050264 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:13.050363 kubelet[3392]: E0417 23:46:13.050278 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 23:46:13.050538 kubelet[3392]: E0417 23:46:13.050520 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:13.050538 kubelet[3392]: W0417 23:46:13.050537 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:13.050658 kubelet[3392]: E0417 23:46:13.050552 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 23:46:13.050817 kubelet[3392]: E0417 23:46:13.050800 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:13.050817 kubelet[3392]: W0417 23:46:13.050814 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:13.050934 kubelet[3392]: E0417 23:46:13.050828 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 23:46:13.051086 kubelet[3392]: E0417 23:46:13.051069 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:13.051086 kubelet[3392]: W0417 23:46:13.051084 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:13.051208 kubelet[3392]: E0417 23:46:13.051097 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 23:46:13.051381 kubelet[3392]: E0417 23:46:13.051365 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:13.051381 kubelet[3392]: W0417 23:46:13.051381 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:13.051526 kubelet[3392]: E0417 23:46:13.051394 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 23:46:13.051973 kubelet[3392]: E0417 23:46:13.051952 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:13.051973 kubelet[3392]: W0417 23:46:13.051965 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:13.052068 kubelet[3392]: E0417 23:46:13.051982 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 23:46:13.052510 kubelet[3392]: E0417 23:46:13.052464 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:13.052510 kubelet[3392]: W0417 23:46:13.052504 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:13.052655 kubelet[3392]: E0417 23:46:13.052519 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 23:46:13.052750 kubelet[3392]: E0417 23:46:13.052732 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:13.052750 kubelet[3392]: W0417 23:46:13.052748 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:13.052950 kubelet[3392]: E0417 23:46:13.052761 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 23:46:13.053195 kubelet[3392]: E0417 23:46:13.053064 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:13.053195 kubelet[3392]: W0417 23:46:13.053074 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:13.053195 kubelet[3392]: E0417 23:46:13.053084 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 23:46:13.053394 kubelet[3392]: E0417 23:46:13.053374 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:13.053450 kubelet[3392]: W0417 23:46:13.053392 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:13.053450 kubelet[3392]: E0417 23:46:13.053423 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 23:46:13.053698 kubelet[3392]: E0417 23:46:13.053678 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:13.053698 kubelet[3392]: W0417 23:46:13.053694 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:13.053833 kubelet[3392]: E0417 23:46:13.053708 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 23:46:13.054178 kubelet[3392]: E0417 23:46:13.054162 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:13.054178 kubelet[3392]: W0417 23:46:13.054176 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:13.054323 kubelet[3392]: E0417 23:46:13.054189 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 23:46:13.054473 kubelet[3392]: E0417 23:46:13.054455 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:13.054581 kubelet[3392]: W0417 23:46:13.054473 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:13.054581 kubelet[3392]: E0417 23:46:13.054529 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 17 23:46:13.054928 kubelet[3392]: E0417 23:46:13.054910 3392 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 17 23:46:13.054928 kubelet[3392]: W0417 23:46:13.054924 3392 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 17 23:46:13.055021 kubelet[3392]: E0417 23:46:13.054938 3392 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 17 23:46:13.987611 containerd[1834]: time="2026-04-17T23:46:13.987560742Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 17 23:46:13.990976 containerd[1834]: time="2026-04-17T23:46:13.990496170Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4: active requests=0, bytes read=4630250" Apr 17 23:46:14.033521 containerd[1834]: time="2026-04-17T23:46:14.033410087Z" level=info msg="ImageCreate event name:\"sha256:a6ea0cf732d820506ae9f1d7e7433a14009026b894fbbb8f346b9a5f5335c47e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 17 23:46:14.039510 containerd[1834]: time="2026-04-17T23:46:14.039401345Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:5fa3492ac4dfef9cc34fe70a51289118e1f715a89133ea730eef81ad789dadbc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 17 23:46:14.040822 containerd[1834]: time="2026-04-17T23:46:14.040182453Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\" with image id \"sha256:a6ea0cf732d820506ae9f1d7e7433a14009026b894fbbb8f346b9a5f5335c47e\", repo tag 
\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:5fa3492ac4dfef9cc34fe70a51289118e1f715a89133ea730eef81ad789dadbc\", size \"6186255\" in 2.815307851s" Apr 17 23:46:14.040822 containerd[1834]: time="2026-04-17T23:46:14.040229253Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\" returns image reference \"sha256:a6ea0cf732d820506ae9f1d7e7433a14009026b894fbbb8f346b9a5f5335c47e\"" Apr 17 23:46:14.086935 containerd[1834]: time="2026-04-17T23:46:14.086850006Z" level=info msg="CreateContainer within sandbox \"7f3041d7a249ecda35607655cdc6a51687e0e0d2b294ee6699b1b6e8d67db523\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Apr 17 23:46:14.480334 kubelet[3392]: E0417 23:46:14.371114 3392 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-kdzpd" podUID="4658a59c-dd8b-4cd9-bac7-09b6c58f7e83" Apr 17 23:46:14.549891 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2117994458.mount: Deactivated successfully. 
Apr 17 23:46:14.644994 containerd[1834]: time="2026-04-17T23:46:14.644943728Z" level=info msg="CreateContainer within sandbox \"7f3041d7a249ecda35607655cdc6a51687e0e0d2b294ee6699b1b6e8d67db523\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"4529962e4601f11664b60b5b70a0fdbdee012e76177c084acca9e90af3b73c9e\"" Apr 17 23:46:14.645649 containerd[1834]: time="2026-04-17T23:46:14.645588234Z" level=info msg="StartContainer for \"4529962e4601f11664b60b5b70a0fdbdee012e76177c084acca9e90af3b73c9e\"" Apr 17 23:46:14.717413 containerd[1834]: time="2026-04-17T23:46:14.717357232Z" level=info msg="StartContainer for \"4529962e4601f11664b60b5b70a0fdbdee012e76177c084acca9e90af3b73c9e\" returns successfully" Apr 17 23:46:14.746666 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-4529962e4601f11664b60b5b70a0fdbdee012e76177c084acca9e90af3b73c9e-rootfs.mount: Deactivated successfully. Apr 17 23:46:15.017505 kubelet[3392]: I0417 23:46:15.017234 3392 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-8645c4df75-j2m2k" podStartSLOduration=5.674241959 podStartE2EDuration="12.017212545s" podCreationTimestamp="2026-04-17 23:46:03 +0000 UTC" firstStartedPulling="2026-04-17 23:46:04.881638414 +0000 UTC m=+18.607604456" lastFinishedPulling="2026-04-17 23:46:11.2246091 +0000 UTC m=+24.950575042" observedRunningTime="2026-04-17 23:46:12.00587869 +0000 UTC m=+25.731844632" watchObservedRunningTime="2026-04-17 23:46:15.017212545 +0000 UTC m=+28.743178587" Apr 17 23:46:15.162852 kubelet[3392]: I0417 23:46:15.162696 3392 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 17 23:46:16.371964 kubelet[3392]: E0417 23:46:16.371527 3392 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" 
pod="calico-system/csi-node-driver-kdzpd" podUID="4658a59c-dd8b-4cd9-bac7-09b6c58f7e83" Apr 17 23:46:17.085392 containerd[1834]: time="2026-04-17T23:46:17.085333534Z" level=error msg="collecting metrics for 4529962e4601f11664b60b5b70a0fdbdee012e76177c084acca9e90af3b73c9e" error="cgroups: cgroup deleted: unknown" Apr 17 23:46:18.373158 kubelet[3392]: E0417 23:46:18.372040 3392 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-kdzpd" podUID="4658a59c-dd8b-4cd9-bac7-09b6c58f7e83" Apr 17 23:46:19.186288 containerd[1834]: time="2026-04-17T23:46:19.186207201Z" level=info msg="shim disconnected" id=4529962e4601f11664b60b5b70a0fdbdee012e76177c084acca9e90af3b73c9e namespace=k8s.io Apr 17 23:46:19.186288 containerd[1834]: time="2026-04-17T23:46:19.186276401Z" level=warning msg="cleaning up after shim disconnected" id=4529962e4601f11664b60b5b70a0fdbdee012e76177c084acca9e90af3b73c9e namespace=k8s.io Apr 17 23:46:19.186288 containerd[1834]: time="2026-04-17T23:46:19.186288502Z" level=info msg="cleaning up dead shim" namespace=k8s.io Apr 17 23:46:20.009509 containerd[1834]: time="2026-04-17T23:46:20.009326993Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.31.4\"" Apr 17 23:46:20.372875 kubelet[3392]: E0417 23:46:20.372028 3392 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-kdzpd" podUID="4658a59c-dd8b-4cd9-bac7-09b6c58f7e83" Apr 17 23:46:22.372202 kubelet[3392]: E0417 23:46:22.371398 3392 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-kdzpd" podUID="4658a59c-dd8b-4cd9-bac7-09b6c58f7e83" Apr 17 23:46:24.372148 kubelet[3392]: E0417 23:46:24.371256 3392 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-kdzpd" podUID="4658a59c-dd8b-4cd9-bac7-09b6c58f7e83" Apr 17 23:46:26.372626 kubelet[3392]: E0417 23:46:26.372557 3392 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-kdzpd" podUID="4658a59c-dd8b-4cd9-bac7-09b6c58f7e83" Apr 17 23:46:27.912655 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3913193313.mount: Deactivated successfully. 
Apr 17 23:46:27.954614 containerd[1834]: time="2026-04-17T23:46:27.954559942Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 17 23:46:27.957753 containerd[1834]: time="2026-04-17T23:46:27.957670165Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.31.4: active requests=0, bytes read=159838564" Apr 17 23:46:27.961183 containerd[1834]: time="2026-04-17T23:46:27.961131690Z" level=info msg="ImageCreate event name:\"sha256:e6536b93706eda782f82ebadcac3559cb61801d09f982cc0533a134e6a8e1acf\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 17 23:46:27.968351 containerd[1834]: time="2026-04-17T23:46:27.968303842Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:22b9d32dc7480c96272121d5682d53424c6e58653c60fa869b61a1758a11d77f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 17 23:46:27.969097 containerd[1834]: time="2026-04-17T23:46:27.968948147Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.31.4\" with image id \"sha256:e6536b93706eda782f82ebadcac3559cb61801d09f982cc0533a134e6a8e1acf\", repo tag \"ghcr.io/flatcar/calico/node:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/node@sha256:22b9d32dc7480c96272121d5682d53424c6e58653c60fa869b61a1758a11d77f\", size \"159838426\" in 7.959577254s" Apr 17 23:46:27.969097 containerd[1834]: time="2026-04-17T23:46:27.968987447Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.31.4\" returns image reference \"sha256:e6536b93706eda782f82ebadcac3559cb61801d09f982cc0533a134e6a8e1acf\"" Apr 17 23:46:27.977203 containerd[1834]: time="2026-04-17T23:46:27.977162307Z" level=info msg="CreateContainer within sandbox \"7f3041d7a249ecda35607655cdc6a51687e0e0d2b294ee6699b1b6e8d67db523\" for container &ContainerMetadata{Name:ebpf-bootstrap,Attempt:0,}" Apr 17 23:46:28.015454 containerd[1834]: time="2026-04-17T23:46:28.015413587Z" level=info 
msg="CreateContainer within sandbox \"7f3041d7a249ecda35607655cdc6a51687e0e0d2b294ee6699b1b6e8d67db523\" for &ContainerMetadata{Name:ebpf-bootstrap,Attempt:0,} returns container id \"99e192691542ac69de5bb75a5ee05fa38f46214c66779fa2b2b8fa8e396c84a5\"" Apr 17 23:46:28.017269 containerd[1834]: time="2026-04-17T23:46:28.015934391Z" level=info msg="StartContainer for \"99e192691542ac69de5bb75a5ee05fa38f46214c66779fa2b2b8fa8e396c84a5\"" Apr 17 23:46:28.079801 containerd[1834]: time="2026-04-17T23:46:28.079745558Z" level=info msg="StartContainer for \"99e192691542ac69de5bb75a5ee05fa38f46214c66779fa2b2b8fa8e396c84a5\" returns successfully" Apr 17 23:46:28.372865 kubelet[3392]: E0417 23:46:28.372000 3392 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-kdzpd" podUID="4658a59c-dd8b-4cd9-bac7-09b6c58f7e83" Apr 17 23:46:28.909566 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-99e192691542ac69de5bb75a5ee05fa38f46214c66779fa2b2b8fa8e396c84a5-rootfs.mount: Deactivated successfully. 
Apr 17 23:46:30.371961 kubelet[3392]: E0417 23:46:30.371044 3392 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-kdzpd" podUID="4658a59c-dd8b-4cd9-bac7-09b6c58f7e83" Apr 17 23:46:31.402936 containerd[1834]: time="2026-04-17T23:46:31.402864347Z" level=info msg="shim disconnected" id=99e192691542ac69de5bb75a5ee05fa38f46214c66779fa2b2b8fa8e396c84a5 namespace=k8s.io Apr 17 23:46:31.402936 containerd[1834]: time="2026-04-17T23:46:31.402928847Z" level=warning msg="cleaning up after shim disconnected" id=99e192691542ac69de5bb75a5ee05fa38f46214c66779fa2b2b8fa8e396c84a5 namespace=k8s.io Apr 17 23:46:31.402936 containerd[1834]: time="2026-04-17T23:46:31.402940347Z" level=info msg="cleaning up dead shim" namespace=k8s.io Apr 17 23:46:32.040521 containerd[1834]: time="2026-04-17T23:46:32.038640490Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.31.4\"" Apr 17 23:46:32.371683 kubelet[3392]: E0417 23:46:32.371372 3392 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-kdzpd" podUID="4658a59c-dd8b-4cd9-bac7-09b6c58f7e83" Apr 17 23:46:34.371602 kubelet[3392]: E0417 23:46:34.371548 3392 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-kdzpd" podUID="4658a59c-dd8b-4cd9-bac7-09b6c58f7e83" Apr 17 23:46:36.371738 kubelet[3392]: E0417 23:46:36.371552 3392 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-kdzpd" podUID="4658a59c-dd8b-4cd9-bac7-09b6c58f7e83" Apr 17 23:46:37.689002 containerd[1834]: time="2026-04-17T23:46:37.688950492Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 17 23:46:37.697368 containerd[1834]: time="2026-04-17T23:46:37.697181266Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.31.4: active requests=0, bytes read=70611671" Apr 17 23:46:37.703214 containerd[1834]: time="2026-04-17T23:46:37.703012519Z" level=info msg="ImageCreate event name:\"sha256:c433a27dd94ce9242338eece49f11629412dd42552fed314746fcf16ea958b2b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 17 23:46:37.713257 containerd[1834]: time="2026-04-17T23:46:37.713205311Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:f1c5d9a6df01061c5faec4c4b59fb9ba69f8f5164b51e01ea8daa8e373111a04\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 17 23:46:37.714391 containerd[1834]: time="2026-04-17T23:46:37.714012819Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.31.4\" with image id \"sha256:c433a27dd94ce9242338eece49f11629412dd42552fed314746fcf16ea958b2b\", repo tag \"ghcr.io/flatcar/calico/cni:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:f1c5d9a6df01061c5faec4c4b59fb9ba69f8f5164b51e01ea8daa8e373111a04\", size \"72167716\" in 5.673864026s" Apr 17 23:46:37.714391 containerd[1834]: time="2026-04-17T23:46:37.714050919Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.31.4\" returns image reference \"sha256:c433a27dd94ce9242338eece49f11629412dd42552fed314746fcf16ea958b2b\"" Apr 17 23:46:37.724328 containerd[1834]: time="2026-04-17T23:46:37.724285612Z" level=info msg="CreateContainer within sandbox 
\"7f3041d7a249ecda35607655cdc6a51687e0e0d2b294ee6699b1b6e8d67db523\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Apr 17 23:46:37.766179 containerd[1834]: time="2026-04-17T23:46:37.766144391Z" level=info msg="CreateContainer within sandbox \"7f3041d7a249ecda35607655cdc6a51687e0e0d2b294ee6699b1b6e8d67db523\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"9a16d149e6917ccd1f4cdc1aadec1fc6b0b22fa87cc1d8ef170c662aa40b2f66\"" Apr 17 23:46:37.766749 containerd[1834]: time="2026-04-17T23:46:37.766644895Z" level=info msg="StartContainer for \"9a16d149e6917ccd1f4cdc1aadec1fc6b0b22fa87cc1d8ef170c662aa40b2f66\"" Apr 17 23:46:37.829891 containerd[1834]: time="2026-04-17T23:46:37.829818767Z" level=info msg="StartContainer for \"9a16d149e6917ccd1f4cdc1aadec1fc6b0b22fa87cc1d8ef170c662aa40b2f66\" returns successfully" Apr 17 23:46:38.372269 kubelet[3392]: E0417 23:46:38.372147 3392 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-kdzpd" podUID="4658a59c-dd8b-4cd9-bac7-09b6c58f7e83" Apr 17 23:46:40.372251 kubelet[3392]: E0417 23:46:40.371331 3392 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-kdzpd" podUID="4658a59c-dd8b-4cd9-bac7-09b6c58f7e83" Apr 17 23:46:42.372622 kubelet[3392]: E0417 23:46:42.371765 3392 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-kdzpd" 
podUID="4658a59c-dd8b-4cd9-bac7-09b6c58f7e83" Apr 17 23:46:44.372948 kubelet[3392]: E0417 23:46:44.372074 3392 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-kdzpd" podUID="4658a59c-dd8b-4cd9-bac7-09b6c58f7e83" Apr 17 23:46:44.854531 containerd[1834]: time="2026-04-17T23:46:44.854415014Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Apr 17 23:46:44.882057 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-9a16d149e6917ccd1f4cdc1aadec1fc6b0b22fa87cc1d8ef170c662aa40b2f66-rootfs.mount: Deactivated successfully. Apr 17 23:46:44.921763 kubelet[3392]: I0417 23:46:44.921731 3392 kubelet_node_status.go:501] "Fast updating node status as it just became ready" Apr 17 23:46:51.144183 containerd[1834]: time="2026-04-17T23:46:51.144113415Z" level=info msg="shim disconnected" id=9a16d149e6917ccd1f4cdc1aadec1fc6b0b22fa87cc1d8ef170c662aa40b2f66 namespace=k8s.io Apr 17 23:46:51.144183 containerd[1834]: time="2026-04-17T23:46:51.144182116Z" level=warning msg="cleaning up after shim disconnected" id=9a16d149e6917ccd1f4cdc1aadec1fc6b0b22fa87cc1d8ef170c662aa40b2f66 namespace=k8s.io Apr 17 23:46:51.144183 containerd[1834]: time="2026-04-17T23:46:51.144194616Z" level=info msg="cleaning up dead shim" namespace=k8s.io Apr 17 23:46:51.159125 containerd[1834]: time="2026-04-17T23:46:51.156917113Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-kdzpd,Uid:4658a59c-dd8b-4cd9-bac7-09b6c58f7e83,Namespace:calico-system,Attempt:0,}" Apr 17 23:46:51.208567 kubelet[3392]: I0417 23:46:51.208059 3392 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c6a1e0ff-d681-42ba-8cbb-c8e3f8849e1f-tigera-ca-bundle\") pod \"calico-kube-controllers-7479bd7d89-zgl8b\" (UID: \"c6a1e0ff-d681-42ba-8cbb-c8e3f8849e1f\") " pod="calico-system/calico-kube-controllers-7479bd7d89-zgl8b" Apr 17 23:46:51.208567 kubelet[3392]: I0417 23:46:51.208125 3392 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4c94a196-84f1-4d7e-892d-1fd10da74241-config-volume\") pod \"coredns-674b8bbfcf-dq7vb\" (UID: \"4c94a196-84f1-4d7e-892d-1fd10da74241\") " pod="kube-system/coredns-674b8bbfcf-dq7vb" Apr 17 23:46:51.208567 kubelet[3392]: I0417 23:46:51.208154 3392 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5z57v\" (UniqueName: \"kubernetes.io/projected/4c94a196-84f1-4d7e-892d-1fd10da74241-kube-api-access-5z57v\") pod \"coredns-674b8bbfcf-dq7vb\" (UID: \"4c94a196-84f1-4d7e-892d-1fd10da74241\") " pod="kube-system/coredns-674b8bbfcf-dq7vb" Apr 17 23:46:51.208567 kubelet[3392]: I0417 23:46:51.208179 3392 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/8636a294-eef0-499d-a578-9b4ce7de9cb5-calico-apiserver-certs\") pod \"calico-apiserver-568984f78-z9wh9\" (UID: \"8636a294-eef0-499d-a578-9b4ce7de9cb5\") " pod="calico-system/calico-apiserver-568984f78-z9wh9" Apr 17 23:46:51.208567 kubelet[3392]: I0417 23:46:51.208243 3392 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2nchf\" (UniqueName: \"kubernetes.io/projected/499d7392-3858-4796-8ae5-2602f95a1e6c-kube-api-access-2nchf\") pod \"calico-apiserver-568984f78-qbl55\" (UID: \"499d7392-3858-4796-8ae5-2602f95a1e6c\") " 
pod="calico-system/calico-apiserver-568984f78-qbl55" Apr 17 23:46:51.211212 kubelet[3392]: I0417 23:46:51.208269 3392 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbhc6\" (UniqueName: \"kubernetes.io/projected/0607b5b7-e84f-4371-964e-db63a77e29d1-kube-api-access-dbhc6\") pod \"coredns-674b8bbfcf-n7zwr\" (UID: \"0607b5b7-e84f-4371-964e-db63a77e29d1\") " pod="kube-system/coredns-674b8bbfcf-n7zwr" Apr 17 23:46:51.211212 kubelet[3392]: I0417 23:46:51.208291 3392 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnhtc\" (UniqueName: \"kubernetes.io/projected/8636a294-eef0-499d-a578-9b4ce7de9cb5-kube-api-access-bnhtc\") pod \"calico-apiserver-568984f78-z9wh9\" (UID: \"8636a294-eef0-499d-a578-9b4ce7de9cb5\") " pod="calico-system/calico-apiserver-568984f78-z9wh9" Apr 17 23:46:51.211212 kubelet[3392]: I0417 23:46:51.208313 3392 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/499d7392-3858-4796-8ae5-2602f95a1e6c-calico-apiserver-certs\") pod \"calico-apiserver-568984f78-qbl55\" (UID: \"499d7392-3858-4796-8ae5-2602f95a1e6c\") " pod="calico-system/calico-apiserver-568984f78-qbl55" Apr 17 23:46:51.211212 kubelet[3392]: I0417 23:46:51.208339 3392 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0607b5b7-e84f-4371-964e-db63a77e29d1-config-volume\") pod \"coredns-674b8bbfcf-n7zwr\" (UID: \"0607b5b7-e84f-4371-964e-db63a77e29d1\") " pod="kube-system/coredns-674b8bbfcf-n7zwr" Apr 17 23:46:51.211212 kubelet[3392]: I0417 23:46:51.208363 3392 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89zsv\" (UniqueName: 
\"kubernetes.io/projected/c6a1e0ff-d681-42ba-8cbb-c8e3f8849e1f-kube-api-access-89zsv\") pod \"calico-kube-controllers-7479bd7d89-zgl8b\" (UID: \"c6a1e0ff-d681-42ba-8cbb-c8e3f8849e1f\") " pod="calico-system/calico-kube-controllers-7479bd7d89-zgl8b" Apr 17 23:46:51.293217 containerd[1834]: time="2026-04-17T23:46:51.291308389Z" level=error msg="Failed to destroy network for sandbox \"f9e066fdf0793900514dfc2314700c117fb535312d818f5c191e0c164756c75b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 17 23:46:51.293217 containerd[1834]: time="2026-04-17T23:46:51.292601209Z" level=error msg="encountered an error cleaning up failed sandbox \"f9e066fdf0793900514dfc2314700c117fb535312d818f5c191e0c164756c75b\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 17 23:46:51.293217 containerd[1834]: time="2026-04-17T23:46:51.292855813Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-kdzpd,Uid:4658a59c-dd8b-4cd9-bac7-09b6c58f7e83,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"f9e066fdf0793900514dfc2314700c117fb535312d818f5c191e0c164756c75b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 17 23:46:51.295771 kubelet[3392]: E0417 23:46:51.293398 3392 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f9e066fdf0793900514dfc2314700c117fb535312d818f5c191e0c164756c75b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the 
calico/node container is running and has mounted /var/lib/calico/" Apr 17 23:46:51.295771 kubelet[3392]: E0417 23:46:51.293497 3392 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f9e066fdf0793900514dfc2314700c117fb535312d818f5c191e0c164756c75b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-kdzpd" Apr 17 23:46:51.295771 kubelet[3392]: E0417 23:46:51.293555 3392 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f9e066fdf0793900514dfc2314700c117fb535312d818f5c191e0c164756c75b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-kdzpd" Apr 17 23:46:51.295922 kubelet[3392]: E0417 23:46:51.293614 3392 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-kdzpd_calico-system(4658a59c-dd8b-4cd9-bac7-09b6c58f7e83)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-kdzpd_calico-system(4658a59c-dd8b-4cd9-bac7-09b6c58f7e83)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"f9e066fdf0793900514dfc2314700c117fb535312d818f5c191e0c164756c75b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-kdzpd" podUID="4658a59c-dd8b-4cd9-bac7-09b6c58f7e83" Apr 17 23:46:51.298547 systemd[1]: 
run-containerd-io.containerd.grpc.v1.cri-sandboxes-f9e066fdf0793900514dfc2314700c117fb535312d818f5c191e0c164756c75b-shm.mount: Deactivated successfully. Apr 17 23:46:51.311953 kubelet[3392]: I0417 23:46:51.309084 3392 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c2d4922-a2ca-483d-b7db-77418f884573-config\") pod \"goldmane-5b85766d88-d6rv9\" (UID: \"4c2d4922-a2ca-483d-b7db-77418f884573\") " pod="calico-system/goldmane-5b85766d88-d6rv9" Apr 17 23:46:51.311953 kubelet[3392]: I0417 23:46:51.309172 3392 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c2d4922-a2ca-483d-b7db-77418f884573-goldmane-ca-bundle\") pod \"goldmane-5b85766d88-d6rv9\" (UID: \"4c2d4922-a2ca-483d-b7db-77418f884573\") " pod="calico-system/goldmane-5b85766d88-d6rv9" Apr 17 23:46:51.311953 kubelet[3392]: I0417 23:46:51.309221 3392 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/96443d8b-6798-4ea2-b154-8b1611207513-whisker-ca-bundle\") pod \"whisker-655df5d5f-wr76l\" (UID: \"96443d8b-6798-4ea2-b154-8b1611207513\") " pod="calico-system/whisker-655df5d5f-wr76l" Apr 17 23:46:51.311953 kubelet[3392]: I0417 23:46:51.309264 3392 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/4c2d4922-a2ca-483d-b7db-77418f884573-goldmane-key-pair\") pod \"goldmane-5b85766d88-d6rv9\" (UID: \"4c2d4922-a2ca-483d-b7db-77418f884573\") " pod="calico-system/goldmane-5b85766d88-d6rv9" Apr 17 23:46:51.311953 kubelet[3392]: I0417 23:46:51.309317 3392 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6twdv\" (UniqueName: 
\"kubernetes.io/projected/4c2d4922-a2ca-483d-b7db-77418f884573-kube-api-access-6twdv\") pod \"goldmane-5b85766d88-d6rv9\" (UID: \"4c2d4922-a2ca-483d-b7db-77418f884573\") " pod="calico-system/goldmane-5b85766d88-d6rv9" Apr 17 23:46:51.314743 kubelet[3392]: I0417 23:46:51.309369 3392 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/96443d8b-6798-4ea2-b154-8b1611207513-nginx-config\") pod \"whisker-655df5d5f-wr76l\" (UID: \"96443d8b-6798-4ea2-b154-8b1611207513\") " pod="calico-system/whisker-655df5d5f-wr76l" Apr 17 23:46:51.314743 kubelet[3392]: I0417 23:46:51.309397 3392 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/96443d8b-6798-4ea2-b154-8b1611207513-whisker-backend-key-pair\") pod \"whisker-655df5d5f-wr76l\" (UID: \"96443d8b-6798-4ea2-b154-8b1611207513\") " pod="calico-system/whisker-655df5d5f-wr76l" Apr 17 23:46:51.314743 kubelet[3392]: I0417 23:46:51.309421 3392 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjjz6\" (UniqueName: \"kubernetes.io/projected/96443d8b-6798-4ea2-b154-8b1611207513-kube-api-access-cjjz6\") pod \"whisker-655df5d5f-wr76l\" (UID: \"96443d8b-6798-4ea2-b154-8b1611207513\") " pod="calico-system/whisker-655df5d5f-wr76l" Apr 17 23:46:51.457021 containerd[1834]: time="2026-04-17T23:46:51.455642527Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-dq7vb,Uid:4c94a196-84f1-4d7e-892d-1fd10da74241,Namespace:kube-system,Attempt:0,}" Apr 17 23:46:51.470944 containerd[1834]: time="2026-04-17T23:46:51.470904663Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7479bd7d89-zgl8b,Uid:c6a1e0ff-d681-42ba-8cbb-c8e3f8849e1f,Namespace:calico-system,Attempt:0,}" Apr 17 23:46:51.475129 containerd[1834]: 
time="2026-04-17T23:46:51.475090728Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-n7zwr,Uid:0607b5b7-e84f-4371-964e-db63a77e29d1,Namespace:kube-system,Attempt:0,}" Apr 17 23:46:51.485932 containerd[1834]: time="2026-04-17T23:46:51.485884594Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-568984f78-qbl55,Uid:499d7392-3858-4796-8ae5-2602f95a1e6c,Namespace:calico-system,Attempt:0,}" Apr 17 23:46:51.497587 containerd[1834]: time="2026-04-17T23:46:51.497399172Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-568984f78-z9wh9,Uid:8636a294-eef0-499d-a578-9b4ce7de9cb5,Namespace:calico-system,Attempt:0,}" Apr 17 23:46:51.539059 containerd[1834]: time="2026-04-17T23:46:51.539015315Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-655df5d5f-wr76l,Uid:96443d8b-6798-4ea2-b154-8b1611207513,Namespace:calico-system,Attempt:0,}" Apr 17 23:46:51.539331 containerd[1834]: time="2026-04-17T23:46:51.539303020Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-5b85766d88-d6rv9,Uid:4c2d4922-a2ca-483d-b7db-77418f884573,Namespace:calico-system,Attempt:0,}" Apr 17 23:46:52.087765 kubelet[3392]: I0417 23:46:52.085052 3392 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f9e066fdf0793900514dfc2314700c117fb535312d818f5c191e0c164756c75b" Apr 17 23:46:52.088042 containerd[1834]: time="2026-04-17T23:46:52.086220568Z" level=info msg="StopPodSandbox for \"f9e066fdf0793900514dfc2314700c117fb535312d818f5c191e0c164756c75b\"" Apr 17 23:46:52.088042 containerd[1834]: time="2026-04-17T23:46:52.087072981Z" level=info msg="Ensure that sandbox f9e066fdf0793900514dfc2314700c117fb535312d818f5c191e0c164756c75b in task-service has been cleanup successfully" Apr 17 23:46:52.130988 containerd[1834]: time="2026-04-17T23:46:52.130929458Z" level=error msg="StopPodSandbox for \"f9e066fdf0793900514dfc2314700c117fb535312d818f5c191e0c164756c75b\" failed" 
error="failed to destroy network for sandbox \"f9e066fdf0793900514dfc2314700c117fb535312d818f5c191e0c164756c75b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 17 23:46:52.131221 kubelet[3392]: E0417 23:46:52.131180 3392 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"f9e066fdf0793900514dfc2314700c117fb535312d818f5c191e0c164756c75b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="f9e066fdf0793900514dfc2314700c117fb535312d818f5c191e0c164756c75b" Apr 17 23:46:52.131329 kubelet[3392]: E0417 23:46:52.131244 3392 kuberuntime_manager.go:1586] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"f9e066fdf0793900514dfc2314700c117fb535312d818f5c191e0c164756c75b"} Apr 17 23:46:52.131329 kubelet[3392]: E0417 23:46:52.131312 3392 kuberuntime_manager.go:1161] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"4658a59c-dd8b-4cd9-bac7-09b6c58f7e83\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"f9e066fdf0793900514dfc2314700c117fb535312d818f5c191e0c164756c75b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Apr 17 23:46:52.131510 kubelet[3392]: E0417 23:46:52.131340 3392 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"4658a59c-dd8b-4cd9-bac7-09b6c58f7e83\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"f9e066fdf0793900514dfc2314700c117fb535312d818f5c191e0c164756c75b\\\": plugin 
type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-kdzpd" podUID="4658a59c-dd8b-4cd9-bac7-09b6c58f7e83" Apr 17 23:46:52.231370 containerd[1834]: time="2026-04-17T23:46:52.230884702Z" level=info msg="CreateContainer within sandbox \"7f3041d7a249ecda35607655cdc6a51687e0e0d2b294ee6699b1b6e8d67db523\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Apr 17 23:46:52.327033 containerd[1834]: time="2026-04-17T23:46:52.326954486Z" level=error msg="Failed to destroy network for sandbox \"1302e55ca0e7dce85f13e4d756292413493ef50ee486bdc75287349d3425425b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 17 23:46:52.328243 containerd[1834]: time="2026-04-17T23:46:52.328182305Z" level=error msg="encountered an error cleaning up failed sandbox \"1302e55ca0e7dce85f13e4d756292413493ef50ee486bdc75287349d3425425b\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 17 23:46:52.328768 containerd[1834]: time="2026-04-17T23:46:52.328261506Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-dq7vb,Uid:4c94a196-84f1-4d7e-892d-1fd10da74241,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"1302e55ca0e7dce85f13e4d756292413493ef50ee486bdc75287349d3425425b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 17 23:46:52.328853 kubelet[3392]: E0417 23:46:52.328568 3392 log.go:32] "RunPodSandbox from 
runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1302e55ca0e7dce85f13e4d756292413493ef50ee486bdc75287349d3425425b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 17 23:46:52.328853 kubelet[3392]: E0417 23:46:52.328685 3392 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1302e55ca0e7dce85f13e4d756292413493ef50ee486bdc75287349d3425425b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-dq7vb" Apr 17 23:46:52.328853 kubelet[3392]: E0417 23:46:52.328714 3392 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1302e55ca0e7dce85f13e4d756292413493ef50ee486bdc75287349d3425425b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-dq7vb" Apr 17 23:46:52.331631 kubelet[3392]: E0417 23:46:52.330557 3392 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-dq7vb_kube-system(4c94a196-84f1-4d7e-892d-1fd10da74241)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-dq7vb_kube-system(4c94a196-84f1-4d7e-892d-1fd10da74241)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"1302e55ca0e7dce85f13e4d756292413493ef50ee486bdc75287349d3425425b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running 
and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-dq7vb" podUID="4c94a196-84f1-4d7e-892d-1fd10da74241" Apr 17 23:46:52.333590 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-1302e55ca0e7dce85f13e4d756292413493ef50ee486bdc75287349d3425425b-shm.mount: Deactivated successfully. Apr 17 23:46:52.800660 containerd[1834]: time="2026-04-17T23:46:52.800608246Z" level=error msg="Failed to destroy network for sandbox \"83deb6bfb5c72f63a8fcb193d3bee315ea0a6af1ebd6a9fb9ae8c9bed2ea9316\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 17 23:46:52.800964 containerd[1834]: time="2026-04-17T23:46:52.800930749Z" level=error msg="encountered an error cleaning up failed sandbox \"83deb6bfb5c72f63a8fcb193d3bee315ea0a6af1ebd6a9fb9ae8c9bed2ea9316\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 17 23:46:52.801088 containerd[1834]: time="2026-04-17T23:46:52.800989950Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7479bd7d89-zgl8b,Uid:c6a1e0ff-d681-42ba-8cbb-c8e3f8849e1f,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"83deb6bfb5c72f63a8fcb193d3bee315ea0a6af1ebd6a9fb9ae8c9bed2ea9316\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 17 23:46:52.801296 kubelet[3392]: E0417 23:46:52.801228 3392 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"83deb6bfb5c72f63a8fcb193d3bee315ea0a6af1ebd6a9fb9ae8c9bed2ea9316\": plugin type=\"calico\" 
failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 17 23:46:52.801371 kubelet[3392]: E0417 23:46:52.801295 3392 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"83deb6bfb5c72f63a8fcb193d3bee315ea0a6af1ebd6a9fb9ae8c9bed2ea9316\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-7479bd7d89-zgl8b" Apr 17 23:46:52.802128 kubelet[3392]: E0417 23:46:52.801324 3392 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"83deb6bfb5c72f63a8fcb193d3bee315ea0a6af1ebd6a9fb9ae8c9bed2ea9316\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-7479bd7d89-zgl8b" Apr 17 23:46:52.802128 kubelet[3392]: E0417 23:46:52.801822 3392 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-7479bd7d89-zgl8b_calico-system(c6a1e0ff-d681-42ba-8cbb-c8e3f8849e1f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-7479bd7d89-zgl8b_calico-system(c6a1e0ff-d681-42ba-8cbb-c8e3f8849e1f)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"83deb6bfb5c72f63a8fcb193d3bee315ea0a6af1ebd6a9fb9ae8c9bed2ea9316\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-7479bd7d89-zgl8b" 
podUID="c6a1e0ff-d681-42ba-8cbb-c8e3f8849e1f" Apr 17 23:46:52.872068 containerd[1834]: time="2026-04-17T23:46:52.872015222Z" level=info msg="CreateContainer within sandbox \"7f3041d7a249ecda35607655cdc6a51687e0e0d2b294ee6699b1b6e8d67db523\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"08aac514bf7f80b4095867869b4608810d89a76cb0fe897f0421c363f5d91ba3\"" Apr 17 23:46:52.874660 containerd[1834]: time="2026-04-17T23:46:52.873430333Z" level=info msg="StartContainer for \"08aac514bf7f80b4095867869b4608810d89a76cb0fe897f0421c363f5d91ba3\"" Apr 17 23:46:52.981715 containerd[1834]: time="2026-04-17T23:46:52.981644205Z" level=error msg="Failed to destroy network for sandbox \"ed9a494bb1acc5e5490c723caeef10f8b25979c86b59887629e09cba1fa1a6a3\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 17 23:46:52.982061 containerd[1834]: time="2026-04-17T23:46:52.982020508Z" level=error msg="encountered an error cleaning up failed sandbox \"ed9a494bb1acc5e5490c723caeef10f8b25979c86b59887629e09cba1fa1a6a3\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 17 23:46:52.982158 containerd[1834]: time="2026-04-17T23:46:52.982114009Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-n7zwr,Uid:0607b5b7-e84f-4371-964e-db63a77e29d1,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"ed9a494bb1acc5e5490c723caeef10f8b25979c86b59887629e09cba1fa1a6a3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 17 23:46:52.983567 kubelet[3392]: E0417 
23:46:52.982409 3392 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ed9a494bb1acc5e5490c723caeef10f8b25979c86b59887629e09cba1fa1a6a3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 17 23:46:52.983567 kubelet[3392]: E0417 23:46:52.982492 3392 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ed9a494bb1acc5e5490c723caeef10f8b25979c86b59887629e09cba1fa1a6a3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-n7zwr" Apr 17 23:46:52.983567 kubelet[3392]: E0417 23:46:52.982523 3392 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ed9a494bb1acc5e5490c723caeef10f8b25979c86b59887629e09cba1fa1a6a3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-n7zwr" Apr 17 23:46:52.983761 kubelet[3392]: E0417 23:46:52.982581 3392 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-n7zwr_kube-system(0607b5b7-e84f-4371-964e-db63a77e29d1)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-n7zwr_kube-system(0607b5b7-e84f-4371-964e-db63a77e29d1)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ed9a494bb1acc5e5490c723caeef10f8b25979c86b59887629e09cba1fa1a6a3\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or 
directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-n7zwr" podUID="0607b5b7-e84f-4371-964e-db63a77e29d1" Apr 17 23:46:52.990062 containerd[1834]: time="2026-04-17T23:46:52.990017673Z" level=error msg="Failed to destroy network for sandbox \"e8ac919b2d998feb57a6ee2afdd1dafcb9bc7e91ea70046104d604b330ba71f6\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 17 23:46:52.990625 containerd[1834]: time="2026-04-17T23:46:52.990458676Z" level=error msg="encountered an error cleaning up failed sandbox \"e8ac919b2d998feb57a6ee2afdd1dafcb9bc7e91ea70046104d604b330ba71f6\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 17 23:46:52.990625 containerd[1834]: time="2026-04-17T23:46:52.990548277Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-655df5d5f-wr76l,Uid:96443d8b-6798-4ea2-b154-8b1611207513,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"e8ac919b2d998feb57a6ee2afdd1dafcb9bc7e91ea70046104d604b330ba71f6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 17 23:46:52.991964 kubelet[3392]: E0417 23:46:52.991766 3392 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e8ac919b2d998feb57a6ee2afdd1dafcb9bc7e91ea70046104d604b330ba71f6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/" Apr 17 23:46:52.991964 kubelet[3392]: E0417 23:46:52.991833 3392 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e8ac919b2d998feb57a6ee2afdd1dafcb9bc7e91ea70046104d604b330ba71f6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-655df5d5f-wr76l" Apr 17 23:46:52.991964 kubelet[3392]: E0417 23:46:52.991859 3392 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e8ac919b2d998feb57a6ee2afdd1dafcb9bc7e91ea70046104d604b330ba71f6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-655df5d5f-wr76l" Apr 17 23:46:52.992154 kubelet[3392]: E0417 23:46:52.991917 3392 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-655df5d5f-wr76l_calico-system(96443d8b-6798-4ea2-b154-8b1611207513)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-655df5d5f-wr76l_calico-system(96443d8b-6798-4ea2-b154-8b1611207513)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"e8ac919b2d998feb57a6ee2afdd1dafcb9bc7e91ea70046104d604b330ba71f6\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-655df5d5f-wr76l" podUID="96443d8b-6798-4ea2-b154-8b1611207513" Apr 17 23:46:53.008098 containerd[1834]: time="2026-04-17T23:46:53.007245912Z" level=error msg="Failed to destroy network for sandbox 
\"61f8c77427241eebf6cf9dcfcca2b8b9819197e31f16bf933a0aa3dced9b2ad6\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 17 23:46:53.009874 containerd[1834]: time="2026-04-17T23:46:53.008633023Z" level=error msg="encountered an error cleaning up failed sandbox \"61f8c77427241eebf6cf9dcfcca2b8b9819197e31f16bf933a0aa3dced9b2ad6\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 17 23:46:53.009874 containerd[1834]: time="2026-04-17T23:46:53.008704123Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-568984f78-qbl55,Uid:499d7392-3858-4796-8ae5-2602f95a1e6c,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"61f8c77427241eebf6cf9dcfcca2b8b9819197e31f16bf933a0aa3dced9b2ad6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 17 23:46:53.012449 kubelet[3392]: E0417 23:46:53.010973 3392 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"61f8c77427241eebf6cf9dcfcca2b8b9819197e31f16bf933a0aa3dced9b2ad6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 17 23:46:53.012449 kubelet[3392]: E0417 23:46:53.011033 3392 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"61f8c77427241eebf6cf9dcfcca2b8b9819197e31f16bf933a0aa3dced9b2ad6\": plugin type=\"calico\" failed 
(add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-568984f78-qbl55" Apr 17 23:46:53.012449 kubelet[3392]: E0417 23:46:53.011057 3392 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"61f8c77427241eebf6cf9dcfcca2b8b9819197e31f16bf933a0aa3dced9b2ad6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-568984f78-qbl55" Apr 17 23:46:53.012739 kubelet[3392]: E0417 23:46:53.011110 3392 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-568984f78-qbl55_calico-system(499d7392-3858-4796-8ae5-2602f95a1e6c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-568984f78-qbl55_calico-system(499d7392-3858-4796-8ae5-2602f95a1e6c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"61f8c77427241eebf6cf9dcfcca2b8b9819197e31f16bf933a0aa3dced9b2ad6\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-apiserver-568984f78-qbl55" podUID="499d7392-3858-4796-8ae5-2602f95a1e6c" Apr 17 23:46:53.046847 containerd[1834]: time="2026-04-17T23:46:53.046795330Z" level=error msg="Failed to destroy network for sandbox \"900076fec7103fe8baff240fe6ba81cd837129e30ccf09da414a570ede8274ae\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 17 23:46:53.047811 containerd[1834]: time="2026-04-17T23:46:53.047630737Z" 
level=error msg="encountered an error cleaning up failed sandbox \"900076fec7103fe8baff240fe6ba81cd837129e30ccf09da414a570ede8274ae\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 17 23:46:53.047811 containerd[1834]: time="2026-04-17T23:46:53.047700138Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-568984f78-z9wh9,Uid:8636a294-eef0-499d-a578-9b4ce7de9cb5,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"900076fec7103fe8baff240fe6ba81cd837129e30ccf09da414a570ede8274ae\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 17 23:46:53.048060 kubelet[3392]: E0417 23:46:53.047932 3392 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"900076fec7103fe8baff240fe6ba81cd837129e30ccf09da414a570ede8274ae\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 17 23:46:53.048060 kubelet[3392]: E0417 23:46:53.047998 3392 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"900076fec7103fe8baff240fe6ba81cd837129e30ccf09da414a570ede8274ae\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-568984f78-z9wh9" Apr 17 23:46:53.048060 kubelet[3392]: E0417 23:46:53.048027 3392 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc 
error: code = Unknown desc = failed to setup network for sandbox \"900076fec7103fe8baff240fe6ba81cd837129e30ccf09da414a570ede8274ae\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-568984f78-z9wh9" Apr 17 23:46:53.048280 kubelet[3392]: E0417 23:46:53.048087 3392 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-568984f78-z9wh9_calico-system(8636a294-eef0-499d-a578-9b4ce7de9cb5)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-568984f78-z9wh9_calico-system(8636a294-eef0-499d-a578-9b4ce7de9cb5)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"900076fec7103fe8baff240fe6ba81cd837129e30ccf09da414a570ede8274ae\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-apiserver-568984f78-z9wh9" podUID="8636a294-eef0-499d-a578-9b4ce7de9cb5" Apr 17 23:46:53.070658 containerd[1834]: time="2026-04-17T23:46:53.070542022Z" level=error msg="Failed to destroy network for sandbox \"b4ee55b141d3f52465cbf786e12266afe8e7204bc4458d564ffdf3772af058ab\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 17 23:46:53.071379 containerd[1834]: time="2026-04-17T23:46:53.071110026Z" level=error msg="encountered an error cleaning up failed sandbox \"b4ee55b141d3f52465cbf786e12266afe8e7204bc4458d564ffdf3772af058ab\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has 
mounted /var/lib/calico/" Apr 17 23:46:53.071379 containerd[1834]: time="2026-04-17T23:46:53.071173127Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-5b85766d88-d6rv9,Uid:4c2d4922-a2ca-483d-b7db-77418f884573,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"b4ee55b141d3f52465cbf786e12266afe8e7204bc4458d564ffdf3772af058ab\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 17 23:46:53.072805 kubelet[3392]: E0417 23:46:53.071690 3392 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b4ee55b141d3f52465cbf786e12266afe8e7204bc4458d564ffdf3772af058ab\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 17 23:46:53.072805 kubelet[3392]: E0417 23:46:53.071754 3392 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b4ee55b141d3f52465cbf786e12266afe8e7204bc4458d564ffdf3772af058ab\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-5b85766d88-d6rv9" Apr 17 23:46:53.072805 kubelet[3392]: E0417 23:46:53.071782 3392 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b4ee55b141d3f52465cbf786e12266afe8e7204bc4458d564ffdf3772af058ab\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="calico-system/goldmane-5b85766d88-d6rv9" Apr 17 23:46:53.073064 kubelet[3392]: E0417 23:46:53.071858 3392 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-5b85766d88-d6rv9_calico-system(4c2d4922-a2ca-483d-b7db-77418f884573)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-5b85766d88-d6rv9_calico-system(4c2d4922-a2ca-483d-b7db-77418f884573)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"b4ee55b141d3f52465cbf786e12266afe8e7204bc4458d564ffdf3772af058ab\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-5b85766d88-d6rv9" podUID="4c2d4922-a2ca-483d-b7db-77418f884573" Apr 17 23:46:53.074631 containerd[1834]: time="2026-04-17T23:46:53.074349853Z" level=info msg="StartContainer for \"08aac514bf7f80b4095867869b4608810d89a76cb0fe897f0421c363f5d91ba3\" returns successfully" Apr 17 23:46:53.091681 kubelet[3392]: I0417 23:46:53.090425 3392 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="83deb6bfb5c72f63a8fcb193d3bee315ea0a6af1ebd6a9fb9ae8c9bed2ea9316" Apr 17 23:46:53.091785 containerd[1834]: time="2026-04-17T23:46:53.091217988Z" level=info msg="StopPodSandbox for \"83deb6bfb5c72f63a8fcb193d3bee315ea0a6af1ebd6a9fb9ae8c9bed2ea9316\"" Apr 17 23:46:53.091785 containerd[1834]: time="2026-04-17T23:46:53.091410890Z" level=info msg="Ensure that sandbox 83deb6bfb5c72f63a8fcb193d3bee315ea0a6af1ebd6a9fb9ae8c9bed2ea9316 in task-service has been cleanup successfully" Apr 17 23:46:53.093222 kubelet[3392]: I0417 23:46:53.093196 3392 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e8ac919b2d998feb57a6ee2afdd1dafcb9bc7e91ea70046104d604b330ba71f6" Apr 17 23:46:53.095363 containerd[1834]: time="2026-04-17T23:46:53.095338322Z" level=info 
msg="StopPodSandbox for \"e8ac919b2d998feb57a6ee2afdd1dafcb9bc7e91ea70046104d604b330ba71f6\"" Apr 17 23:46:53.095847 containerd[1834]: time="2026-04-17T23:46:53.095822826Z" level=info msg="Ensure that sandbox e8ac919b2d998feb57a6ee2afdd1dafcb9bc7e91ea70046104d604b330ba71f6 in task-service has been cleanup successfully" Apr 17 23:46:53.108354 kubelet[3392]: I0417 23:46:53.108328 3392 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b4ee55b141d3f52465cbf786e12266afe8e7204bc4458d564ffdf3772af058ab" Apr 17 23:46:53.115166 containerd[1834]: time="2026-04-17T23:46:53.115131781Z" level=info msg="StopPodSandbox for \"b4ee55b141d3f52465cbf786e12266afe8e7204bc4458d564ffdf3772af058ab\"" Apr 17 23:46:53.115535 containerd[1834]: time="2026-04-17T23:46:53.115323883Z" level=info msg="Ensure that sandbox b4ee55b141d3f52465cbf786e12266afe8e7204bc4458d564ffdf3772af058ab in task-service has been cleanup successfully" Apr 17 23:46:53.126937 kubelet[3392]: I0417 23:46:53.126897 3392 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="900076fec7103fe8baff240fe6ba81cd837129e30ccf09da414a570ede8274ae" Apr 17 23:46:53.137897 containerd[1834]: time="2026-04-17T23:46:53.137728863Z" level=info msg="StopPodSandbox for \"900076fec7103fe8baff240fe6ba81cd837129e30ccf09da414a570ede8274ae\"" Apr 17 23:46:53.138900 containerd[1834]: time="2026-04-17T23:46:53.138859972Z" level=info msg="Ensure that sandbox 900076fec7103fe8baff240fe6ba81cd837129e30ccf09da414a570ede8274ae in task-service has been cleanup successfully" Apr 17 23:46:53.142647 kubelet[3392]: I0417 23:46:53.141702 3392 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="61f8c77427241eebf6cf9dcfcca2b8b9819197e31f16bf933a0aa3dced9b2ad6" Apr 17 23:46:53.144085 containerd[1834]: time="2026-04-17T23:46:53.143715312Z" level=info msg="StopPodSandbox for \"61f8c77427241eebf6cf9dcfcca2b8b9819197e31f16bf933a0aa3dced9b2ad6\"" Apr 17 23:46:53.145371 
containerd[1834]: time="2026-04-17T23:46:53.144844621Z" level=info msg="Ensure that sandbox 61f8c77427241eebf6cf9dcfcca2b8b9819197e31f16bf933a0aa3dced9b2ad6 in task-service has been cleanup successfully" Apr 17 23:46:53.145743 kubelet[3392]: I0417 23:46:53.145721 3392 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ed9a494bb1acc5e5490c723caeef10f8b25979c86b59887629e09cba1fa1a6a3" Apr 17 23:46:53.147378 containerd[1834]: time="2026-04-17T23:46:53.147352741Z" level=info msg="StopPodSandbox for \"ed9a494bb1acc5e5490c723caeef10f8b25979c86b59887629e09cba1fa1a6a3\"" Apr 17 23:46:53.150697 containerd[1834]: time="2026-04-17T23:46:53.150352965Z" level=info msg="Ensure that sandbox ed9a494bb1acc5e5490c723caeef10f8b25979c86b59887629e09cba1fa1a6a3 in task-service has been cleanup successfully" Apr 17 23:46:53.153045 kubelet[3392]: I0417 23:46:53.153025 3392 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1302e55ca0e7dce85f13e4d756292413493ef50ee486bdc75287349d3425425b" Apr 17 23:46:53.154332 containerd[1834]: time="2026-04-17T23:46:53.154309597Z" level=info msg="StopPodSandbox for \"1302e55ca0e7dce85f13e4d756292413493ef50ee486bdc75287349d3425425b\"" Apr 17 23:46:53.158790 containerd[1834]: time="2026-04-17T23:46:53.158765233Z" level=info msg="Ensure that sandbox 1302e55ca0e7dce85f13e4d756292413493ef50ee486bdc75287349d3425425b in task-service has been cleanup successfully" Apr 17 23:46:53.208895 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-ed9a494bb1acc5e5490c723caeef10f8b25979c86b59887629e09cba1fa1a6a3-shm.mount: Deactivated successfully. Apr 17 23:46:53.209360 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-83deb6bfb5c72f63a8fcb193d3bee315ea0a6af1ebd6a9fb9ae8c9bed2ea9316-shm.mount: Deactivated successfully. 
Apr 17 23:46:53.332002 kubelet[3392]: I0417 23:46:53.331843 3392 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-n24kb" podStartSLOduration=18.00074257 podStartE2EDuration="50.331794627s" podCreationTimestamp="2026-04-17 23:46:03 +0000 UTC" firstStartedPulling="2026-04-17 23:46:05.384114972 +0000 UTC m=+19.110080914" lastFinishedPulling="2026-04-17 23:46:37.715167029 +0000 UTC m=+51.441132971" observedRunningTime="2026-04-17 23:46:53.137138659 +0000 UTC m=+66.863104601" watchObservedRunningTime="2026-04-17 23:46:53.331794627 +0000 UTC m=+67.057760669" Apr 17 23:46:53.373422 containerd[1834]: time="2026-04-17T23:46:53.372532556Z" level=error msg="StopPodSandbox for \"e8ac919b2d998feb57a6ee2afdd1dafcb9bc7e91ea70046104d604b330ba71f6\" failed" error="failed to destroy network for sandbox \"e8ac919b2d998feb57a6ee2afdd1dafcb9bc7e91ea70046104d604b330ba71f6\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 17 23:46:53.374171 kubelet[3392]: E0417 23:46:53.373229 3392 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"e8ac919b2d998feb57a6ee2afdd1dafcb9bc7e91ea70046104d604b330ba71f6\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="e8ac919b2d998feb57a6ee2afdd1dafcb9bc7e91ea70046104d604b330ba71f6" Apr 17 23:46:53.374171 kubelet[3392]: E0417 23:46:53.373294 3392 kuberuntime_manager.go:1586] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"e8ac919b2d998feb57a6ee2afdd1dafcb9bc7e91ea70046104d604b330ba71f6"} Apr 17 23:46:53.374171 kubelet[3392]: E0417 23:46:53.373337 3392 kuberuntime_manager.go:1161] "killPodWithSyncResult failed" err="failed to 
\"KillPodSandbox\" for \"96443d8b-6798-4ea2-b154-8b1611207513\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"e8ac919b2d998feb57a6ee2afdd1dafcb9bc7e91ea70046104d604b330ba71f6\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Apr 17 23:46:53.374171 kubelet[3392]: E0417 23:46:53.373368 3392 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"96443d8b-6798-4ea2-b154-8b1611207513\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"e8ac919b2d998feb57a6ee2afdd1dafcb9bc7e91ea70046104d604b330ba71f6\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-655df5d5f-wr76l" podUID="96443d8b-6798-4ea2-b154-8b1611207513" Apr 17 23:46:53.397036 containerd[1834]: time="2026-04-17T23:46:53.394619734Z" level=error msg="StopPodSandbox for \"83deb6bfb5c72f63a8fcb193d3bee315ea0a6af1ebd6a9fb9ae8c9bed2ea9316\" failed" error="failed to destroy network for sandbox \"83deb6bfb5c72f63a8fcb193d3bee315ea0a6af1ebd6a9fb9ae8c9bed2ea9316\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 17 23:46:53.397201 kubelet[3392]: E0417 23:46:53.395601 3392 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"83deb6bfb5c72f63a8fcb193d3bee315ea0a6af1ebd6a9fb9ae8c9bed2ea9316\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/" podSandboxID="83deb6bfb5c72f63a8fcb193d3bee315ea0a6af1ebd6a9fb9ae8c9bed2ea9316" Apr 17 23:46:53.397201 kubelet[3392]: E0417 23:46:53.395657 3392 kuberuntime_manager.go:1586] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"83deb6bfb5c72f63a8fcb193d3bee315ea0a6af1ebd6a9fb9ae8c9bed2ea9316"} Apr 17 23:46:53.397201 kubelet[3392]: E0417 23:46:53.395700 3392 kuberuntime_manager.go:1161] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"c6a1e0ff-d681-42ba-8cbb-c8e3f8849e1f\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"83deb6bfb5c72f63a8fcb193d3bee315ea0a6af1ebd6a9fb9ae8c9bed2ea9316\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Apr 17 23:46:53.397201 kubelet[3392]: E0417 23:46:53.395734 3392 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"c6a1e0ff-d681-42ba-8cbb-c8e3f8849e1f\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"83deb6bfb5c72f63a8fcb193d3bee315ea0a6af1ebd6a9fb9ae8c9bed2ea9316\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-7479bd7d89-zgl8b" podUID="c6a1e0ff-d681-42ba-8cbb-c8e3f8849e1f" Apr 17 23:46:53.797610 containerd[1834]: 2026-04-17 23:46:53.576 [INFO][4816] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="1302e55ca0e7dce85f13e4d756292413493ef50ee486bdc75287349d3425425b" Apr 17 23:46:53.797610 containerd[1834]: 2026-04-17 23:46:53.576 [INFO][4816] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. 
ContainerID="1302e55ca0e7dce85f13e4d756292413493ef50ee486bdc75287349d3425425b" iface="eth0" netns="/var/run/netns/cni-569eafba-3c89-8906-be09-7e9d59b0a894" Apr 17 23:46:53.797610 containerd[1834]: 2026-04-17 23:46:53.578 [INFO][4816] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="1302e55ca0e7dce85f13e4d756292413493ef50ee486bdc75287349d3425425b" iface="eth0" netns="/var/run/netns/cni-569eafba-3c89-8906-be09-7e9d59b0a894" Apr 17 23:46:53.797610 containerd[1834]: 2026-04-17 23:46:53.578 [INFO][4816] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="1302e55ca0e7dce85f13e4d756292413493ef50ee486bdc75287349d3425425b" iface="eth0" netns="/var/run/netns/cni-569eafba-3c89-8906-be09-7e9d59b0a894" Apr 17 23:46:53.797610 containerd[1834]: 2026-04-17 23:46:53.578 [INFO][4816] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="1302e55ca0e7dce85f13e4d756292413493ef50ee486bdc75287349d3425425b" Apr 17 23:46:53.797610 containerd[1834]: 2026-04-17 23:46:53.578 [INFO][4816] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="1302e55ca0e7dce85f13e4d756292413493ef50ee486bdc75287349d3425425b" Apr 17 23:46:53.797610 containerd[1834]: 2026-04-17 23:46:53.733 [INFO][4866] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="1302e55ca0e7dce85f13e4d756292413493ef50ee486bdc75287349d3425425b" HandleID="k8s-pod-network.1302e55ca0e7dce85f13e4d756292413493ef50ee486bdc75287349d3425425b" Workload="ci--4081.3.6--n--7b570e9a3c-k8s-coredns--674b8bbfcf--dq7vb-eth0" Apr 17 23:46:53.797610 containerd[1834]: 2026-04-17 23:46:53.733 [INFO][4866] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 17 23:46:53.797610 containerd[1834]: 2026-04-17 23:46:53.733 [INFO][4866] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Apr 17 23:46:53.797610 containerd[1834]: 2026-04-17 23:46:53.758 [WARNING][4866] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. Ignoring ContainerID="1302e55ca0e7dce85f13e4d756292413493ef50ee486bdc75287349d3425425b" HandleID="k8s-pod-network.1302e55ca0e7dce85f13e4d756292413493ef50ee486bdc75287349d3425425b" Workload="ci--4081.3.6--n--7b570e9a3c-k8s-coredns--674b8bbfcf--dq7vb-eth0" Apr 17 23:46:53.797610 containerd[1834]: 2026-04-17 23:46:53.758 [INFO][4866] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="1302e55ca0e7dce85f13e4d756292413493ef50ee486bdc75287349d3425425b" HandleID="k8s-pod-network.1302e55ca0e7dce85f13e4d756292413493ef50ee486bdc75287349d3425425b" Workload="ci--4081.3.6--n--7b570e9a3c-k8s-coredns--674b8bbfcf--dq7vb-eth0" Apr 17 23:46:53.797610 containerd[1834]: 2026-04-17 23:46:53.762 [INFO][4866] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 17 23:46:53.797610 containerd[1834]: 2026-04-17 23:46:53.786 [INFO][4816] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="1302e55ca0e7dce85f13e4d756292413493ef50ee486bdc75287349d3425425b" Apr 17 23:46:53.803149 containerd[1834]: time="2026-04-17T23:46:53.797832783Z" level=info msg="TearDown network for sandbox \"1302e55ca0e7dce85f13e4d756292413493ef50ee486bdc75287349d3425425b\" successfully" Apr 17 23:46:53.803149 containerd[1834]: 2026-04-17 23:46:53.569 [INFO][4814] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="61f8c77427241eebf6cf9dcfcca2b8b9819197e31f16bf933a0aa3dced9b2ad6" Apr 17 23:46:53.803149 containerd[1834]: 2026-04-17 23:46:53.569 [INFO][4814] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="61f8c77427241eebf6cf9dcfcca2b8b9819197e31f16bf933a0aa3dced9b2ad6" iface="eth0" netns="/var/run/netns/cni-2ac6b44c-036e-12ca-253d-11f3e27ca9f9" Apr 17 23:46:53.803149 containerd[1834]: 2026-04-17 23:46:53.572 [INFO][4814] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. 
ContainerID="61f8c77427241eebf6cf9dcfcca2b8b9819197e31f16bf933a0aa3dced9b2ad6" iface="eth0" netns="/var/run/netns/cni-2ac6b44c-036e-12ca-253d-11f3e27ca9f9" Apr 17 23:46:53.803149 containerd[1834]: 2026-04-17 23:46:53.572 [INFO][4814] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="61f8c77427241eebf6cf9dcfcca2b8b9819197e31f16bf933a0aa3dced9b2ad6" iface="eth0" netns="/var/run/netns/cni-2ac6b44c-036e-12ca-253d-11f3e27ca9f9" Apr 17 23:46:53.803149 containerd[1834]: 2026-04-17 23:46:53.572 [INFO][4814] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="61f8c77427241eebf6cf9dcfcca2b8b9819197e31f16bf933a0aa3dced9b2ad6" Apr 17 23:46:53.803149 containerd[1834]: 2026-04-17 23:46:53.572 [INFO][4814] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="61f8c77427241eebf6cf9dcfcca2b8b9819197e31f16bf933a0aa3dced9b2ad6" Apr 17 23:46:53.803149 containerd[1834]: 2026-04-17 23:46:53.750 [INFO][4859] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="61f8c77427241eebf6cf9dcfcca2b8b9819197e31f16bf933a0aa3dced9b2ad6" HandleID="k8s-pod-network.61f8c77427241eebf6cf9dcfcca2b8b9819197e31f16bf933a0aa3dced9b2ad6" Workload="ci--4081.3.6--n--7b570e9a3c-k8s-calico--apiserver--568984f78--qbl55-eth0" Apr 17 23:46:53.803149 containerd[1834]: 2026-04-17 23:46:53.752 [INFO][4859] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 17 23:46:53.803149 containerd[1834]: 2026-04-17 23:46:53.762 [INFO][4859] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 17 23:46:53.803149 containerd[1834]: 2026-04-17 23:46:53.778 [WARNING][4859] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="61f8c77427241eebf6cf9dcfcca2b8b9819197e31f16bf933a0aa3dced9b2ad6" HandleID="k8s-pod-network.61f8c77427241eebf6cf9dcfcca2b8b9819197e31f16bf933a0aa3dced9b2ad6" Workload="ci--4081.3.6--n--7b570e9a3c-k8s-calico--apiserver--568984f78--qbl55-eth0" Apr 17 23:46:53.803149 containerd[1834]: 2026-04-17 23:46:53.778 [INFO][4859] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="61f8c77427241eebf6cf9dcfcca2b8b9819197e31f16bf933a0aa3dced9b2ad6" HandleID="k8s-pod-network.61f8c77427241eebf6cf9dcfcca2b8b9819197e31f16bf933a0aa3dced9b2ad6" Workload="ci--4081.3.6--n--7b570e9a3c-k8s-calico--apiserver--568984f78--qbl55-eth0" Apr 17 23:46:53.803149 containerd[1834]: 2026-04-17 23:46:53.779 [INFO][4859] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 17 23:46:53.803149 containerd[1834]: 2026-04-17 23:46:53.789 [INFO][4814] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="61f8c77427241eebf6cf9dcfcca2b8b9819197e31f16bf933a0aa3dced9b2ad6" Apr 17 23:46:53.803149 containerd[1834]: time="2026-04-17T23:46:53.797864883Z" level=info msg="StopPodSandbox for \"1302e55ca0e7dce85f13e4d756292413493ef50ee486bdc75287349d3425425b\" returns successfully" Apr 17 23:46:53.803149 containerd[1834]: time="2026-04-17T23:46:53.798294787Z" level=info msg="TearDown network for sandbox \"61f8c77427241eebf6cf9dcfcca2b8b9819197e31f16bf933a0aa3dced9b2ad6\" successfully" Apr 17 23:46:53.803149 containerd[1834]: time="2026-04-17T23:46:53.798379887Z" level=info msg="StopPodSandbox for \"61f8c77427241eebf6cf9dcfcca2b8b9819197e31f16bf933a0aa3dced9b2ad6\" returns successfully" Apr 17 23:46:53.809580 systemd[1]: run-netns-cni\x2d2ac6b44c\x2d036e\x2d12ca\x2d253d\x2d11f3e27ca9f9.mount: Deactivated successfully. Apr 17 23:46:53.809774 systemd[1]: run-netns-cni\x2d569eafba\x2d3c89\x2d8906\x2dbe09\x2d7e9d59b0a894.mount: Deactivated successfully. 
Apr 17 23:46:53.815744 containerd[1834]: time="2026-04-17T23:46:53.815602326Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-dq7vb,Uid:4c94a196-84f1-4d7e-892d-1fd10da74241,Namespace:kube-system,Attempt:1,}" Apr 17 23:46:53.817227 containerd[1834]: time="2026-04-17T23:46:53.817195639Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-568984f78-qbl55,Uid:499d7392-3858-4796-8ae5-2602f95a1e6c,Namespace:calico-system,Attempt:1,}" Apr 17 23:46:53.823292 containerd[1834]: 2026-04-17 23:46:53.564 [INFO][4813] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="900076fec7103fe8baff240fe6ba81cd837129e30ccf09da414a570ede8274ae" Apr 17 23:46:53.823292 containerd[1834]: 2026-04-17 23:46:53.565 [INFO][4813] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="900076fec7103fe8baff240fe6ba81cd837129e30ccf09da414a570ede8274ae" iface="eth0" netns="/var/run/netns/cni-d1b55b94-aa62-f2f9-0343-863fafc4bc6f" Apr 17 23:46:53.823292 containerd[1834]: 2026-04-17 23:46:53.565 [INFO][4813] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="900076fec7103fe8baff240fe6ba81cd837129e30ccf09da414a570ede8274ae" iface="eth0" netns="/var/run/netns/cni-d1b55b94-aa62-f2f9-0343-863fafc4bc6f" Apr 17 23:46:53.823292 containerd[1834]: 2026-04-17 23:46:53.576 [INFO][4813] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="900076fec7103fe8baff240fe6ba81cd837129e30ccf09da414a570ede8274ae" iface="eth0" netns="/var/run/netns/cni-d1b55b94-aa62-f2f9-0343-863fafc4bc6f" Apr 17 23:46:53.823292 containerd[1834]: 2026-04-17 23:46:53.579 [INFO][4813] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="900076fec7103fe8baff240fe6ba81cd837129e30ccf09da414a570ede8274ae" Apr 17 23:46:53.823292 containerd[1834]: 2026-04-17 23:46:53.579 [INFO][4813] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="900076fec7103fe8baff240fe6ba81cd837129e30ccf09da414a570ede8274ae" Apr 17 23:46:53.823292 containerd[1834]: 2026-04-17 23:46:53.777 [INFO][4865] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="900076fec7103fe8baff240fe6ba81cd837129e30ccf09da414a570ede8274ae" HandleID="k8s-pod-network.900076fec7103fe8baff240fe6ba81cd837129e30ccf09da414a570ede8274ae" Workload="ci--4081.3.6--n--7b570e9a3c-k8s-calico--apiserver--568984f78--z9wh9-eth0" Apr 17 23:46:53.823292 containerd[1834]: 2026-04-17 23:46:53.781 [INFO][4865] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 17 23:46:53.823292 containerd[1834]: 2026-04-17 23:46:53.781 [INFO][4865] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 17 23:46:53.823292 containerd[1834]: 2026-04-17 23:46:53.810 [WARNING][4865] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="900076fec7103fe8baff240fe6ba81cd837129e30ccf09da414a570ede8274ae" HandleID="k8s-pod-network.900076fec7103fe8baff240fe6ba81cd837129e30ccf09da414a570ede8274ae" Workload="ci--4081.3.6--n--7b570e9a3c-k8s-calico--apiserver--568984f78--z9wh9-eth0" Apr 17 23:46:53.823292 containerd[1834]: 2026-04-17 23:46:53.810 [INFO][4865] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="900076fec7103fe8baff240fe6ba81cd837129e30ccf09da414a570ede8274ae" HandleID="k8s-pod-network.900076fec7103fe8baff240fe6ba81cd837129e30ccf09da414a570ede8274ae" Workload="ci--4081.3.6--n--7b570e9a3c-k8s-calico--apiserver--568984f78--z9wh9-eth0" Apr 17 23:46:53.823292 containerd[1834]: 2026-04-17 23:46:53.816 [INFO][4865] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 17 23:46:53.823292 containerd[1834]: 2026-04-17 23:46:53.819 [INFO][4813] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="900076fec7103fe8baff240fe6ba81cd837129e30ccf09da414a570ede8274ae" Apr 17 23:46:53.824041 containerd[1834]: time="2026-04-17T23:46:53.823402389Z" level=info msg="TearDown network for sandbox \"900076fec7103fe8baff240fe6ba81cd837129e30ccf09da414a570ede8274ae\" successfully" Apr 17 23:46:53.824041 containerd[1834]: time="2026-04-17T23:46:53.823431289Z" level=info msg="StopPodSandbox for \"900076fec7103fe8baff240fe6ba81cd837129e30ccf09da414a570ede8274ae\" returns successfully" Apr 17 23:46:53.824325 containerd[1834]: time="2026-04-17T23:46:53.824144795Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-568984f78-z9wh9,Uid:8636a294-eef0-499d-a578-9b4ce7de9cb5,Namespace:calico-system,Attempt:1,}" Apr 17 23:46:53.829554 systemd[1]: run-netns-cni\x2dd1b55b94\x2daa62\x2df2f9\x2d0343\x2d863fafc4bc6f.mount: Deactivated successfully. 
Apr 17 23:46:53.839673 containerd[1834]: 2026-04-17 23:46:53.602 [INFO][4794] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="b4ee55b141d3f52465cbf786e12266afe8e7204bc4458d564ffdf3772af058ab" Apr 17 23:46:53.839673 containerd[1834]: 2026-04-17 23:46:53.606 [INFO][4794] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="b4ee55b141d3f52465cbf786e12266afe8e7204bc4458d564ffdf3772af058ab" iface="eth0" netns="/var/run/netns/cni-07a93b3e-4f81-0806-22d8-6d25a8626167" Apr 17 23:46:53.839673 containerd[1834]: 2026-04-17 23:46:53.610 [INFO][4794] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="b4ee55b141d3f52465cbf786e12266afe8e7204bc4458d564ffdf3772af058ab" iface="eth0" netns="/var/run/netns/cni-07a93b3e-4f81-0806-22d8-6d25a8626167" Apr 17 23:46:53.839673 containerd[1834]: 2026-04-17 23:46:53.618 [INFO][4794] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="b4ee55b141d3f52465cbf786e12266afe8e7204bc4458d564ffdf3772af058ab" iface="eth0" netns="/var/run/netns/cni-07a93b3e-4f81-0806-22d8-6d25a8626167" Apr 17 23:46:53.839673 containerd[1834]: 2026-04-17 23:46:53.618 [INFO][4794] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="b4ee55b141d3f52465cbf786e12266afe8e7204bc4458d564ffdf3772af058ab" Apr 17 23:46:53.839673 containerd[1834]: 2026-04-17 23:46:53.618 [INFO][4794] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="b4ee55b141d3f52465cbf786e12266afe8e7204bc4458d564ffdf3772af058ab" Apr 17 23:46:53.839673 containerd[1834]: 2026-04-17 23:46:53.816 [INFO][4876] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="b4ee55b141d3f52465cbf786e12266afe8e7204bc4458d564ffdf3772af058ab" HandleID="k8s-pod-network.b4ee55b141d3f52465cbf786e12266afe8e7204bc4458d564ffdf3772af058ab" Workload="ci--4081.3.6--n--7b570e9a3c-k8s-goldmane--5b85766d88--d6rv9-eth0" Apr 17 23:46:53.839673 containerd[1834]: 2026-04-17 23:46:53.816 
[INFO][4876] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 17 23:46:53.839673 containerd[1834]: 2026-04-17 23:46:53.816 [INFO][4876] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 17 23:46:53.839673 containerd[1834]: 2026-04-17 23:46:53.832 [WARNING][4876] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. Ignoring ContainerID="b4ee55b141d3f52465cbf786e12266afe8e7204bc4458d564ffdf3772af058ab" HandleID="k8s-pod-network.b4ee55b141d3f52465cbf786e12266afe8e7204bc4458d564ffdf3772af058ab" Workload="ci--4081.3.6--n--7b570e9a3c-k8s-goldmane--5b85766d88--d6rv9-eth0" Apr 17 23:46:53.839673 containerd[1834]: 2026-04-17 23:46:53.832 [INFO][4876] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="b4ee55b141d3f52465cbf786e12266afe8e7204bc4458d564ffdf3772af058ab" HandleID="k8s-pod-network.b4ee55b141d3f52465cbf786e12266afe8e7204bc4458d564ffdf3772af058ab" Workload="ci--4081.3.6--n--7b570e9a3c-k8s-goldmane--5b85766d88--d6rv9-eth0" Apr 17 23:46:53.839673 containerd[1834]: 2026-04-17 23:46:53.834 [INFO][4876] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 17 23:46:53.839673 containerd[1834]: 2026-04-17 23:46:53.835 [INFO][4794] cni-plugin/k8s.go 665: Teardown processing complete. 
ContainerID="b4ee55b141d3f52465cbf786e12266afe8e7204bc4458d564ffdf3772af058ab" Apr 17 23:46:53.839673 containerd[1834]: time="2026-04-17T23:46:53.838897614Z" level=info msg="TearDown network for sandbox \"b4ee55b141d3f52465cbf786e12266afe8e7204bc4458d564ffdf3772af058ab\" successfully" Apr 17 23:46:53.839673 containerd[1834]: time="2026-04-17T23:46:53.838947014Z" level=info msg="StopPodSandbox for \"b4ee55b141d3f52465cbf786e12266afe8e7204bc4458d564ffdf3772af058ab\" returns successfully" Apr 17 23:46:53.849766 containerd[1834]: time="2026-04-17T23:46:53.847307982Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-5b85766d88-d6rv9,Uid:4c2d4922-a2ca-483d-b7db-77418f884573,Namespace:calico-system,Attempt:1,}" Apr 17 23:46:53.850564 systemd[1]: run-netns-cni\x2d07a93b3e\x2d4f81\x2d0806\x2d22d8\x2d6d25a8626167.mount: Deactivated successfully. Apr 17 23:46:53.859374 containerd[1834]: 2026-04-17 23:46:53.611 [INFO][4812] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="ed9a494bb1acc5e5490c723caeef10f8b25979c86b59887629e09cba1fa1a6a3" Apr 17 23:46:53.859374 containerd[1834]: 2026-04-17 23:46:53.611 [INFO][4812] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="ed9a494bb1acc5e5490c723caeef10f8b25979c86b59887629e09cba1fa1a6a3" iface="eth0" netns="/var/run/netns/cni-f3cb7185-4561-a46a-c8f0-35af0faf38ce" Apr 17 23:46:53.859374 containerd[1834]: 2026-04-17 23:46:53.613 [INFO][4812] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="ed9a494bb1acc5e5490c723caeef10f8b25979c86b59887629e09cba1fa1a6a3" iface="eth0" netns="/var/run/netns/cni-f3cb7185-4561-a46a-c8f0-35af0faf38ce" Apr 17 23:46:53.859374 containerd[1834]: 2026-04-17 23:46:53.613 [INFO][4812] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="ed9a494bb1acc5e5490c723caeef10f8b25979c86b59887629e09cba1fa1a6a3" iface="eth0" netns="/var/run/netns/cni-f3cb7185-4561-a46a-c8f0-35af0faf38ce" Apr 17 23:46:53.859374 containerd[1834]: 2026-04-17 23:46:53.613 [INFO][4812] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="ed9a494bb1acc5e5490c723caeef10f8b25979c86b59887629e09cba1fa1a6a3" Apr 17 23:46:53.859374 containerd[1834]: 2026-04-17 23:46:53.613 [INFO][4812] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="ed9a494bb1acc5e5490c723caeef10f8b25979c86b59887629e09cba1fa1a6a3" Apr 17 23:46:53.859374 containerd[1834]: 2026-04-17 23:46:53.818 [INFO][4874] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="ed9a494bb1acc5e5490c723caeef10f8b25979c86b59887629e09cba1fa1a6a3" HandleID="k8s-pod-network.ed9a494bb1acc5e5490c723caeef10f8b25979c86b59887629e09cba1fa1a6a3" Workload="ci--4081.3.6--n--7b570e9a3c-k8s-coredns--674b8bbfcf--n7zwr-eth0" Apr 17 23:46:53.859374 containerd[1834]: 2026-04-17 23:46:53.820 [INFO][4874] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 17 23:46:53.859374 containerd[1834]: 2026-04-17 23:46:53.834 [INFO][4874] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 17 23:46:53.859374 containerd[1834]: 2026-04-17 23:46:53.853 [WARNING][4874] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="ed9a494bb1acc5e5490c723caeef10f8b25979c86b59887629e09cba1fa1a6a3" HandleID="k8s-pod-network.ed9a494bb1acc5e5490c723caeef10f8b25979c86b59887629e09cba1fa1a6a3" Workload="ci--4081.3.6--n--7b570e9a3c-k8s-coredns--674b8bbfcf--n7zwr-eth0" Apr 17 23:46:53.859374 containerd[1834]: 2026-04-17 23:46:53.854 [INFO][4874] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="ed9a494bb1acc5e5490c723caeef10f8b25979c86b59887629e09cba1fa1a6a3" HandleID="k8s-pod-network.ed9a494bb1acc5e5490c723caeef10f8b25979c86b59887629e09cba1fa1a6a3" Workload="ci--4081.3.6--n--7b570e9a3c-k8s-coredns--674b8bbfcf--n7zwr-eth0" Apr 17 23:46:53.859374 containerd[1834]: 2026-04-17 23:46:53.856 [INFO][4874] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 17 23:46:53.859374 containerd[1834]: 2026-04-17 23:46:53.857 [INFO][4812] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="ed9a494bb1acc5e5490c723caeef10f8b25979c86b59887629e09cba1fa1a6a3" Apr 17 23:46:53.859916 containerd[1834]: time="2026-04-17T23:46:53.859423279Z" level=info msg="TearDown network for sandbox \"ed9a494bb1acc5e5490c723caeef10f8b25979c86b59887629e09cba1fa1a6a3\" successfully" Apr 17 23:46:53.859916 containerd[1834]: time="2026-04-17T23:46:53.859450380Z" level=info msg="StopPodSandbox for \"ed9a494bb1acc5e5490c723caeef10f8b25979c86b59887629e09cba1fa1a6a3\" returns successfully" Apr 17 23:46:53.861812 containerd[1834]: time="2026-04-17T23:46:53.861778098Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-n7zwr,Uid:0607b5b7-e84f-4371-964e-db63a77e29d1,Namespace:kube-system,Attempt:1,}" Apr 17 23:46:54.160410 containerd[1834]: time="2026-04-17T23:46:54.159565398Z" level=info msg="StopPodSandbox for \"e8ac919b2d998feb57a6ee2afdd1dafcb9bc7e91ea70046104d604b330ba71f6\"" Apr 17 23:46:54.229059 systemd[1]: run-netns-cni\x2df3cb7185\x2d4561\x2da46a\x2dc8f0\x2d35af0faf38ce.mount: Deactivated successfully. 
Apr 17 23:46:54.299852 systemd-networkd[1395]: caliadaf23cd2dd: Link UP Apr 17 23:46:54.300727 systemd-networkd[1395]: caliadaf23cd2dd: Gained carrier Apr 17 23:46:54.341765 containerd[1834]: 2026-04-17 23:46:53.974 [ERROR][4906] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Apr 17 23:46:54.341765 containerd[1834]: 2026-04-17 23:46:53.996 [INFO][4906] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081.3.6--n--7b570e9a3c-k8s-calico--apiserver--568984f78--qbl55-eth0 calico-apiserver-568984f78- calico-system 499d7392-3858-4796-8ae5-2602f95a1e6c 990 0 2026-04-17 23:46:02 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:568984f78 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4081.3.6-n-7b570e9a3c calico-apiserver-568984f78-qbl55 eth0 calico-apiserver [] [] [kns.calico-system ksa.calico-system.calico-apiserver] caliadaf23cd2dd [] [] }} ContainerID="afdcccff0658b909b964d5e78e62e613398a84a1c78e31e6f222064f25c410e3" Namespace="calico-system" Pod="calico-apiserver-568984f78-qbl55" WorkloadEndpoint="ci--4081.3.6--n--7b570e9a3c-k8s-calico--apiserver--568984f78--qbl55-" Apr 17 23:46:54.341765 containerd[1834]: 2026-04-17 23:46:53.996 [INFO][4906] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="afdcccff0658b909b964d5e78e62e613398a84a1c78e31e6f222064f25c410e3" Namespace="calico-system" Pod="calico-apiserver-568984f78-qbl55" WorkloadEndpoint="ci--4081.3.6--n--7b570e9a3c-k8s-calico--apiserver--568984f78--qbl55-eth0" Apr 17 23:46:54.341765 containerd[1834]: 2026-04-17 23:46:54.121 [INFO][4948] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 
ContainerID="afdcccff0658b909b964d5e78e62e613398a84a1c78e31e6f222064f25c410e3" HandleID="k8s-pod-network.afdcccff0658b909b964d5e78e62e613398a84a1c78e31e6f222064f25c410e3" Workload="ci--4081.3.6--n--7b570e9a3c-k8s-calico--apiserver--568984f78--qbl55-eth0" Apr 17 23:46:54.341765 containerd[1834]: 2026-04-17 23:46:54.152 [INFO][4948] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="afdcccff0658b909b964d5e78e62e613398a84a1c78e31e6f222064f25c410e3" HandleID="k8s-pod-network.afdcccff0658b909b964d5e78e62e613398a84a1c78e31e6f222064f25c410e3" Workload="ci--4081.3.6--n--7b570e9a3c-k8s-calico--apiserver--568984f78--qbl55-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003989d0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4081.3.6-n-7b570e9a3c", "pod":"calico-apiserver-568984f78-qbl55", "timestamp":"2026-04-17 23:46:54.121405291 +0000 UTC"}, Hostname:"ci-4081.3.6-n-7b570e9a3c", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc0002c0000)} Apr 17 23:46:54.341765 containerd[1834]: 2026-04-17 23:46:54.153 [INFO][4948] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 17 23:46:54.341765 containerd[1834]: 2026-04-17 23:46:54.153 [INFO][4948] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Apr 17 23:46:54.341765 containerd[1834]: 2026-04-17 23:46:54.153 [INFO][4948] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081.3.6-n-7b570e9a3c' Apr 17 23:46:54.341765 containerd[1834]: 2026-04-17 23:46:54.172 [INFO][4948] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.afdcccff0658b909b964d5e78e62e613398a84a1c78e31e6f222064f25c410e3" host="ci-4081.3.6-n-7b570e9a3c" Apr 17 23:46:54.341765 containerd[1834]: 2026-04-17 23:46:54.180 [INFO][4948] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4081.3.6-n-7b570e9a3c" Apr 17 23:46:54.341765 containerd[1834]: 2026-04-17 23:46:54.189 [INFO][4948] ipam/ipam.go 526: Trying affinity for 192.168.71.0/26 host="ci-4081.3.6-n-7b570e9a3c" Apr 17 23:46:54.341765 containerd[1834]: 2026-04-17 23:46:54.194 [INFO][4948] ipam/ipam.go 160: Attempting to load block cidr=192.168.71.0/26 host="ci-4081.3.6-n-7b570e9a3c" Apr 17 23:46:54.341765 containerd[1834]: 2026-04-17 23:46:54.205 [INFO][4948] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.71.0/26 host="ci-4081.3.6-n-7b570e9a3c" Apr 17 23:46:54.341765 containerd[1834]: 2026-04-17 23:46:54.205 [INFO][4948] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.71.0/26 handle="k8s-pod-network.afdcccff0658b909b964d5e78e62e613398a84a1c78e31e6f222064f25c410e3" host="ci-4081.3.6-n-7b570e9a3c" Apr 17 23:46:54.341765 containerd[1834]: 2026-04-17 23:46:54.210 [INFO][4948] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.afdcccff0658b909b964d5e78e62e613398a84a1c78e31e6f222064f25c410e3 Apr 17 23:46:54.341765 containerd[1834]: 2026-04-17 23:46:54.219 [INFO][4948] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.71.0/26 handle="k8s-pod-network.afdcccff0658b909b964d5e78e62e613398a84a1c78e31e6f222064f25c410e3" host="ci-4081.3.6-n-7b570e9a3c" Apr 17 23:46:54.341765 containerd[1834]: 2026-04-17 23:46:54.227 [INFO][4948] ipam/ipam.go 1288: Successfully 
claimed IPs: [192.168.71.1/26] block=192.168.71.0/26 handle="k8s-pod-network.afdcccff0658b909b964d5e78e62e613398a84a1c78e31e6f222064f25c410e3" host="ci-4081.3.6-n-7b570e9a3c" Apr 17 23:46:54.341765 containerd[1834]: 2026-04-17 23:46:54.227 [INFO][4948] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.71.1/26] handle="k8s-pod-network.afdcccff0658b909b964d5e78e62e613398a84a1c78e31e6f222064f25c410e3" host="ci-4081.3.6-n-7b570e9a3c" Apr 17 23:46:54.341765 containerd[1834]: 2026-04-17 23:46:54.228 [INFO][4948] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 17 23:46:54.341765 containerd[1834]: 2026-04-17 23:46:54.228 [INFO][4948] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.71.1/26] IPv6=[] ContainerID="afdcccff0658b909b964d5e78e62e613398a84a1c78e31e6f222064f25c410e3" HandleID="k8s-pod-network.afdcccff0658b909b964d5e78e62e613398a84a1c78e31e6f222064f25c410e3" Workload="ci--4081.3.6--n--7b570e9a3c-k8s-calico--apiserver--568984f78--qbl55-eth0" Apr 17 23:46:54.342749 containerd[1834]: 2026-04-17 23:46:54.242 [INFO][4906] cni-plugin/k8s.go 418: Populated endpoint ContainerID="afdcccff0658b909b964d5e78e62e613398a84a1c78e31e6f222064f25c410e3" Namespace="calico-system" Pod="calico-apiserver-568984f78-qbl55" WorkloadEndpoint="ci--4081.3.6--n--7b570e9a3c-k8s-calico--apiserver--568984f78--qbl55-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081.3.6--n--7b570e9a3c-k8s-calico--apiserver--568984f78--qbl55-eth0", GenerateName:"calico-apiserver-568984f78-", Namespace:"calico-system", SelfLink:"", UID:"499d7392-3858-4796-8ae5-2602f95a1e6c", ResourceVersion:"990", Generation:0, CreationTimestamp:time.Date(2026, time.April, 17, 23, 46, 2, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", 
"pod-template-hash":"568984f78", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081.3.6-n-7b570e9a3c", ContainerID:"", Pod:"calico-apiserver-568984f78-qbl55", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.71.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"caliadaf23cd2dd", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 17 23:46:54.342749 containerd[1834]: 2026-04-17 23:46:54.242 [INFO][4906] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.71.1/32] ContainerID="afdcccff0658b909b964d5e78e62e613398a84a1c78e31e6f222064f25c410e3" Namespace="calico-system" Pod="calico-apiserver-568984f78-qbl55" WorkloadEndpoint="ci--4081.3.6--n--7b570e9a3c-k8s-calico--apiserver--568984f78--qbl55-eth0" Apr 17 23:46:54.342749 containerd[1834]: 2026-04-17 23:46:54.243 [INFO][4906] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to caliadaf23cd2dd ContainerID="afdcccff0658b909b964d5e78e62e613398a84a1c78e31e6f222064f25c410e3" Namespace="calico-system" Pod="calico-apiserver-568984f78-qbl55" WorkloadEndpoint="ci--4081.3.6--n--7b570e9a3c-k8s-calico--apiserver--568984f78--qbl55-eth0" Apr 17 23:46:54.342749 containerd[1834]: 2026-04-17 23:46:54.301 [INFO][4906] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="afdcccff0658b909b964d5e78e62e613398a84a1c78e31e6f222064f25c410e3" Namespace="calico-system" Pod="calico-apiserver-568984f78-qbl55" 
WorkloadEndpoint="ci--4081.3.6--n--7b570e9a3c-k8s-calico--apiserver--568984f78--qbl55-eth0" Apr 17 23:46:54.342749 containerd[1834]: 2026-04-17 23:46:54.303 [INFO][4906] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="afdcccff0658b909b964d5e78e62e613398a84a1c78e31e6f222064f25c410e3" Namespace="calico-system" Pod="calico-apiserver-568984f78-qbl55" WorkloadEndpoint="ci--4081.3.6--n--7b570e9a3c-k8s-calico--apiserver--568984f78--qbl55-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081.3.6--n--7b570e9a3c-k8s-calico--apiserver--568984f78--qbl55-eth0", GenerateName:"calico-apiserver-568984f78-", Namespace:"calico-system", SelfLink:"", UID:"499d7392-3858-4796-8ae5-2602f95a1e6c", ResourceVersion:"990", Generation:0, CreationTimestamp:time.Date(2026, time.April, 17, 23, 46, 2, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"568984f78", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081.3.6-n-7b570e9a3c", ContainerID:"afdcccff0658b909b964d5e78e62e613398a84a1c78e31e6f222064f25c410e3", Pod:"calico-apiserver-568984f78-qbl55", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.71.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"caliadaf23cd2dd", MAC:"0e:be:f3:3b:24:38", Ports:[]v3.WorkloadEndpointPort(nil), 
AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 17 23:46:54.342749 containerd[1834]: 2026-04-17 23:46:54.339 [INFO][4906] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="afdcccff0658b909b964d5e78e62e613398a84a1c78e31e6f222064f25c410e3" Namespace="calico-system" Pod="calico-apiserver-568984f78-qbl55" WorkloadEndpoint="ci--4081.3.6--n--7b570e9a3c-k8s-calico--apiserver--568984f78--qbl55-eth0" Apr 17 23:46:54.377971 systemd-networkd[1395]: cali1bf13c3a279: Link UP Apr 17 23:46:54.380012 systemd-networkd[1395]: cali1bf13c3a279: Gained carrier Apr 17 23:46:54.425392 containerd[1834]: 2026-04-17 23:46:54.058 [ERROR][4926] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Apr 17 23:46:54.425392 containerd[1834]: 2026-04-17 23:46:54.083 [INFO][4926] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081.3.6--n--7b570e9a3c-k8s-calico--apiserver--568984f78--z9wh9-eth0 calico-apiserver-568984f78- calico-system 8636a294-eef0-499d-a578-9b4ce7de9cb5 988 0 2026-04-17 23:46:02 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:568984f78 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4081.3.6-n-7b570e9a3c calico-apiserver-568984f78-z9wh9 eth0 calico-apiserver [] [] [kns.calico-system ksa.calico-system.calico-apiserver] cali1bf13c3a279 [] [] }} ContainerID="20aacf146b6ce3011594b26d87e34caa39855eb963db2434494df845601c587e" Namespace="calico-system" Pod="calico-apiserver-568984f78-z9wh9" WorkloadEndpoint="ci--4081.3.6--n--7b570e9a3c-k8s-calico--apiserver--568984f78--z9wh9-" Apr 17 23:46:54.425392 containerd[1834]: 2026-04-17 23:46:54.083 [INFO][4926] cni-plugin/k8s.go 74: 
Extracted identifiers for CmdAddK8s ContainerID="20aacf146b6ce3011594b26d87e34caa39855eb963db2434494df845601c587e" Namespace="calico-system" Pod="calico-apiserver-568984f78-z9wh9" WorkloadEndpoint="ci--4081.3.6--n--7b570e9a3c-k8s-calico--apiserver--568984f78--z9wh9-eth0" Apr 17 23:46:54.425392 containerd[1834]: 2026-04-17 23:46:54.142 [INFO][4970] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="20aacf146b6ce3011594b26d87e34caa39855eb963db2434494df845601c587e" HandleID="k8s-pod-network.20aacf146b6ce3011594b26d87e34caa39855eb963db2434494df845601c587e" Workload="ci--4081.3.6--n--7b570e9a3c-k8s-calico--apiserver--568984f78--z9wh9-eth0" Apr 17 23:46:54.425392 containerd[1834]: 2026-04-17 23:46:54.162 [INFO][4970] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="20aacf146b6ce3011594b26d87e34caa39855eb963db2434494df845601c587e" HandleID="k8s-pod-network.20aacf146b6ce3011594b26d87e34caa39855eb963db2434494df845601c587e" Workload="ci--4081.3.6--n--7b570e9a3c-k8s-calico--apiserver--568984f78--z9wh9-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002fb390), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4081.3.6-n-7b570e9a3c", "pod":"calico-apiserver-568984f78-z9wh9", "timestamp":"2026-04-17 23:46:54.142553361 +0000 UTC"}, Hostname:"ci-4081.3.6-n-7b570e9a3c", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc0002caf20)} Apr 17 23:46:54.425392 containerd[1834]: 2026-04-17 23:46:54.162 [INFO][4970] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 17 23:46:54.425392 containerd[1834]: 2026-04-17 23:46:54.232 [INFO][4970] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Apr 17 23:46:54.425392 containerd[1834]: 2026-04-17 23:46:54.232 [INFO][4970] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081.3.6-n-7b570e9a3c' Apr 17 23:46:54.425392 containerd[1834]: 2026-04-17 23:46:54.266 [INFO][4970] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.20aacf146b6ce3011594b26d87e34caa39855eb963db2434494df845601c587e" host="ci-4081.3.6-n-7b570e9a3c" Apr 17 23:46:54.425392 containerd[1834]: 2026-04-17 23:46:54.278 [INFO][4970] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4081.3.6-n-7b570e9a3c" Apr 17 23:46:54.425392 containerd[1834]: 2026-04-17 23:46:54.286 [INFO][4970] ipam/ipam.go 526: Trying affinity for 192.168.71.0/26 host="ci-4081.3.6-n-7b570e9a3c" Apr 17 23:46:54.425392 containerd[1834]: 2026-04-17 23:46:54.297 [INFO][4970] ipam/ipam.go 160: Attempting to load block cidr=192.168.71.0/26 host="ci-4081.3.6-n-7b570e9a3c" Apr 17 23:46:54.425392 containerd[1834]: 2026-04-17 23:46:54.304 [INFO][4970] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.71.0/26 host="ci-4081.3.6-n-7b570e9a3c" Apr 17 23:46:54.425392 containerd[1834]: 2026-04-17 23:46:54.304 [INFO][4970] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.71.0/26 handle="k8s-pod-network.20aacf146b6ce3011594b26d87e34caa39855eb963db2434494df845601c587e" host="ci-4081.3.6-n-7b570e9a3c" Apr 17 23:46:54.425392 containerd[1834]: 2026-04-17 23:46:54.313 [INFO][4970] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.20aacf146b6ce3011594b26d87e34caa39855eb963db2434494df845601c587e Apr 17 23:46:54.425392 containerd[1834]: 2026-04-17 23:46:54.337 [INFO][4970] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.71.0/26 handle="k8s-pod-network.20aacf146b6ce3011594b26d87e34caa39855eb963db2434494df845601c587e" host="ci-4081.3.6-n-7b570e9a3c" Apr 17 23:46:54.425392 containerd[1834]: 2026-04-17 23:46:54.358 [INFO][4970] ipam/ipam.go 1288: Successfully 
claimed IPs: [192.168.71.2/26] block=192.168.71.0/26 handle="k8s-pod-network.20aacf146b6ce3011594b26d87e34caa39855eb963db2434494df845601c587e" host="ci-4081.3.6-n-7b570e9a3c" Apr 17 23:46:54.425392 containerd[1834]: 2026-04-17 23:46:54.358 [INFO][4970] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.71.2/26] handle="k8s-pod-network.20aacf146b6ce3011594b26d87e34caa39855eb963db2434494df845601c587e" host="ci-4081.3.6-n-7b570e9a3c" Apr 17 23:46:54.425392 containerd[1834]: 2026-04-17 23:46:54.358 [INFO][4970] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 17 23:46:54.425392 containerd[1834]: 2026-04-17 23:46:54.358 [INFO][4970] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.71.2/26] IPv6=[] ContainerID="20aacf146b6ce3011594b26d87e34caa39855eb963db2434494df845601c587e" HandleID="k8s-pod-network.20aacf146b6ce3011594b26d87e34caa39855eb963db2434494df845601c587e" Workload="ci--4081.3.6--n--7b570e9a3c-k8s-calico--apiserver--568984f78--z9wh9-eth0" Apr 17 23:46:54.430381 containerd[1834]: 2026-04-17 23:46:54.368 [INFO][4926] cni-plugin/k8s.go 418: Populated endpoint ContainerID="20aacf146b6ce3011594b26d87e34caa39855eb963db2434494df845601c587e" Namespace="calico-system" Pod="calico-apiserver-568984f78-z9wh9" WorkloadEndpoint="ci--4081.3.6--n--7b570e9a3c-k8s-calico--apiserver--568984f78--z9wh9-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081.3.6--n--7b570e9a3c-k8s-calico--apiserver--568984f78--z9wh9-eth0", GenerateName:"calico-apiserver-568984f78-", Namespace:"calico-system", SelfLink:"", UID:"8636a294-eef0-499d-a578-9b4ce7de9cb5", ResourceVersion:"988", Generation:0, CreationTimestamp:time.Date(2026, time.April, 17, 23, 46, 2, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", 
"pod-template-hash":"568984f78", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081.3.6-n-7b570e9a3c", ContainerID:"", Pod:"calico-apiserver-568984f78-z9wh9", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.71.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali1bf13c3a279", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 17 23:46:54.430381 containerd[1834]: 2026-04-17 23:46:54.368 [INFO][4926] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.71.2/32] ContainerID="20aacf146b6ce3011594b26d87e34caa39855eb963db2434494df845601c587e" Namespace="calico-system" Pod="calico-apiserver-568984f78-z9wh9" WorkloadEndpoint="ci--4081.3.6--n--7b570e9a3c-k8s-calico--apiserver--568984f78--z9wh9-eth0" Apr 17 23:46:54.430381 containerd[1834]: 2026-04-17 23:46:54.368 [INFO][4926] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali1bf13c3a279 ContainerID="20aacf146b6ce3011594b26d87e34caa39855eb963db2434494df845601c587e" Namespace="calico-system" Pod="calico-apiserver-568984f78-z9wh9" WorkloadEndpoint="ci--4081.3.6--n--7b570e9a3c-k8s-calico--apiserver--568984f78--z9wh9-eth0" Apr 17 23:46:54.430381 containerd[1834]: 2026-04-17 23:46:54.382 [INFO][4926] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="20aacf146b6ce3011594b26d87e34caa39855eb963db2434494df845601c587e" Namespace="calico-system" Pod="calico-apiserver-568984f78-z9wh9" 
WorkloadEndpoint="ci--4081.3.6--n--7b570e9a3c-k8s-calico--apiserver--568984f78--z9wh9-eth0" Apr 17 23:46:54.430381 containerd[1834]: 2026-04-17 23:46:54.386 [INFO][4926] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="20aacf146b6ce3011594b26d87e34caa39855eb963db2434494df845601c587e" Namespace="calico-system" Pod="calico-apiserver-568984f78-z9wh9" WorkloadEndpoint="ci--4081.3.6--n--7b570e9a3c-k8s-calico--apiserver--568984f78--z9wh9-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081.3.6--n--7b570e9a3c-k8s-calico--apiserver--568984f78--z9wh9-eth0", GenerateName:"calico-apiserver-568984f78-", Namespace:"calico-system", SelfLink:"", UID:"8636a294-eef0-499d-a578-9b4ce7de9cb5", ResourceVersion:"988", Generation:0, CreationTimestamp:time.Date(2026, time.April, 17, 23, 46, 2, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"568984f78", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081.3.6-n-7b570e9a3c", ContainerID:"20aacf146b6ce3011594b26d87e34caa39855eb963db2434494df845601c587e", Pod:"calico-apiserver-568984f78-z9wh9", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.71.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali1bf13c3a279", MAC:"3e:0c:a4:37:8b:f5", Ports:[]v3.WorkloadEndpointPort(nil), 
AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 17 23:46:54.430381 containerd[1834]: 2026-04-17 23:46:54.414 [INFO][4926] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="20aacf146b6ce3011594b26d87e34caa39855eb963db2434494df845601c587e" Namespace="calico-system" Pod="calico-apiserver-568984f78-z9wh9" WorkloadEndpoint="ci--4081.3.6--n--7b570e9a3c-k8s-calico--apiserver--568984f78--z9wh9-eth0" Apr 17 23:46:54.452694 containerd[1834]: time="2026-04-17T23:46:54.452277657Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Apr 17 23:46:54.452694 containerd[1834]: time="2026-04-17T23:46:54.452374958Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Apr 17 23:46:54.452694 containerd[1834]: time="2026-04-17T23:46:54.452391558Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 17 23:46:54.452694 containerd[1834]: time="2026-04-17T23:46:54.452568760Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 17 23:46:54.513392 systemd-networkd[1395]: cali8d2ca757d64: Link UP Apr 17 23:46:54.516972 systemd-networkd[1395]: cali8d2ca757d64: Gained carrier Apr 17 23:46:54.549128 containerd[1834]: time="2026-04-17T23:46:54.548632734Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Apr 17 23:46:54.549128 containerd[1834]: time="2026-04-17T23:46:54.548701434Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Apr 17 23:46:54.549128 containerd[1834]: time="2026-04-17T23:46:54.548722034Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 17 23:46:54.550113 containerd[1834]: 2026-04-17 23:46:54.022 [ERROR][4918] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Apr 17 23:46:54.550113 containerd[1834]: 2026-04-17 23:46:54.049 [INFO][4918] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081.3.6--n--7b570e9a3c-k8s-coredns--674b8bbfcf--dq7vb-eth0 coredns-674b8bbfcf- kube-system 4c94a196-84f1-4d7e-892d-1fd10da74241 989 0 2026-04-17 23:45:51 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4081.3.6-n-7b570e9a3c coredns-674b8bbfcf-dq7vb eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali8d2ca757d64 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="9fb573d97ef657a73498b30439a7b9ce040564d8f4ab028a90397514be41f62d" Namespace="kube-system" Pod="coredns-674b8bbfcf-dq7vb" WorkloadEndpoint="ci--4081.3.6--n--7b570e9a3c-k8s-coredns--674b8bbfcf--dq7vb-" Apr 17 23:46:54.550113 containerd[1834]: 2026-04-17 23:46:54.050 [INFO][4918] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="9fb573d97ef657a73498b30439a7b9ce040564d8f4ab028a90397514be41f62d" Namespace="kube-system" Pod="coredns-674b8bbfcf-dq7vb" WorkloadEndpoint="ci--4081.3.6--n--7b570e9a3c-k8s-coredns--674b8bbfcf--dq7vb-eth0" Apr 17 23:46:54.550113 containerd[1834]: 2026-04-17 23:46:54.176 [INFO][4965] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 
ContainerID="9fb573d97ef657a73498b30439a7b9ce040564d8f4ab028a90397514be41f62d" HandleID="k8s-pod-network.9fb573d97ef657a73498b30439a7b9ce040564d8f4ab028a90397514be41f62d" Workload="ci--4081.3.6--n--7b570e9a3c-k8s-coredns--674b8bbfcf--dq7vb-eth0" Apr 17 23:46:54.550113 containerd[1834]: 2026-04-17 23:46:54.191 [INFO][4965] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="9fb573d97ef657a73498b30439a7b9ce040564d8f4ab028a90397514be41f62d" HandleID="k8s-pod-network.9fb573d97ef657a73498b30439a7b9ce040564d8f4ab028a90397514be41f62d" Workload="ci--4081.3.6--n--7b570e9a3c-k8s-coredns--674b8bbfcf--dq7vb-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000123f70), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4081.3.6-n-7b570e9a3c", "pod":"coredns-674b8bbfcf-dq7vb", "timestamp":"2026-04-17 23:46:54.176128132 +0000 UTC"}, Hostname:"ci-4081.3.6-n-7b570e9a3c", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc0000f0580)} Apr 17 23:46:54.550113 containerd[1834]: 2026-04-17 23:46:54.191 [INFO][4965] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 17 23:46:54.550113 containerd[1834]: 2026-04-17 23:46:54.359 [INFO][4965] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Apr 17 23:46:54.550113 containerd[1834]: 2026-04-17 23:46:54.359 [INFO][4965] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081.3.6-n-7b570e9a3c' Apr 17 23:46:54.550113 containerd[1834]: 2026-04-17 23:46:54.367 [INFO][4965] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.9fb573d97ef657a73498b30439a7b9ce040564d8f4ab028a90397514be41f62d" host="ci-4081.3.6-n-7b570e9a3c" Apr 17 23:46:54.550113 containerd[1834]: 2026-04-17 23:46:54.383 [INFO][4965] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4081.3.6-n-7b570e9a3c" Apr 17 23:46:54.550113 containerd[1834]: 2026-04-17 23:46:54.417 [INFO][4965] ipam/ipam.go 526: Trying affinity for 192.168.71.0/26 host="ci-4081.3.6-n-7b570e9a3c" Apr 17 23:46:54.550113 containerd[1834]: 2026-04-17 23:46:54.420 [INFO][4965] ipam/ipam.go 160: Attempting to load block cidr=192.168.71.0/26 host="ci-4081.3.6-n-7b570e9a3c" Apr 17 23:46:54.550113 containerd[1834]: 2026-04-17 23:46:54.425 [INFO][4965] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.71.0/26 host="ci-4081.3.6-n-7b570e9a3c" Apr 17 23:46:54.550113 containerd[1834]: 2026-04-17 23:46:54.425 [INFO][4965] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.71.0/26 handle="k8s-pod-network.9fb573d97ef657a73498b30439a7b9ce040564d8f4ab028a90397514be41f62d" host="ci-4081.3.6-n-7b570e9a3c" Apr 17 23:46:54.550113 containerd[1834]: 2026-04-17 23:46:54.427 [INFO][4965] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.9fb573d97ef657a73498b30439a7b9ce040564d8f4ab028a90397514be41f62d Apr 17 23:46:54.550113 containerd[1834]: 2026-04-17 23:46:54.441 [INFO][4965] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.71.0/26 handle="k8s-pod-network.9fb573d97ef657a73498b30439a7b9ce040564d8f4ab028a90397514be41f62d" host="ci-4081.3.6-n-7b570e9a3c" Apr 17 23:46:54.550113 containerd[1834]: 2026-04-17 23:46:54.458 [INFO][4965] ipam/ipam.go 1288: Successfully 
claimed IPs: [192.168.71.3/26] block=192.168.71.0/26 handle="k8s-pod-network.9fb573d97ef657a73498b30439a7b9ce040564d8f4ab028a90397514be41f62d" host="ci-4081.3.6-n-7b570e9a3c" Apr 17 23:46:54.550113 containerd[1834]: 2026-04-17 23:46:54.458 [INFO][4965] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.71.3/26] handle="k8s-pod-network.9fb573d97ef657a73498b30439a7b9ce040564d8f4ab028a90397514be41f62d" host="ci-4081.3.6-n-7b570e9a3c" Apr 17 23:46:54.550113 containerd[1834]: 2026-04-17 23:46:54.459 [INFO][4965] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 17 23:46:54.550113 containerd[1834]: 2026-04-17 23:46:54.460 [INFO][4965] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.71.3/26] IPv6=[] ContainerID="9fb573d97ef657a73498b30439a7b9ce040564d8f4ab028a90397514be41f62d" HandleID="k8s-pod-network.9fb573d97ef657a73498b30439a7b9ce040564d8f4ab028a90397514be41f62d" Workload="ci--4081.3.6--n--7b570e9a3c-k8s-coredns--674b8bbfcf--dq7vb-eth0" Apr 17 23:46:54.551055 containerd[1834]: 2026-04-17 23:46:54.486 [INFO][4918] cni-plugin/k8s.go 418: Populated endpoint ContainerID="9fb573d97ef657a73498b30439a7b9ce040564d8f4ab028a90397514be41f62d" Namespace="kube-system" Pod="coredns-674b8bbfcf-dq7vb" WorkloadEndpoint="ci--4081.3.6--n--7b570e9a3c-k8s-coredns--674b8bbfcf--dq7vb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081.3.6--n--7b570e9a3c-k8s-coredns--674b8bbfcf--dq7vb-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"4c94a196-84f1-4d7e-892d-1fd10da74241", ResourceVersion:"989", Generation:0, CreationTimestamp:time.Date(2026, time.April, 17, 23, 45, 51, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", 
"projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081.3.6-n-7b570e9a3c", ContainerID:"", Pod:"coredns-674b8bbfcf-dq7vb", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.71.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali8d2ca757d64", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 17 23:46:54.551055 containerd[1834]: 2026-04-17 23:46:54.486 [INFO][4918] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.71.3/32] ContainerID="9fb573d97ef657a73498b30439a7b9ce040564d8f4ab028a90397514be41f62d" Namespace="kube-system" Pod="coredns-674b8bbfcf-dq7vb" WorkloadEndpoint="ci--4081.3.6--n--7b570e9a3c-k8s-coredns--674b8bbfcf--dq7vb-eth0" Apr 17 23:46:54.551055 containerd[1834]: 2026-04-17 23:46:54.486 [INFO][4918] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali8d2ca757d64 ContainerID="9fb573d97ef657a73498b30439a7b9ce040564d8f4ab028a90397514be41f62d" Namespace="kube-system" Pod="coredns-674b8bbfcf-dq7vb" WorkloadEndpoint="ci--4081.3.6--n--7b570e9a3c-k8s-coredns--674b8bbfcf--dq7vb-eth0" Apr 17 23:46:54.551055 containerd[1834]: 2026-04-17 23:46:54.523 [INFO][4918] cni-plugin/dataplane_linux.go 508: Disabling IPv4 
forwarding ContainerID="9fb573d97ef657a73498b30439a7b9ce040564d8f4ab028a90397514be41f62d" Namespace="kube-system" Pod="coredns-674b8bbfcf-dq7vb" WorkloadEndpoint="ci--4081.3.6--n--7b570e9a3c-k8s-coredns--674b8bbfcf--dq7vb-eth0" Apr 17 23:46:54.551055 containerd[1834]: 2026-04-17 23:46:54.527 [INFO][4918] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="9fb573d97ef657a73498b30439a7b9ce040564d8f4ab028a90397514be41f62d" Namespace="kube-system" Pod="coredns-674b8bbfcf-dq7vb" WorkloadEndpoint="ci--4081.3.6--n--7b570e9a3c-k8s-coredns--674b8bbfcf--dq7vb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081.3.6--n--7b570e9a3c-k8s-coredns--674b8bbfcf--dq7vb-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"4c94a196-84f1-4d7e-892d-1fd10da74241", ResourceVersion:"989", Generation:0, CreationTimestamp:time.Date(2026, time.April, 17, 23, 45, 51, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081.3.6-n-7b570e9a3c", ContainerID:"9fb573d97ef657a73498b30439a7b9ce040564d8f4ab028a90397514be41f62d", Pod:"coredns-674b8bbfcf-dq7vb", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.71.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali8d2ca757d64", MAC:"32:e7:41:af:29:a5", 
Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 17 23:46:54.551055 containerd[1834]: 2026-04-17 23:46:54.546 [INFO][4918] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="9fb573d97ef657a73498b30439a7b9ce040564d8f4ab028a90397514be41f62d" Namespace="kube-system" Pod="coredns-674b8bbfcf-dq7vb" WorkloadEndpoint="ci--4081.3.6--n--7b570e9a3c-k8s-coredns--674b8bbfcf--dq7vb-eth0" Apr 17 23:46:54.552542 containerd[1834]: time="2026-04-17T23:46:54.548829635Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 17 23:46:54.614193 systemd-networkd[1395]: cali5288e1165cd: Link UP Apr 17 23:46:54.617355 systemd-networkd[1395]: cali5288e1165cd: Gained carrier Apr 17 23:46:54.637390 containerd[1834]: time="2026-04-17T23:46:54.635538734Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Apr 17 23:46:54.637390 containerd[1834]: time="2026-04-17T23:46:54.635706435Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Apr 17 23:46:54.637390 containerd[1834]: time="2026-04-17T23:46:54.635757736Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 17 23:46:54.637390 containerd[1834]: time="2026-04-17T23:46:54.637100147Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 17 23:46:54.659156 containerd[1834]: 2026-04-17 23:46:54.121 [ERROR][4935] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Apr 17 23:46:54.659156 containerd[1834]: 2026-04-17 23:46:54.169 [INFO][4935] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081.3.6--n--7b570e9a3c-k8s-goldmane--5b85766d88--d6rv9-eth0 goldmane-5b85766d88- calico-system 4c2d4922-a2ca-483d-b7db-77418f884573 992 0 2026-04-17 23:46:02 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:5b85766d88 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4081.3.6-n-7b570e9a3c goldmane-5b85766d88-d6rv9 eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali5288e1165cd [] [] }} ContainerID="361aac0d20bd0c295b71a8fbe3a13d98119670eb3c6182c0ca6e8ea61511338d" Namespace="calico-system" Pod="goldmane-5b85766d88-d6rv9" WorkloadEndpoint="ci--4081.3.6--n--7b570e9a3c-k8s-goldmane--5b85766d88--d6rv9-" Apr 17 23:46:54.659156 containerd[1834]: 2026-04-17 23:46:54.169 [INFO][4935] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="361aac0d20bd0c295b71a8fbe3a13d98119670eb3c6182c0ca6e8ea61511338d" Namespace="calico-system" Pod="goldmane-5b85766d88-d6rv9" WorkloadEndpoint="ci--4081.3.6--n--7b570e9a3c-k8s-goldmane--5b85766d88--d6rv9-eth0" Apr 17 23:46:54.659156 containerd[1834]: 2026-04-17 23:46:54.251 [INFO][4997] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 
ContainerID="361aac0d20bd0c295b71a8fbe3a13d98119670eb3c6182c0ca6e8ea61511338d" HandleID="k8s-pod-network.361aac0d20bd0c295b71a8fbe3a13d98119670eb3c6182c0ca6e8ea61511338d" Workload="ci--4081.3.6--n--7b570e9a3c-k8s-goldmane--5b85766d88--d6rv9-eth0" Apr 17 23:46:54.659156 containerd[1834]: 2026-04-17 23:46:54.266 [INFO][4997] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="361aac0d20bd0c295b71a8fbe3a13d98119670eb3c6182c0ca6e8ea61511338d" HandleID="k8s-pod-network.361aac0d20bd0c295b71a8fbe3a13d98119670eb3c6182c0ca6e8ea61511338d" Workload="ci--4081.3.6--n--7b570e9a3c-k8s-goldmane--5b85766d88--d6rv9-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002fd840), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4081.3.6-n-7b570e9a3c", "pod":"goldmane-5b85766d88-d6rv9", "timestamp":"2026-04-17 23:46:54.251387038 +0000 UTC"}, Hostname:"ci-4081.3.6-n-7b570e9a3c", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc0004371e0)} Apr 17 23:46:54.659156 containerd[1834]: 2026-04-17 23:46:54.266 [INFO][4997] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 17 23:46:54.659156 containerd[1834]: 2026-04-17 23:46:54.458 [INFO][4997] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Apr 17 23:46:54.659156 containerd[1834]: 2026-04-17 23:46:54.458 [INFO][4997] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081.3.6-n-7b570e9a3c' Apr 17 23:46:54.659156 containerd[1834]: 2026-04-17 23:46:54.472 [INFO][4997] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.361aac0d20bd0c295b71a8fbe3a13d98119670eb3c6182c0ca6e8ea61511338d" host="ci-4081.3.6-n-7b570e9a3c" Apr 17 23:46:54.659156 containerd[1834]: 2026-04-17 23:46:54.500 [INFO][4997] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4081.3.6-n-7b570e9a3c" Apr 17 23:46:54.659156 containerd[1834]: 2026-04-17 23:46:54.528 [INFO][4997] ipam/ipam.go 526: Trying affinity for 192.168.71.0/26 host="ci-4081.3.6-n-7b570e9a3c" Apr 17 23:46:54.659156 containerd[1834]: 2026-04-17 23:46:54.537 [INFO][4997] ipam/ipam.go 160: Attempting to load block cidr=192.168.71.0/26 host="ci-4081.3.6-n-7b570e9a3c" Apr 17 23:46:54.659156 containerd[1834]: 2026-04-17 23:46:54.546 [INFO][4997] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.71.0/26 host="ci-4081.3.6-n-7b570e9a3c" Apr 17 23:46:54.659156 containerd[1834]: 2026-04-17 23:46:54.547 [INFO][4997] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.71.0/26 handle="k8s-pod-network.361aac0d20bd0c295b71a8fbe3a13d98119670eb3c6182c0ca6e8ea61511338d" host="ci-4081.3.6-n-7b570e9a3c" Apr 17 23:46:54.659156 containerd[1834]: 2026-04-17 23:46:54.551 [INFO][4997] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.361aac0d20bd0c295b71a8fbe3a13d98119670eb3c6182c0ca6e8ea61511338d Apr 17 23:46:54.659156 containerd[1834]: 2026-04-17 23:46:54.560 [INFO][4997] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.71.0/26 handle="k8s-pod-network.361aac0d20bd0c295b71a8fbe3a13d98119670eb3c6182c0ca6e8ea61511338d" host="ci-4081.3.6-n-7b570e9a3c" Apr 17 23:46:54.659156 containerd[1834]: 2026-04-17 23:46:54.583 [INFO][4997] ipam/ipam.go 1288: Successfully 
claimed IPs: [192.168.71.4/26] block=192.168.71.0/26 handle="k8s-pod-network.361aac0d20bd0c295b71a8fbe3a13d98119670eb3c6182c0ca6e8ea61511338d" host="ci-4081.3.6-n-7b570e9a3c" Apr 17 23:46:54.659156 containerd[1834]: 2026-04-17 23:46:54.583 [INFO][4997] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.71.4/26] handle="k8s-pod-network.361aac0d20bd0c295b71a8fbe3a13d98119670eb3c6182c0ca6e8ea61511338d" host="ci-4081.3.6-n-7b570e9a3c" Apr 17 23:46:54.659156 containerd[1834]: 2026-04-17 23:46:54.583 [INFO][4997] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 17 23:46:54.659156 containerd[1834]: 2026-04-17 23:46:54.583 [INFO][4997] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.71.4/26] IPv6=[] ContainerID="361aac0d20bd0c295b71a8fbe3a13d98119670eb3c6182c0ca6e8ea61511338d" HandleID="k8s-pod-network.361aac0d20bd0c295b71a8fbe3a13d98119670eb3c6182c0ca6e8ea61511338d" Workload="ci--4081.3.6--n--7b570e9a3c-k8s-goldmane--5b85766d88--d6rv9-eth0" Apr 17 23:46:54.660734 containerd[1834]: 2026-04-17 23:46:54.599 [INFO][4935] cni-plugin/k8s.go 418: Populated endpoint ContainerID="361aac0d20bd0c295b71a8fbe3a13d98119670eb3c6182c0ca6e8ea61511338d" Namespace="calico-system" Pod="goldmane-5b85766d88-d6rv9" WorkloadEndpoint="ci--4081.3.6--n--7b570e9a3c-k8s-goldmane--5b85766d88--d6rv9-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081.3.6--n--7b570e9a3c-k8s-goldmane--5b85766d88--d6rv9-eth0", GenerateName:"goldmane-5b85766d88-", Namespace:"calico-system", SelfLink:"", UID:"4c2d4922-a2ca-483d-b7db-77418f884573", ResourceVersion:"992", Generation:0, CreationTimestamp:time.Date(2026, time.April, 17, 23, 46, 2, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"5b85766d88", 
"projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081.3.6-n-7b570e9a3c", ContainerID:"", Pod:"goldmane-5b85766d88-d6rv9", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.71.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali5288e1165cd", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 17 23:46:54.660734 containerd[1834]: 2026-04-17 23:46:54.599 [INFO][4935] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.71.4/32] ContainerID="361aac0d20bd0c295b71a8fbe3a13d98119670eb3c6182c0ca6e8ea61511338d" Namespace="calico-system" Pod="goldmane-5b85766d88-d6rv9" WorkloadEndpoint="ci--4081.3.6--n--7b570e9a3c-k8s-goldmane--5b85766d88--d6rv9-eth0" Apr 17 23:46:54.660734 containerd[1834]: 2026-04-17 23:46:54.599 [INFO][4935] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali5288e1165cd ContainerID="361aac0d20bd0c295b71a8fbe3a13d98119670eb3c6182c0ca6e8ea61511338d" Namespace="calico-system" Pod="goldmane-5b85766d88-d6rv9" WorkloadEndpoint="ci--4081.3.6--n--7b570e9a3c-k8s-goldmane--5b85766d88--d6rv9-eth0" Apr 17 23:46:54.660734 containerd[1834]: 2026-04-17 23:46:54.620 [INFO][4935] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="361aac0d20bd0c295b71a8fbe3a13d98119670eb3c6182c0ca6e8ea61511338d" Namespace="calico-system" Pod="goldmane-5b85766d88-d6rv9" WorkloadEndpoint="ci--4081.3.6--n--7b570e9a3c-k8s-goldmane--5b85766d88--d6rv9-eth0" Apr 17 23:46:54.660734 containerd[1834]: 2026-04-17 23:46:54.628 [INFO][4935] 
cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="361aac0d20bd0c295b71a8fbe3a13d98119670eb3c6182c0ca6e8ea61511338d" Namespace="calico-system" Pod="goldmane-5b85766d88-d6rv9" WorkloadEndpoint="ci--4081.3.6--n--7b570e9a3c-k8s-goldmane--5b85766d88--d6rv9-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081.3.6--n--7b570e9a3c-k8s-goldmane--5b85766d88--d6rv9-eth0", GenerateName:"goldmane-5b85766d88-", Namespace:"calico-system", SelfLink:"", UID:"4c2d4922-a2ca-483d-b7db-77418f884573", ResourceVersion:"992", Generation:0, CreationTimestamp:time.Date(2026, time.April, 17, 23, 46, 2, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"5b85766d88", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081.3.6-n-7b570e9a3c", ContainerID:"361aac0d20bd0c295b71a8fbe3a13d98119670eb3c6182c0ca6e8ea61511338d", Pod:"goldmane-5b85766d88-d6rv9", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.71.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali5288e1165cd", MAC:"ea:19:84:cd:bc:55", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 17 23:46:54.660734 containerd[1834]: 2026-04-17 23:46:54.656 [INFO][4935] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="361aac0d20bd0c295b71a8fbe3a13d98119670eb3c6182c0ca6e8ea61511338d" Namespace="calico-system" Pod="goldmane-5b85766d88-d6rv9" WorkloadEndpoint="ci--4081.3.6--n--7b570e9a3c-k8s-goldmane--5b85766d88--d6rv9-eth0" Apr 17 23:46:54.674809 systemd-networkd[1395]: cali3e07900ecbf: Link UP Apr 17 23:46:54.675832 systemd-networkd[1395]: cali3e07900ecbf: Gained carrier Apr 17 23:46:54.689953 containerd[1834]: 2026-04-17 23:46:54.360 [INFO][5006] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="e8ac919b2d998feb57a6ee2afdd1dafcb9bc7e91ea70046104d604b330ba71f6" Apr 17 23:46:54.689953 containerd[1834]: 2026-04-17 23:46:54.360 [INFO][5006] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="e8ac919b2d998feb57a6ee2afdd1dafcb9bc7e91ea70046104d604b330ba71f6" iface="eth0" netns="/var/run/netns/cni-5d20a5f7-2c39-2e84-e631-cd547d94b8f7" Apr 17 23:46:54.689953 containerd[1834]: 2026-04-17 23:46:54.361 [INFO][5006] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="e8ac919b2d998feb57a6ee2afdd1dafcb9bc7e91ea70046104d604b330ba71f6" iface="eth0" netns="/var/run/netns/cni-5d20a5f7-2c39-2e84-e631-cd547d94b8f7" Apr 17 23:46:54.689953 containerd[1834]: 2026-04-17 23:46:54.361 [INFO][5006] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="e8ac919b2d998feb57a6ee2afdd1dafcb9bc7e91ea70046104d604b330ba71f6" iface="eth0" netns="/var/run/netns/cni-5d20a5f7-2c39-2e84-e631-cd547d94b8f7" Apr 17 23:46:54.689953 containerd[1834]: 2026-04-17 23:46:54.361 [INFO][5006] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="e8ac919b2d998feb57a6ee2afdd1dafcb9bc7e91ea70046104d604b330ba71f6" Apr 17 23:46:54.689953 containerd[1834]: 2026-04-17 23:46:54.362 [INFO][5006] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="e8ac919b2d998feb57a6ee2afdd1dafcb9bc7e91ea70046104d604b330ba71f6" Apr 17 23:46:54.689953 containerd[1834]: 2026-04-17 23:46:54.574 [INFO][5049] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="e8ac919b2d998feb57a6ee2afdd1dafcb9bc7e91ea70046104d604b330ba71f6" HandleID="k8s-pod-network.e8ac919b2d998feb57a6ee2afdd1dafcb9bc7e91ea70046104d604b330ba71f6" Workload="ci--4081.3.6--n--7b570e9a3c-k8s-whisker--655df5d5f--wr76l-eth0" Apr 17 23:46:54.689953 containerd[1834]: 2026-04-17 23:46:54.579 [INFO][5049] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 17 23:46:54.689953 containerd[1834]: 2026-04-17 23:46:54.648 [INFO][5049] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 17 23:46:54.689953 containerd[1834]: 2026-04-17 23:46:54.673 [WARNING][5049] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="e8ac919b2d998feb57a6ee2afdd1dafcb9bc7e91ea70046104d604b330ba71f6" HandleID="k8s-pod-network.e8ac919b2d998feb57a6ee2afdd1dafcb9bc7e91ea70046104d604b330ba71f6" Workload="ci--4081.3.6--n--7b570e9a3c-k8s-whisker--655df5d5f--wr76l-eth0" Apr 17 23:46:54.689953 containerd[1834]: 2026-04-17 23:46:54.673 [INFO][5049] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="e8ac919b2d998feb57a6ee2afdd1dafcb9bc7e91ea70046104d604b330ba71f6" HandleID="k8s-pod-network.e8ac919b2d998feb57a6ee2afdd1dafcb9bc7e91ea70046104d604b330ba71f6" Workload="ci--4081.3.6--n--7b570e9a3c-k8s-whisker--655df5d5f--wr76l-eth0" Apr 17 23:46:54.689953 containerd[1834]: 2026-04-17 23:46:54.681 [INFO][5049] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 17 23:46:54.689953 containerd[1834]: 2026-04-17 23:46:54.686 [INFO][5006] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="e8ac919b2d998feb57a6ee2afdd1dafcb9bc7e91ea70046104d604b330ba71f6" Apr 17 23:46:54.691030 containerd[1834]: time="2026-04-17T23:46:54.690764979Z" level=info msg="TearDown network for sandbox \"e8ac919b2d998feb57a6ee2afdd1dafcb9bc7e91ea70046104d604b330ba71f6\" successfully" Apr 17 23:46:54.691030 containerd[1834]: time="2026-04-17T23:46:54.690913580Z" level=info msg="StopPodSandbox for \"e8ac919b2d998feb57a6ee2afdd1dafcb9bc7e91ea70046104d604b330ba71f6\" returns successfully" Apr 17 23:46:54.702010 containerd[1834]: time="2026-04-17T23:46:54.701147263Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-568984f78-qbl55,Uid:499d7392-3858-4796-8ae5-2602f95a1e6c,Namespace:calico-system,Attempt:1,} returns sandbox id \"afdcccff0658b909b964d5e78e62e613398a84a1c78e31e6f222064f25c410e3\"" Apr 17 23:46:54.723932 containerd[1834]: 2026-04-17 23:46:54.118 [ERROR][4949] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Apr 17 
23:46:54.723932 containerd[1834]: 2026-04-17 23:46:54.160 [INFO][4949] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081.3.6--n--7b570e9a3c-k8s-coredns--674b8bbfcf--n7zwr-eth0 coredns-674b8bbfcf- kube-system 0607b5b7-e84f-4371-964e-db63a77e29d1 991 0 2026-04-17 23:45:51 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4081.3.6-n-7b570e9a3c coredns-674b8bbfcf-n7zwr eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali3e07900ecbf [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="f7502ed510a935dd175170b6e61a722ab8b65dd9969607662e3ed0d3b9c43650" Namespace="kube-system" Pod="coredns-674b8bbfcf-n7zwr" WorkloadEndpoint="ci--4081.3.6--n--7b570e9a3c-k8s-coredns--674b8bbfcf--n7zwr-" Apr 17 23:46:54.723932 containerd[1834]: 2026-04-17 23:46:54.160 [INFO][4949] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="f7502ed510a935dd175170b6e61a722ab8b65dd9969607662e3ed0d3b9c43650" Namespace="kube-system" Pod="coredns-674b8bbfcf-n7zwr" WorkloadEndpoint="ci--4081.3.6--n--7b570e9a3c-k8s-coredns--674b8bbfcf--n7zwr-eth0" Apr 17 23:46:54.723932 containerd[1834]: 2026-04-17 23:46:54.272 [INFO][4988] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="f7502ed510a935dd175170b6e61a722ab8b65dd9969607662e3ed0d3b9c43650" HandleID="k8s-pod-network.f7502ed510a935dd175170b6e61a722ab8b65dd9969607662e3ed0d3b9c43650" Workload="ci--4081.3.6--n--7b570e9a3c-k8s-coredns--674b8bbfcf--n7zwr-eth0" Apr 17 23:46:54.723932 containerd[1834]: 2026-04-17 23:46:54.334 [INFO][4988] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="f7502ed510a935dd175170b6e61a722ab8b65dd9969607662e3ed0d3b9c43650" HandleID="k8s-pod-network.f7502ed510a935dd175170b6e61a722ab8b65dd9969607662e3ed0d3b9c43650" 
Workload="ci--4081.3.6--n--7b570e9a3c-k8s-coredns--674b8bbfcf--n7zwr-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000427b20), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4081.3.6-n-7b570e9a3c", "pod":"coredns-674b8bbfcf-n7zwr", "timestamp":"2026-04-17 23:46:54.272448508 +0000 UTC"}, Hostname:"ci-4081.3.6-n-7b570e9a3c", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc00052bb80)} Apr 17 23:46:54.723932 containerd[1834]: 2026-04-17 23:46:54.334 [INFO][4988] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 17 23:46:54.723932 containerd[1834]: 2026-04-17 23:46:54.583 [INFO][4988] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 17 23:46:54.723932 containerd[1834]: 2026-04-17 23:46:54.584 [INFO][4988] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081.3.6-n-7b570e9a3c' Apr 17 23:46:54.723932 containerd[1834]: 2026-04-17 23:46:54.590 [INFO][4988] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.f7502ed510a935dd175170b6e61a722ab8b65dd9969607662e3ed0d3b9c43650" host="ci-4081.3.6-n-7b570e9a3c" Apr 17 23:46:54.723932 containerd[1834]: 2026-04-17 23:46:54.602 [INFO][4988] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4081.3.6-n-7b570e9a3c" Apr 17 23:46:54.723932 containerd[1834]: 2026-04-17 23:46:54.609 [INFO][4988] ipam/ipam.go 526: Trying affinity for 192.168.71.0/26 host="ci-4081.3.6-n-7b570e9a3c" Apr 17 23:46:54.723932 containerd[1834]: 2026-04-17 23:46:54.611 [INFO][4988] ipam/ipam.go 160: Attempting to load block cidr=192.168.71.0/26 host="ci-4081.3.6-n-7b570e9a3c" Apr 17 23:46:54.723932 containerd[1834]: 2026-04-17 23:46:54.614 [INFO][4988] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.71.0/26 
host="ci-4081.3.6-n-7b570e9a3c" Apr 17 23:46:54.723932 containerd[1834]: 2026-04-17 23:46:54.614 [INFO][4988] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.71.0/26 handle="k8s-pod-network.f7502ed510a935dd175170b6e61a722ab8b65dd9969607662e3ed0d3b9c43650" host="ci-4081.3.6-n-7b570e9a3c" Apr 17 23:46:54.723932 containerd[1834]: 2026-04-17 23:46:54.616 [INFO][4988] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.f7502ed510a935dd175170b6e61a722ab8b65dd9969607662e3ed0d3b9c43650 Apr 17 23:46:54.723932 containerd[1834]: 2026-04-17 23:46:54.627 [INFO][4988] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.71.0/26 handle="k8s-pod-network.f7502ed510a935dd175170b6e61a722ab8b65dd9969607662e3ed0d3b9c43650" host="ci-4081.3.6-n-7b570e9a3c" Apr 17 23:46:54.723932 containerd[1834]: 2026-04-17 23:46:54.647 [INFO][4988] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.71.5/26] block=192.168.71.0/26 handle="k8s-pod-network.f7502ed510a935dd175170b6e61a722ab8b65dd9969607662e3ed0d3b9c43650" host="ci-4081.3.6-n-7b570e9a3c" Apr 17 23:46:54.723932 containerd[1834]: 2026-04-17 23:46:54.648 [INFO][4988] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.71.5/26] handle="k8s-pod-network.f7502ed510a935dd175170b6e61a722ab8b65dd9969607662e3ed0d3b9c43650" host="ci-4081.3.6-n-7b570e9a3c" Apr 17 23:46:54.723932 containerd[1834]: 2026-04-17 23:46:54.648 [INFO][4988] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. 
Apr 17 23:46:54.723932 containerd[1834]: 2026-04-17 23:46:54.649 [INFO][4988] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.71.5/26] IPv6=[] ContainerID="f7502ed510a935dd175170b6e61a722ab8b65dd9969607662e3ed0d3b9c43650" HandleID="k8s-pod-network.f7502ed510a935dd175170b6e61a722ab8b65dd9969607662e3ed0d3b9c43650" Workload="ci--4081.3.6--n--7b570e9a3c-k8s-coredns--674b8bbfcf--n7zwr-eth0" Apr 17 23:46:54.726388 containerd[1834]: 2026-04-17 23:46:54.660 [INFO][4949] cni-plugin/k8s.go 418: Populated endpoint ContainerID="f7502ed510a935dd175170b6e61a722ab8b65dd9969607662e3ed0d3b9c43650" Namespace="kube-system" Pod="coredns-674b8bbfcf-n7zwr" WorkloadEndpoint="ci--4081.3.6--n--7b570e9a3c-k8s-coredns--674b8bbfcf--n7zwr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081.3.6--n--7b570e9a3c-k8s-coredns--674b8bbfcf--n7zwr-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"0607b5b7-e84f-4371-964e-db63a77e29d1", ResourceVersion:"991", Generation:0, CreationTimestamp:time.Date(2026, time.April, 17, 23, 45, 51, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081.3.6-n-7b570e9a3c", ContainerID:"", Pod:"coredns-674b8bbfcf-n7zwr", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.71.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, 
InterfaceName:"cali3e07900ecbf", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 17 23:46:54.726388 containerd[1834]: 2026-04-17 23:46:54.660 [INFO][4949] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.71.5/32] ContainerID="f7502ed510a935dd175170b6e61a722ab8b65dd9969607662e3ed0d3b9c43650" Namespace="kube-system" Pod="coredns-674b8bbfcf-n7zwr" WorkloadEndpoint="ci--4081.3.6--n--7b570e9a3c-k8s-coredns--674b8bbfcf--n7zwr-eth0" Apr 17 23:46:54.726388 containerd[1834]: 2026-04-17 23:46:54.660 [INFO][4949] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali3e07900ecbf ContainerID="f7502ed510a935dd175170b6e61a722ab8b65dd9969607662e3ed0d3b9c43650" Namespace="kube-system" Pod="coredns-674b8bbfcf-n7zwr" WorkloadEndpoint="ci--4081.3.6--n--7b570e9a3c-k8s-coredns--674b8bbfcf--n7zwr-eth0" Apr 17 23:46:54.726388 containerd[1834]: 2026-04-17 23:46:54.674 [INFO][4949] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="f7502ed510a935dd175170b6e61a722ab8b65dd9969607662e3ed0d3b9c43650" Namespace="kube-system" Pod="coredns-674b8bbfcf-n7zwr" WorkloadEndpoint="ci--4081.3.6--n--7b570e9a3c-k8s-coredns--674b8bbfcf--n7zwr-eth0" Apr 17 23:46:54.726388 containerd[1834]: 2026-04-17 23:46:54.675 [INFO][4949] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="f7502ed510a935dd175170b6e61a722ab8b65dd9969607662e3ed0d3b9c43650" Namespace="kube-system" Pod="coredns-674b8bbfcf-n7zwr" 
WorkloadEndpoint="ci--4081.3.6--n--7b570e9a3c-k8s-coredns--674b8bbfcf--n7zwr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081.3.6--n--7b570e9a3c-k8s-coredns--674b8bbfcf--n7zwr-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"0607b5b7-e84f-4371-964e-db63a77e29d1", ResourceVersion:"991", Generation:0, CreationTimestamp:time.Date(2026, time.April, 17, 23, 45, 51, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081.3.6-n-7b570e9a3c", ContainerID:"f7502ed510a935dd175170b6e61a722ab8b65dd9969607662e3ed0d3b9c43650", Pod:"coredns-674b8bbfcf-n7zwr", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.71.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali3e07900ecbf", MAC:"de:01:65:64:04:a6", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 17 23:46:54.726388 containerd[1834]: 
2026-04-17 23:46:54.710 [INFO][4949] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="f7502ed510a935dd175170b6e61a722ab8b65dd9969607662e3ed0d3b9c43650" Namespace="kube-system" Pod="coredns-674b8bbfcf-n7zwr" WorkloadEndpoint="ci--4081.3.6--n--7b570e9a3c-k8s-coredns--674b8bbfcf--n7zwr-eth0" Apr 17 23:46:54.739401 containerd[1834]: time="2026-04-17T23:46:54.739358171Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\"" Apr 17 23:46:54.743776 kubelet[3392]: I0417 23:46:54.741797 3392 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cjjz6\" (UniqueName: \"kubernetes.io/projected/96443d8b-6798-4ea2-b154-8b1611207513-kube-api-access-cjjz6\") pod \"96443d8b-6798-4ea2-b154-8b1611207513\" (UID: \"96443d8b-6798-4ea2-b154-8b1611207513\") " Apr 17 23:46:54.743776 kubelet[3392]: I0417 23:46:54.741859 3392 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/96443d8b-6798-4ea2-b154-8b1611207513-whisker-backend-key-pair\") pod \"96443d8b-6798-4ea2-b154-8b1611207513\" (UID: \"96443d8b-6798-4ea2-b154-8b1611207513\") " Apr 17 23:46:54.743776 kubelet[3392]: I0417 23:46:54.741907 3392 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/96443d8b-6798-4ea2-b154-8b1611207513-whisker-ca-bundle\") pod \"96443d8b-6798-4ea2-b154-8b1611207513\" (UID: \"96443d8b-6798-4ea2-b154-8b1611207513\") " Apr 17 23:46:54.743776 kubelet[3392]: I0417 23:46:54.741933 3392 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/96443d8b-6798-4ea2-b154-8b1611207513-nginx-config\") pod \"96443d8b-6798-4ea2-b154-8b1611207513\" (UID: \"96443d8b-6798-4ea2-b154-8b1611207513\") " Apr 17 23:46:54.743776 kubelet[3392]: I0417 23:46:54.742657 3392 operation_generator.go:781] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96443d8b-6798-4ea2-b154-8b1611207513-nginx-config" (OuterVolumeSpecName: "nginx-config") pod "96443d8b-6798-4ea2-b154-8b1611207513" (UID: "96443d8b-6798-4ea2-b154-8b1611207513"). InnerVolumeSpecName "nginx-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 23:46:54.768501 kubelet[3392]: I0417 23:46:54.763192 3392 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96443d8b-6798-4ea2-b154-8b1611207513-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "96443d8b-6798-4ea2-b154-8b1611207513" (UID: "96443d8b-6798-4ea2-b154-8b1611207513"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 23:46:54.768501 kubelet[3392]: I0417 23:46:54.764596 3392 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96443d8b-6798-4ea2-b154-8b1611207513-kube-api-access-cjjz6" (OuterVolumeSpecName: "kube-api-access-cjjz6") pod "96443d8b-6798-4ea2-b154-8b1611207513" (UID: "96443d8b-6798-4ea2-b154-8b1611207513"). InnerVolumeSpecName "kube-api-access-cjjz6". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 23:46:54.770990 kubelet[3392]: I0417 23:46:54.770953 3392 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96443d8b-6798-4ea2-b154-8b1611207513-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "96443d8b-6798-4ea2-b154-8b1611207513" (UID: "96443d8b-6798-4ea2-b154-8b1611207513"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 23:46:54.790942 containerd[1834]: time="2026-04-17T23:46:54.785470042Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Apr 17 23:46:54.790942 containerd[1834]: time="2026-04-17T23:46:54.785552143Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Apr 17 23:46:54.790942 containerd[1834]: time="2026-04-17T23:46:54.785595543Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 17 23:46:54.790942 containerd[1834]: time="2026-04-17T23:46:54.785731645Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 17 23:46:54.833506 containerd[1834]: time="2026-04-17T23:46:54.832970725Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Apr 17 23:46:54.833506 containerd[1834]: time="2026-04-17T23:46:54.833077426Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Apr 17 23:46:54.833506 containerd[1834]: time="2026-04-17T23:46:54.833101126Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 17 23:46:54.833506 containerd[1834]: time="2026-04-17T23:46:54.833217227Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 17 23:46:54.843206 kubelet[3392]: I0417 23:46:54.843158 3392 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/96443d8b-6798-4ea2-b154-8b1611207513-whisker-ca-bundle\") on node \"ci-4081.3.6-n-7b570e9a3c\" DevicePath \"\"" Apr 17 23:46:54.843442 kubelet[3392]: I0417 23:46:54.843385 3392 reconciler_common.go:299] "Volume detached for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/96443d8b-6798-4ea2-b154-8b1611207513-nginx-config\") on node \"ci-4081.3.6-n-7b570e9a3c\" DevicePath \"\"" Apr 17 23:46:54.843442 kubelet[3392]: I0417 23:46:54.843406 3392 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cjjz6\" (UniqueName: \"kubernetes.io/projected/96443d8b-6798-4ea2-b154-8b1611207513-kube-api-access-cjjz6\") on node \"ci-4081.3.6-n-7b570e9a3c\" DevicePath \"\"" Apr 17 23:46:54.843645 kubelet[3392]: I0417 23:46:54.843422 3392 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/96443d8b-6798-4ea2-b154-8b1611207513-whisker-backend-key-pair\") on node \"ci-4081.3.6-n-7b570e9a3c\" DevicePath \"\"" Apr 17 23:46:54.860992 containerd[1834]: time="2026-04-17T23:46:54.860940651Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-568984f78-z9wh9,Uid:8636a294-eef0-499d-a578-9b4ce7de9cb5,Namespace:calico-system,Attempt:1,} returns sandbox id \"20aacf146b6ce3011594b26d87e34caa39855eb963db2434494df845601c587e\"" Apr 17 23:46:54.864723 containerd[1834]: time="2026-04-17T23:46:54.864598780Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-dq7vb,Uid:4c94a196-84f1-4d7e-892d-1fd10da74241,Namespace:kube-system,Attempt:1,} returns sandbox id \"9fb573d97ef657a73498b30439a7b9ce040564d8f4ab028a90397514be41f62d\"" Apr 17 23:46:54.887522 containerd[1834]: time="2026-04-17T23:46:54.885776151Z" level=info 
msg="CreateContainer within sandbox \"9fb573d97ef657a73498b30439a7b9ce040564d8f4ab028a90397514be41f62d\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Apr 17 23:46:54.891687 containerd[1834]: time="2026-04-17T23:46:54.891653198Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-5b85766d88-d6rv9,Uid:4c2d4922-a2ca-483d-b7db-77418f884573,Namespace:calico-system,Attempt:1,} returns sandbox id \"361aac0d20bd0c295b71a8fbe3a13d98119670eb3c6182c0ca6e8ea61511338d\"" Apr 17 23:46:54.921214 containerd[1834]: time="2026-04-17T23:46:54.921174336Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-n7zwr,Uid:0607b5b7-e84f-4371-964e-db63a77e29d1,Namespace:kube-system,Attempt:1,} returns sandbox id \"f7502ed510a935dd175170b6e61a722ab8b65dd9969607662e3ed0d3b9c43650\"" Apr 17 23:46:54.931195 containerd[1834]: time="2026-04-17T23:46:54.931016915Z" level=info msg="CreateContainer within sandbox \"f7502ed510a935dd175170b6e61a722ab8b65dd9969607662e3ed0d3b9c43650\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Apr 17 23:46:54.935861 containerd[1834]: time="2026-04-17T23:46:54.935674453Z" level=info msg="CreateContainer within sandbox \"9fb573d97ef657a73498b30439a7b9ce040564d8f4ab028a90397514be41f62d\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"b14754ebe95e91ea846a0051248834da1b08977bc1fce0206ecf84a210ea3553\"" Apr 17 23:46:54.940432 containerd[1834]: time="2026-04-17T23:46:54.940058688Z" level=info msg="StartContainer for \"b14754ebe95e91ea846a0051248834da1b08977bc1fce0206ecf84a210ea3553\"" Apr 17 23:46:55.026248 containerd[1834]: time="2026-04-17T23:46:55.025157774Z" level=info msg="CreateContainer within sandbox \"f7502ed510a935dd175170b6e61a722ab8b65dd9969607662e3ed0d3b9c43650\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"4275e5e0eeb3ed48e464d41263326d78d6d7ee0162f313f111e7054a86a17fae\"" Apr 17 23:46:55.029463 containerd[1834]: 
time="2026-04-17T23:46:55.028144598Z" level=info msg="StartContainer for \"4275e5e0eeb3ed48e464d41263326d78d6d7ee0162f313f111e7054a86a17fae\"" Apr 17 23:46:55.078497 containerd[1834]: time="2026-04-17T23:46:55.078431303Z" level=info msg="StartContainer for \"b14754ebe95e91ea846a0051248834da1b08977bc1fce0206ecf84a210ea3553\" returns successfully" Apr 17 23:46:55.231466 kubelet[3392]: I0417 23:46:55.230859 3392 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-dq7vb" podStartSLOduration=64.230838132 podStartE2EDuration="1m4.230838132s" podCreationTimestamp="2026-04-17 23:45:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 23:46:55.230110726 +0000 UTC m=+68.956076668" watchObservedRunningTime="2026-04-17 23:46:55.230838132 +0000 UTC m=+68.956804074" Apr 17 23:46:55.232341 systemd[1]: run-netns-cni\x2d5d20a5f7\x2d2c39\x2d2e84\x2de631\x2dcd547d94b8f7.mount: Deactivated successfully. Apr 17 23:46:55.236755 systemd[1]: var-lib-kubelet-pods-96443d8b\x2d6798\x2d4ea2\x2db154\x2d8b1611207513-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dcjjz6.mount: Deactivated successfully. Apr 17 23:46:55.237596 containerd[1834]: time="2026-04-17T23:46:55.237107182Z" level=info msg="StartContainer for \"4275e5e0eeb3ed48e464d41263326d78d6d7ee0162f313f111e7054a86a17fae\" returns successfully" Apr 17 23:46:55.237801 systemd[1]: var-lib-kubelet-pods-96443d8b\x2d6798\x2d4ea2\x2db154\x2d8b1611207513-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. 
Apr 17 23:46:55.455106 kubelet[3392]: I0417 23:46:55.455055 3392 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/5591c7f2-422e-4bc4-819a-54864dfa23c6-nginx-config\") pod \"whisker-69d4444588-wbw54\" (UID: \"5591c7f2-422e-4bc4-819a-54864dfa23c6\") " pod="calico-system/whisker-69d4444588-wbw54" Apr 17 23:46:55.455106 kubelet[3392]: I0417 23:46:55.455111 3392 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5591c7f2-422e-4bc4-819a-54864dfa23c6-whisker-ca-bundle\") pod \"whisker-69d4444588-wbw54\" (UID: \"5591c7f2-422e-4bc4-819a-54864dfa23c6\") " pod="calico-system/whisker-69d4444588-wbw54" Apr 17 23:46:55.455468 kubelet[3392]: I0417 23:46:55.455137 3392 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6b952\" (UniqueName: \"kubernetes.io/projected/5591c7f2-422e-4bc4-819a-54864dfa23c6-kube-api-access-6b952\") pod \"whisker-69d4444588-wbw54\" (UID: \"5591c7f2-422e-4bc4-819a-54864dfa23c6\") " pod="calico-system/whisker-69d4444588-wbw54" Apr 17 23:46:55.455468 kubelet[3392]: I0417 23:46:55.455170 3392 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/5591c7f2-422e-4bc4-819a-54864dfa23c6-whisker-backend-key-pair\") pod \"whisker-69d4444588-wbw54\" (UID: \"5591c7f2-422e-4bc4-819a-54864dfa23c6\") " pod="calico-system/whisker-69d4444588-wbw54" Apr 17 23:46:55.508673 systemd-networkd[1395]: caliadaf23cd2dd: Gained IPv6LL Apr 17 23:46:55.684539 kernel: calico-node[5440]: memfd_create() called without MFD_EXEC or MFD_NOEXEC_SEAL set Apr 17 23:46:55.698646 containerd[1834]: time="2026-04-17T23:46:55.698584785Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:whisker-69d4444588-wbw54,Uid:5591c7f2-422e-4bc4-819a-54864dfa23c6,Namespace:calico-system,Attempt:0,}" Apr 17 23:46:55.892346 systemd-networkd[1395]: cali5288e1165cd: Gained IPv6LL Apr 17 23:46:55.947678 systemd-networkd[1395]: cali4e33bed721f: Link UP Apr 17 23:46:55.947924 systemd-networkd[1395]: cali4e33bed721f: Gained carrier Apr 17 23:46:55.955573 systemd-networkd[1395]: cali3e07900ecbf: Gained IPv6LL Apr 17 23:46:55.966915 containerd[1834]: 2026-04-17 23:46:55.831 [INFO][5502] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081.3.6--n--7b570e9a3c-k8s-whisker--69d4444588--wbw54-eth0 whisker-69d4444588- calico-system 5591c7f2-422e-4bc4-819a-54864dfa23c6 1055 0 2026-04-17 23:46:55 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:69d4444588 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4081.3.6-n-7b570e9a3c whisker-69d4444588-wbw54 eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali4e33bed721f [] [] }} ContainerID="72ed44d83c31d812706379b6831a38bd2fb0764bab07a0ca3c762182420c90f4" Namespace="calico-system" Pod="whisker-69d4444588-wbw54" WorkloadEndpoint="ci--4081.3.6--n--7b570e9a3c-k8s-whisker--69d4444588--wbw54-" Apr 17 23:46:55.966915 containerd[1834]: 2026-04-17 23:46:55.831 [INFO][5502] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="72ed44d83c31d812706379b6831a38bd2fb0764bab07a0ca3c762182420c90f4" Namespace="calico-system" Pod="whisker-69d4444588-wbw54" WorkloadEndpoint="ci--4081.3.6--n--7b570e9a3c-k8s-whisker--69d4444588--wbw54-eth0" Apr 17 23:46:55.966915 containerd[1834]: 2026-04-17 23:46:55.871 [INFO][5514] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="72ed44d83c31d812706379b6831a38bd2fb0764bab07a0ca3c762182420c90f4" 
HandleID="k8s-pod-network.72ed44d83c31d812706379b6831a38bd2fb0764bab07a0ca3c762182420c90f4" Workload="ci--4081.3.6--n--7b570e9a3c-k8s-whisker--69d4444588--wbw54-eth0" Apr 17 23:46:55.966915 containerd[1834]: 2026-04-17 23:46:55.879 [INFO][5514] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="72ed44d83c31d812706379b6831a38bd2fb0764bab07a0ca3c762182420c90f4" HandleID="k8s-pod-network.72ed44d83c31d812706379b6831a38bd2fb0764bab07a0ca3c762182420c90f4" Workload="ci--4081.3.6--n--7b570e9a3c-k8s-whisker--69d4444588--wbw54-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002ef7d0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4081.3.6-n-7b570e9a3c", "pod":"whisker-69d4444588-wbw54", "timestamp":"2026-04-17 23:46:55.871342734 +0000 UTC"}, Hostname:"ci-4081.3.6-n-7b570e9a3c", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc0000f7340)} Apr 17 23:46:55.966915 containerd[1834]: 2026-04-17 23:46:55.879 [INFO][5514] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 17 23:46:55.966915 containerd[1834]: 2026-04-17 23:46:55.879 [INFO][5514] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Apr 17 23:46:55.966915 containerd[1834]: 2026-04-17 23:46:55.879 [INFO][5514] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081.3.6-n-7b570e9a3c' Apr 17 23:46:55.966915 containerd[1834]: 2026-04-17 23:46:55.882 [INFO][5514] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.72ed44d83c31d812706379b6831a38bd2fb0764bab07a0ca3c762182420c90f4" host="ci-4081.3.6-n-7b570e9a3c" Apr 17 23:46:55.966915 containerd[1834]: 2026-04-17 23:46:55.893 [INFO][5514] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4081.3.6-n-7b570e9a3c" Apr 17 23:46:55.966915 containerd[1834]: 2026-04-17 23:46:55.901 [INFO][5514] ipam/ipam.go 526: Trying affinity for 192.168.71.0/26 host="ci-4081.3.6-n-7b570e9a3c" Apr 17 23:46:55.966915 containerd[1834]: 2026-04-17 23:46:55.906 [INFO][5514] ipam/ipam.go 160: Attempting to load block cidr=192.168.71.0/26 host="ci-4081.3.6-n-7b570e9a3c" Apr 17 23:46:55.966915 containerd[1834]: 2026-04-17 23:46:55.908 [INFO][5514] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.71.0/26 host="ci-4081.3.6-n-7b570e9a3c" Apr 17 23:46:55.966915 containerd[1834]: 2026-04-17 23:46:55.909 [INFO][5514] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.71.0/26 handle="k8s-pod-network.72ed44d83c31d812706379b6831a38bd2fb0764bab07a0ca3c762182420c90f4" host="ci-4081.3.6-n-7b570e9a3c" Apr 17 23:46:55.966915 containerd[1834]: 2026-04-17 23:46:55.911 [INFO][5514] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.72ed44d83c31d812706379b6831a38bd2fb0764bab07a0ca3c762182420c90f4 Apr 17 23:46:55.966915 containerd[1834]: 2026-04-17 23:46:55.927 [INFO][5514] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.71.0/26 handle="k8s-pod-network.72ed44d83c31d812706379b6831a38bd2fb0764bab07a0ca3c762182420c90f4" host="ci-4081.3.6-n-7b570e9a3c" Apr 17 23:46:55.966915 containerd[1834]: 2026-04-17 23:46:55.935 [INFO][5514] ipam/ipam.go 1288: Successfully 
claimed IPs: [192.168.71.6/26] block=192.168.71.0/26 handle="k8s-pod-network.72ed44d83c31d812706379b6831a38bd2fb0764bab07a0ca3c762182420c90f4" host="ci-4081.3.6-n-7b570e9a3c" Apr 17 23:46:55.966915 containerd[1834]: 2026-04-17 23:46:55.935 [INFO][5514] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.71.6/26] handle="k8s-pod-network.72ed44d83c31d812706379b6831a38bd2fb0764bab07a0ca3c762182420c90f4" host="ci-4081.3.6-n-7b570e9a3c" Apr 17 23:46:55.966915 containerd[1834]: 2026-04-17 23:46:55.935 [INFO][5514] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 17 23:46:55.966915 containerd[1834]: 2026-04-17 23:46:55.935 [INFO][5514] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.71.6/26] IPv6=[] ContainerID="72ed44d83c31d812706379b6831a38bd2fb0764bab07a0ca3c762182420c90f4" HandleID="k8s-pod-network.72ed44d83c31d812706379b6831a38bd2fb0764bab07a0ca3c762182420c90f4" Workload="ci--4081.3.6--n--7b570e9a3c-k8s-whisker--69d4444588--wbw54-eth0" Apr 17 23:46:55.967944 containerd[1834]: 2026-04-17 23:46:55.940 [INFO][5502] cni-plugin/k8s.go 418: Populated endpoint ContainerID="72ed44d83c31d812706379b6831a38bd2fb0764bab07a0ca3c762182420c90f4" Namespace="calico-system" Pod="whisker-69d4444588-wbw54" WorkloadEndpoint="ci--4081.3.6--n--7b570e9a3c-k8s-whisker--69d4444588--wbw54-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081.3.6--n--7b570e9a3c-k8s-whisker--69d4444588--wbw54-eth0", GenerateName:"whisker-69d4444588-", Namespace:"calico-system", SelfLink:"", UID:"5591c7f2-422e-4bc4-819a-54864dfa23c6", ResourceVersion:"1055", Generation:0, CreationTimestamp:time.Date(2026, time.April, 17, 23, 46, 55, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"69d4444588", "projectcalico.org/namespace":"calico-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081.3.6-n-7b570e9a3c", ContainerID:"", Pod:"whisker-69d4444588-wbw54", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.71.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali4e33bed721f", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 17 23:46:55.967944 containerd[1834]: 2026-04-17 23:46:55.940 [INFO][5502] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.71.6/32] ContainerID="72ed44d83c31d812706379b6831a38bd2fb0764bab07a0ca3c762182420c90f4" Namespace="calico-system" Pod="whisker-69d4444588-wbw54" WorkloadEndpoint="ci--4081.3.6--n--7b570e9a3c-k8s-whisker--69d4444588--wbw54-eth0" Apr 17 23:46:55.967944 containerd[1834]: 2026-04-17 23:46:55.940 [INFO][5502] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali4e33bed721f ContainerID="72ed44d83c31d812706379b6831a38bd2fb0764bab07a0ca3c762182420c90f4" Namespace="calico-system" Pod="whisker-69d4444588-wbw54" WorkloadEndpoint="ci--4081.3.6--n--7b570e9a3c-k8s-whisker--69d4444588--wbw54-eth0" Apr 17 23:46:55.967944 containerd[1834]: 2026-04-17 23:46:55.946 [INFO][5502] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="72ed44d83c31d812706379b6831a38bd2fb0764bab07a0ca3c762182420c90f4" Namespace="calico-system" Pod="whisker-69d4444588-wbw54" WorkloadEndpoint="ci--4081.3.6--n--7b570e9a3c-k8s-whisker--69d4444588--wbw54-eth0" Apr 17 23:46:55.967944 containerd[1834]: 2026-04-17 23:46:55.946 [INFO][5502] cni-plugin/k8s.go 446: Added Mac, interface name, and active container 
ID to endpoint ContainerID="72ed44d83c31d812706379b6831a38bd2fb0764bab07a0ca3c762182420c90f4" Namespace="calico-system" Pod="whisker-69d4444588-wbw54" WorkloadEndpoint="ci--4081.3.6--n--7b570e9a3c-k8s-whisker--69d4444588--wbw54-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081.3.6--n--7b570e9a3c-k8s-whisker--69d4444588--wbw54-eth0", GenerateName:"whisker-69d4444588-", Namespace:"calico-system", SelfLink:"", UID:"5591c7f2-422e-4bc4-819a-54864dfa23c6", ResourceVersion:"1055", Generation:0, CreationTimestamp:time.Date(2026, time.April, 17, 23, 46, 55, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"69d4444588", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081.3.6-n-7b570e9a3c", ContainerID:"72ed44d83c31d812706379b6831a38bd2fb0764bab07a0ca3c762182420c90f4", Pod:"whisker-69d4444588-wbw54", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.71.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali4e33bed721f", MAC:"72:7f:aa:2f:86:df", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 17 23:46:55.967944 containerd[1834]: 2026-04-17 23:46:55.963 [INFO][5502] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="72ed44d83c31d812706379b6831a38bd2fb0764bab07a0ca3c762182420c90f4" Namespace="calico-system" 
Pod="whisker-69d4444588-wbw54" WorkloadEndpoint="ci--4081.3.6--n--7b570e9a3c-k8s-whisker--69d4444588--wbw54-eth0" Apr 17 23:46:56.013688 containerd[1834]: time="2026-04-17T23:46:56.011114449Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Apr 17 23:46:56.013688 containerd[1834]: time="2026-04-17T23:46:56.011180650Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Apr 17 23:46:56.013688 containerd[1834]: time="2026-04-17T23:46:56.011202250Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 17 23:46:56.013688 containerd[1834]: time="2026-04-17T23:46:56.012439262Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 17 23:46:56.019636 systemd-networkd[1395]: cali1bf13c3a279: Gained IPv6LL Apr 17 23:46:56.088241 containerd[1834]: time="2026-04-17T23:46:56.087884226Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-69d4444588-wbw54,Uid:5591c7f2-422e-4bc4-819a-54864dfa23c6,Namespace:calico-system,Attempt:0,} returns sandbox id \"72ed44d83c31d812706379b6831a38bd2fb0764bab07a0ca3c762182420c90f4\"" Apr 17 23:46:56.265586 kubelet[3392]: I0417 23:46:56.264283 3392 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-n7zwr" podStartSLOduration=65.263778506 podStartE2EDuration="1m5.263778506s" podCreationTimestamp="2026-04-17 23:45:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 23:46:56.256366831 +0000 UTC m=+69.982332773" watchObservedRunningTime="2026-04-17 23:46:56.263778506 +0000 UTC m=+69.989744548" Apr 17 23:46:56.380159 kubelet[3392]: I0417 23:46:56.379197 3392 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96443d8b-6798-4ea2-b154-8b1611207513" path="/var/lib/kubelet/pods/96443d8b-6798-4ea2-b154-8b1611207513/volumes" Apr 17 23:46:56.505335 systemd-networkd[1395]: vxlan.calico: Link UP Apr 17 23:46:56.505343 systemd-networkd[1395]: vxlan.calico: Gained carrier Apr 17 23:46:56.531892 systemd-networkd[1395]: cali8d2ca757d64: Gained IPv6LL Apr 17 23:46:57.363634 systemd-networkd[1395]: cali4e33bed721f: Gained IPv6LL Apr 17 23:46:57.811666 systemd-networkd[1395]: vxlan.calico: Gained IPv6LL Apr 17 23:46:58.103157 containerd[1834]: time="2026-04-17T23:46:58.102865922Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 17 23:46:58.106781 containerd[1834]: time="2026-04-17T23:46:58.106612060Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.31.4: active requests=0, bytes read=48415780" Apr 17 23:46:58.110155 containerd[1834]: time="2026-04-17T23:46:58.110098595Z" level=info msg="ImageCreate event name:\"sha256:f7ff80340b9b4973ceda29859065985831588b2898f2b4009f742b5789010898\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 17 23:46:58.115800 containerd[1834]: time="2026-04-17T23:46:58.115734252Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:d212af1da3dd52a633bc9e36653a7d901d95a570f8d51d1968a837dcf6879730\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 17 23:46:58.116652 containerd[1834]: time="2026-04-17T23:46:58.116467760Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" with image id \"sha256:f7ff80340b9b4973ceda29859065985831588b2898f2b4009f742b5789010898\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:d212af1da3dd52a633bc9e36653a7d901d95a570f8d51d1968a837dcf6879730\", size \"49971841\" in 3.377061589s" Apr 17 23:46:58.116652 
containerd[1834]: time="2026-04-17T23:46:58.116530260Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" returns image reference \"sha256:f7ff80340b9b4973ceda29859065985831588b2898f2b4009f742b5789010898\"" Apr 17 23:46:58.117910 containerd[1834]: time="2026-04-17T23:46:58.117726873Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\"" Apr 17 23:46:58.125122 containerd[1834]: time="2026-04-17T23:46:58.125093547Z" level=info msg="CreateContainer within sandbox \"afdcccff0658b909b964d5e78e62e613398a84a1c78e31e6f222064f25c410e3\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Apr 17 23:46:58.157074 containerd[1834]: time="2026-04-17T23:46:58.157023870Z" level=info msg="CreateContainer within sandbox \"afdcccff0658b909b964d5e78e62e613398a84a1c78e31e6f222064f25c410e3\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"75307a308725f040794e385ae11a96043499676e0b157f4a92f49998c503cd73\"" Apr 17 23:46:58.157912 containerd[1834]: time="2026-04-17T23:46:58.157632076Z" level=info msg="StartContainer for \"75307a308725f040794e385ae11a96043499676e0b157f4a92f49998c503cd73\"" Apr 17 23:46:58.239416 containerd[1834]: time="2026-04-17T23:46:58.239321203Z" level=info msg="StartContainer for \"75307a308725f040794e385ae11a96043499676e0b157f4a92f49998c503cd73\" returns successfully" Apr 17 23:46:58.266514 kubelet[3392]: I0417 23:46:58.262916 3392 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-apiserver-568984f78-qbl55" podStartSLOduration=52.882181921 podStartE2EDuration="56.262895542s" podCreationTimestamp="2026-04-17 23:46:02 +0000 UTC" firstStartedPulling="2026-04-17 23:46:54.73683585 +0000 UTC m=+68.462801792" lastFinishedPulling="2026-04-17 23:46:58.117549371 +0000 UTC m=+71.843515413" observedRunningTime="2026-04-17 23:46:58.262634539 +0000 UTC m=+71.988600481" watchObservedRunningTime="2026-04-17 23:46:58.262895542 +0000 UTC m=+71.988861584" 
Apr 17 23:46:58.439596 containerd[1834]: time="2026-04-17T23:46:58.439445229Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 17 23:46:58.443180 containerd[1834]: time="2026-04-17T23:46:58.443129366Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.31.4: active requests=0, bytes read=77" Apr 17 23:46:58.445747 containerd[1834]: time="2026-04-17T23:46:58.445705992Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" with image id \"sha256:f7ff80340b9b4973ceda29859065985831588b2898f2b4009f742b5789010898\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:d212af1da3dd52a633bc9e36653a7d901d95a570f8d51d1968a837dcf6879730\", size \"49971841\" in 327.944419ms" Apr 17 23:46:58.445875 containerd[1834]: time="2026-04-17T23:46:58.445763293Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" returns image reference \"sha256:f7ff80340b9b4973ceda29859065985831588b2898f2b4009f742b5789010898\"" Apr 17 23:46:58.449602 containerd[1834]: time="2026-04-17T23:46:58.447718813Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.31.4\"" Apr 17 23:46:58.454036 containerd[1834]: time="2026-04-17T23:46:58.454002576Z" level=info msg="CreateContainer within sandbox \"20aacf146b6ce3011594b26d87e34caa39855eb963db2434494df845601c587e\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Apr 17 23:46:58.488083 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1703902496.mount: Deactivated successfully. 
Apr 17 23:46:58.507369 containerd[1834]: time="2026-04-17T23:46:58.506376906Z" level=info msg="CreateContainer within sandbox \"20aacf146b6ce3011594b26d87e34caa39855eb963db2434494df845601c587e\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"35dc3cafabffd6975b2ac7deadaf963cd4a12b6bbcbf4f031f73c6cca6289654\"" Apr 17 23:46:58.512275 containerd[1834]: time="2026-04-17T23:46:58.511154555Z" level=info msg="StartContainer for \"35dc3cafabffd6975b2ac7deadaf963cd4a12b6bbcbf4f031f73c6cca6289654\"" Apr 17 23:46:58.603509 containerd[1834]: time="2026-04-17T23:46:58.603124886Z" level=info msg="StartContainer for \"35dc3cafabffd6975b2ac7deadaf963cd4a12b6bbcbf4f031f73c6cca6289654\" returns successfully" Apr 17 23:46:59.286441 kubelet[3392]: I0417 23:46:59.286127 3392 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-apiserver-568984f78-z9wh9" podStartSLOduration=53.70439078 podStartE2EDuration="57.286103099s" podCreationTimestamp="2026-04-17 23:46:02 +0000 UTC" firstStartedPulling="2026-04-17 23:46:54.865188085 +0000 UTC m=+68.591154127" lastFinishedPulling="2026-04-17 23:46:58.446900404 +0000 UTC m=+72.172866446" observedRunningTime="2026-04-17 23:46:59.284768186 +0000 UTC m=+73.010734128" watchObservedRunningTime="2026-04-17 23:46:59.286103099 +0000 UTC m=+73.012069041" Apr 17 23:47:01.040613 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1782325499.mount: Deactivated successfully. 
Apr 17 23:47:01.586048 containerd[1834]: time="2026-04-17T23:47:01.585994779Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 17 23:47:01.589512 containerd[1834]: time="2026-04-17T23:47:01.589339513Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.31.4: active requests=0, bytes read=55623386" Apr 17 23:47:01.592553 containerd[1834]: time="2026-04-17T23:47:01.592496945Z" level=info msg="ImageCreate event name:\"sha256:714983e5e920bbe810fab04d9f06bd16ef4e552b0d2deffd7ab2b2c4a001acbb\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 17 23:47:01.597194 containerd[1834]: time="2026-04-17T23:47:01.597085491Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:44395ca5ebfe88f21ed51acfbec5fc0f31d2762966e2007a0a2eb9b30e35fc4d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 17 23:47:01.598310 containerd[1834]: time="2026-04-17T23:47:01.597859199Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.31.4\" with image id \"sha256:714983e5e920bbe810fab04d9f06bd16ef4e552b0d2deffd7ab2b2c4a001acbb\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:44395ca5ebfe88f21ed51acfbec5fc0f31d2762966e2007a0a2eb9b30e35fc4d\", size \"55623232\" in 3.150105786s" Apr 17 23:47:01.598310 containerd[1834]: time="2026-04-17T23:47:01.597905300Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.31.4\" returns image reference \"sha256:714983e5e920bbe810fab04d9f06bd16ef4e552b0d2deffd7ab2b2c4a001acbb\"" Apr 17 23:47:01.600516 containerd[1834]: time="2026-04-17T23:47:01.599429215Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.31.4\"" Apr 17 23:47:01.606628 containerd[1834]: time="2026-04-17T23:47:01.606597588Z" level=info msg="CreateContainer within sandbox 
\"361aac0d20bd0c295b71a8fbe3a13d98119670eb3c6182c0ca6e8ea61511338d\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Apr 17 23:47:01.636394 containerd[1834]: time="2026-04-17T23:47:01.636351489Z" level=info msg="CreateContainer within sandbox \"361aac0d20bd0c295b71a8fbe3a13d98119670eb3c6182c0ca6e8ea61511338d\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"fb621de497976cc9d96504cb3006ff8aafd6f331fc1ac0a213d1e76fe8764b20\"" Apr 17 23:47:01.638468 containerd[1834]: time="2026-04-17T23:47:01.637047396Z" level=info msg="StartContainer for \"fb621de497976cc9d96504cb3006ff8aafd6f331fc1ac0a213d1e76fe8764b20\"" Apr 17 23:47:01.721591 systemd[1]: run-containerd-runc-k8s.io-fb621de497976cc9d96504cb3006ff8aafd6f331fc1ac0a213d1e76fe8764b20-runc.IE4a17.mount: Deactivated successfully. Apr 17 23:47:01.828284 containerd[1834]: time="2026-04-17T23:47:01.828221431Z" level=info msg="StartContainer for \"fb621de497976cc9d96504cb3006ff8aafd6f331fc1ac0a213d1e76fe8764b20\" returns successfully" Apr 17 23:47:02.293106 kubelet[3392]: I0417 23:47:02.292566 3392 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-5b85766d88-d6rv9" podStartSLOduration=53.587702139 podStartE2EDuration="1m0.292542131s" podCreationTimestamp="2026-04-17 23:46:02 +0000 UTC" firstStartedPulling="2026-04-17 23:46:54.894060818 +0000 UTC m=+68.620026760" lastFinishedPulling="2026-04-17 23:47:01.59890061 +0000 UTC m=+75.324866752" observedRunningTime="2026-04-17 23:47:02.287996485 +0000 UTC m=+76.013962527" watchObservedRunningTime="2026-04-17 23:47:02.292542131 +0000 UTC m=+76.018508073" Apr 17 23:47:03.073928 containerd[1834]: time="2026-04-17T23:47:03.073877140Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 17 23:47:03.076942 containerd[1834]: time="2026-04-17T23:47:03.076887870Z" level=info msg="stop pulling image 
ghcr.io/flatcar/calico/whisker:v3.31.4: active requests=0, bytes read=6039889" Apr 17 23:47:03.080928 containerd[1834]: time="2026-04-17T23:47:03.080772209Z" level=info msg="ImageCreate event name:\"sha256:c02b0051502f3aa7f0815d838ea93b53dfb6bd13f185d229260e08200daf7cf7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 17 23:47:03.086245 containerd[1834]: time="2026-04-17T23:47:03.086196664Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:9690cd395efad501f2e0c40ce4969d87b736ae2e5ed454644e7b0fd8f756bfbc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 17 23:47:03.087458 containerd[1834]: time="2026-04-17T23:47:03.086911572Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.31.4\" with image id \"sha256:c02b0051502f3aa7f0815d838ea93b53dfb6bd13f185d229260e08200daf7cf7\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:9690cd395efad501f2e0c40ce4969d87b736ae2e5ed454644e7b0fd8f756bfbc\", size \"7595926\" in 1.487449356s" Apr 17 23:47:03.087458 containerd[1834]: time="2026-04-17T23:47:03.086949172Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.31.4\" returns image reference \"sha256:c02b0051502f3aa7f0815d838ea93b53dfb6bd13f185d229260e08200daf7cf7\"" Apr 17 23:47:03.093887 containerd[1834]: time="2026-04-17T23:47:03.093860742Z" level=info msg="CreateContainer within sandbox \"72ed44d83c31d812706379b6831a38bd2fb0764bab07a0ca3c762182420c90f4\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Apr 17 23:47:03.129870 containerd[1834]: time="2026-04-17T23:47:03.129832706Z" level=info msg="CreateContainer within sandbox \"72ed44d83c31d812706379b6831a38bd2fb0764bab07a0ca3c762182420c90f4\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"781740046e92bf80d48161259e82f3d4d01b4509f0640029da9b6240d9341099\"" Apr 17 23:47:03.130576 containerd[1834]: time="2026-04-17T23:47:03.130388512Z" level=info 
msg="StartContainer for \"781740046e92bf80d48161259e82f3d4d01b4509f0640029da9b6240d9341099\"" Apr 17 23:47:03.203281 containerd[1834]: time="2026-04-17T23:47:03.203067047Z" level=info msg="StartContainer for \"781740046e92bf80d48161259e82f3d4d01b4509f0640029da9b6240d9341099\" returns successfully" Apr 17 23:47:03.205565 containerd[1834]: time="2026-04-17T23:47:03.205329370Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\"" Apr 17 23:47:04.373469 containerd[1834]: time="2026-04-17T23:47:04.372308554Z" level=info msg="StopPodSandbox for \"83deb6bfb5c72f63a8fcb193d3bee315ea0a6af1ebd6a9fb9ae8c9bed2ea9316\"" Apr 17 23:47:04.475519 containerd[1834]: 2026-04-17 23:47:04.420 [INFO][5913] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="83deb6bfb5c72f63a8fcb193d3bee315ea0a6af1ebd6a9fb9ae8c9bed2ea9316" Apr 17 23:47:04.475519 containerd[1834]: 2026-04-17 23:47:04.420 [INFO][5913] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="83deb6bfb5c72f63a8fcb193d3bee315ea0a6af1ebd6a9fb9ae8c9bed2ea9316" iface="eth0" netns="/var/run/netns/cni-002e4329-53c7-67fb-91c2-9b257fbbd896" Apr 17 23:47:04.475519 containerd[1834]: 2026-04-17 23:47:04.421 [INFO][5913] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="83deb6bfb5c72f63a8fcb193d3bee315ea0a6af1ebd6a9fb9ae8c9bed2ea9316" iface="eth0" netns="/var/run/netns/cni-002e4329-53c7-67fb-91c2-9b257fbbd896" Apr 17 23:47:04.475519 containerd[1834]: 2026-04-17 23:47:04.421 [INFO][5913] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="83deb6bfb5c72f63a8fcb193d3bee315ea0a6af1ebd6a9fb9ae8c9bed2ea9316" iface="eth0" netns="/var/run/netns/cni-002e4329-53c7-67fb-91c2-9b257fbbd896" Apr 17 23:47:04.475519 containerd[1834]: 2026-04-17 23:47:04.421 [INFO][5913] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="83deb6bfb5c72f63a8fcb193d3bee315ea0a6af1ebd6a9fb9ae8c9bed2ea9316" Apr 17 23:47:04.475519 containerd[1834]: 2026-04-17 23:47:04.421 [INFO][5913] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="83deb6bfb5c72f63a8fcb193d3bee315ea0a6af1ebd6a9fb9ae8c9bed2ea9316" Apr 17 23:47:04.475519 containerd[1834]: 2026-04-17 23:47:04.459 [INFO][5920] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="83deb6bfb5c72f63a8fcb193d3bee315ea0a6af1ebd6a9fb9ae8c9bed2ea9316" HandleID="k8s-pod-network.83deb6bfb5c72f63a8fcb193d3bee315ea0a6af1ebd6a9fb9ae8c9bed2ea9316" Workload="ci--4081.3.6--n--7b570e9a3c-k8s-calico--kube--controllers--7479bd7d89--zgl8b-eth0" Apr 17 23:47:04.475519 containerd[1834]: 2026-04-17 23:47:04.459 [INFO][5920] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 17 23:47:04.475519 containerd[1834]: 2026-04-17 23:47:04.459 [INFO][5920] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 17 23:47:04.475519 containerd[1834]: 2026-04-17 23:47:04.469 [WARNING][5920] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="83deb6bfb5c72f63a8fcb193d3bee315ea0a6af1ebd6a9fb9ae8c9bed2ea9316" HandleID="k8s-pod-network.83deb6bfb5c72f63a8fcb193d3bee315ea0a6af1ebd6a9fb9ae8c9bed2ea9316" Workload="ci--4081.3.6--n--7b570e9a3c-k8s-calico--kube--controllers--7479bd7d89--zgl8b-eth0" Apr 17 23:47:04.475519 containerd[1834]: 2026-04-17 23:47:04.469 [INFO][5920] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="83deb6bfb5c72f63a8fcb193d3bee315ea0a6af1ebd6a9fb9ae8c9bed2ea9316" HandleID="k8s-pod-network.83deb6bfb5c72f63a8fcb193d3bee315ea0a6af1ebd6a9fb9ae8c9bed2ea9316" Workload="ci--4081.3.6--n--7b570e9a3c-k8s-calico--kube--controllers--7479bd7d89--zgl8b-eth0" Apr 17 23:47:04.475519 containerd[1834]: 2026-04-17 23:47:04.470 [INFO][5920] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 17 23:47:04.475519 containerd[1834]: 2026-04-17 23:47:04.472 [INFO][5913] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="83deb6bfb5c72f63a8fcb193d3bee315ea0a6af1ebd6a9fb9ae8c9bed2ea9316" Apr 17 23:47:04.476469 containerd[1834]: time="2026-04-17T23:47:04.476376365Z" level=info msg="TearDown network for sandbox \"83deb6bfb5c72f63a8fcb193d3bee315ea0a6af1ebd6a9fb9ae8c9bed2ea9316\" successfully" Apr 17 23:47:04.476469 containerd[1834]: time="2026-04-17T23:47:04.476416965Z" level=info msg="StopPodSandbox for \"83deb6bfb5c72f63a8fcb193d3bee315ea0a6af1ebd6a9fb9ae8c9bed2ea9316\" returns successfully" Apr 17 23:47:04.478592 systemd[1]: run-netns-cni\x2d002e4329\x2d53c7\x2d67fb\x2d91c2\x2d9b257fbbd896.mount: Deactivated successfully. 
Apr 17 23:47:04.479999 containerd[1834]: time="2026-04-17T23:47:04.478827800Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7479bd7d89-zgl8b,Uid:c6a1e0ff-d681-42ba-8cbb-c8e3f8849e1f,Namespace:calico-system,Attempt:1,}" Apr 17 23:47:04.771391 systemd-networkd[1395]: calid92edb8d9a1: Link UP Apr 17 23:47:04.773009 systemd-networkd[1395]: calid92edb8d9a1: Gained carrier Apr 17 23:47:04.817597 containerd[1834]: 2026-04-17 23:47:04.555 [INFO][5928] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081.3.6--n--7b570e9a3c-k8s-calico--kube--controllers--7479bd7d89--zgl8b-eth0 calico-kube-controllers-7479bd7d89- calico-system c6a1e0ff-d681-42ba-8cbb-c8e3f8849e1f 1126 0 2026-04-17 23:46:03 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:7479bd7d89 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4081.3.6-n-7b570e9a3c calico-kube-controllers-7479bd7d89-zgl8b eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] calid92edb8d9a1 [] [] }} ContainerID="f180c78efe211eeb00e5a8675073f51b52fc2f4688c1ce4bb527a1a0386a6060" Namespace="calico-system" Pod="calico-kube-controllers-7479bd7d89-zgl8b" WorkloadEndpoint="ci--4081.3.6--n--7b570e9a3c-k8s-calico--kube--controllers--7479bd7d89--zgl8b-" Apr 17 23:47:04.817597 containerd[1834]: 2026-04-17 23:47:04.556 [INFO][5928] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="f180c78efe211eeb00e5a8675073f51b52fc2f4688c1ce4bb527a1a0386a6060" Namespace="calico-system" Pod="calico-kube-controllers-7479bd7d89-zgl8b" WorkloadEndpoint="ci--4081.3.6--n--7b570e9a3c-k8s-calico--kube--controllers--7479bd7d89--zgl8b-eth0" Apr 17 23:47:04.817597 containerd[1834]: 2026-04-17 23:47:04.594 [INFO][5941] ipam/ipam_plugin.go 235: 
Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="f180c78efe211eeb00e5a8675073f51b52fc2f4688c1ce4bb527a1a0386a6060" HandleID="k8s-pod-network.f180c78efe211eeb00e5a8675073f51b52fc2f4688c1ce4bb527a1a0386a6060" Workload="ci--4081.3.6--n--7b570e9a3c-k8s-calico--kube--controllers--7479bd7d89--zgl8b-eth0" Apr 17 23:47:04.817597 containerd[1834]: 2026-04-17 23:47:04.611 [INFO][5941] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="f180c78efe211eeb00e5a8675073f51b52fc2f4688c1ce4bb527a1a0386a6060" HandleID="k8s-pod-network.f180c78efe211eeb00e5a8675073f51b52fc2f4688c1ce4bb527a1a0386a6060" Workload="ci--4081.3.6--n--7b570e9a3c-k8s-calico--kube--controllers--7479bd7d89--zgl8b-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000261410), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4081.3.6-n-7b570e9a3c", "pod":"calico-kube-controllers-7479bd7d89-zgl8b", "timestamp":"2026-04-17 23:47:04.594638382 +0000 UTC"}, Hostname:"ci-4081.3.6-n-7b570e9a3c", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc000212dc0)} Apr 17 23:47:04.817597 containerd[1834]: 2026-04-17 23:47:04.611 [INFO][5941] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 17 23:47:04.817597 containerd[1834]: 2026-04-17 23:47:04.611 [INFO][5941] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Apr 17 23:47:04.817597 containerd[1834]: 2026-04-17 23:47:04.611 [INFO][5941] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081.3.6-n-7b570e9a3c' Apr 17 23:47:04.817597 containerd[1834]: 2026-04-17 23:47:04.617 [INFO][5941] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.f180c78efe211eeb00e5a8675073f51b52fc2f4688c1ce4bb527a1a0386a6060" host="ci-4081.3.6-n-7b570e9a3c" Apr 17 23:47:04.817597 containerd[1834]: 2026-04-17 23:47:04.623 [INFO][5941] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4081.3.6-n-7b570e9a3c" Apr 17 23:47:04.817597 containerd[1834]: 2026-04-17 23:47:04.631 [INFO][5941] ipam/ipam.go 526: Trying affinity for 192.168.71.0/26 host="ci-4081.3.6-n-7b570e9a3c" Apr 17 23:47:04.817597 containerd[1834]: 2026-04-17 23:47:04.640 [INFO][5941] ipam/ipam.go 160: Attempting to load block cidr=192.168.71.0/26 host="ci-4081.3.6-n-7b570e9a3c" Apr 17 23:47:04.817597 containerd[1834]: 2026-04-17 23:47:04.646 [INFO][5941] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.71.0/26 host="ci-4081.3.6-n-7b570e9a3c" Apr 17 23:47:04.817597 containerd[1834]: 2026-04-17 23:47:04.646 [INFO][5941] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.71.0/26 handle="k8s-pod-network.f180c78efe211eeb00e5a8675073f51b52fc2f4688c1ce4bb527a1a0386a6060" host="ci-4081.3.6-n-7b570e9a3c" Apr 17 23:47:04.817597 containerd[1834]: 2026-04-17 23:47:04.649 [INFO][5941] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.f180c78efe211eeb00e5a8675073f51b52fc2f4688c1ce4bb527a1a0386a6060 Apr 17 23:47:04.817597 containerd[1834]: 2026-04-17 23:47:04.708 [INFO][5941] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.71.0/26 handle="k8s-pod-network.f180c78efe211eeb00e5a8675073f51b52fc2f4688c1ce4bb527a1a0386a6060" host="ci-4081.3.6-n-7b570e9a3c" Apr 17 23:47:04.817597 containerd[1834]: 2026-04-17 23:47:04.756 [INFO][5941] ipam/ipam.go 1288: Successfully 
claimed IPs: [192.168.71.7/26] block=192.168.71.0/26 handle="k8s-pod-network.f180c78efe211eeb00e5a8675073f51b52fc2f4688c1ce4bb527a1a0386a6060" host="ci-4081.3.6-n-7b570e9a3c" Apr 17 23:47:04.817597 containerd[1834]: 2026-04-17 23:47:04.756 [INFO][5941] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.71.7/26] handle="k8s-pod-network.f180c78efe211eeb00e5a8675073f51b52fc2f4688c1ce4bb527a1a0386a6060" host="ci-4081.3.6-n-7b570e9a3c" Apr 17 23:47:04.817597 containerd[1834]: 2026-04-17 23:47:04.757 [INFO][5941] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 17 23:47:04.817597 containerd[1834]: 2026-04-17 23:47:04.757 [INFO][5941] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.71.7/26] IPv6=[] ContainerID="f180c78efe211eeb00e5a8675073f51b52fc2f4688c1ce4bb527a1a0386a6060" HandleID="k8s-pod-network.f180c78efe211eeb00e5a8675073f51b52fc2f4688c1ce4bb527a1a0386a6060" Workload="ci--4081.3.6--n--7b570e9a3c-k8s-calico--kube--controllers--7479bd7d89--zgl8b-eth0" Apr 17 23:47:04.819923 containerd[1834]: 2026-04-17 23:47:04.760 [INFO][5928] cni-plugin/k8s.go 418: Populated endpoint ContainerID="f180c78efe211eeb00e5a8675073f51b52fc2f4688c1ce4bb527a1a0386a6060" Namespace="calico-system" Pod="calico-kube-controllers-7479bd7d89-zgl8b" WorkloadEndpoint="ci--4081.3.6--n--7b570e9a3c-k8s-calico--kube--controllers--7479bd7d89--zgl8b-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081.3.6--n--7b570e9a3c-k8s-calico--kube--controllers--7479bd7d89--zgl8b-eth0", GenerateName:"calico-kube-controllers-7479bd7d89-", Namespace:"calico-system", SelfLink:"", UID:"c6a1e0ff-d681-42ba-8cbb-c8e3f8849e1f", ResourceVersion:"1126", Generation:0, CreationTimestamp:time.Date(2026, time.April, 17, 23, 46, 3, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", 
"k8s-app":"calico-kube-controllers", "pod-template-hash":"7479bd7d89", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081.3.6-n-7b570e9a3c", ContainerID:"", Pod:"calico-kube-controllers-7479bd7d89-zgl8b", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.71.7/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calid92edb8d9a1", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 17 23:47:04.819923 containerd[1834]: 2026-04-17 23:47:04.760 [INFO][5928] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.71.7/32] ContainerID="f180c78efe211eeb00e5a8675073f51b52fc2f4688c1ce4bb527a1a0386a6060" Namespace="calico-system" Pod="calico-kube-controllers-7479bd7d89-zgl8b" WorkloadEndpoint="ci--4081.3.6--n--7b570e9a3c-k8s-calico--kube--controllers--7479bd7d89--zgl8b-eth0" Apr 17 23:47:04.819923 containerd[1834]: 2026-04-17 23:47:04.761 [INFO][5928] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calid92edb8d9a1 ContainerID="f180c78efe211eeb00e5a8675073f51b52fc2f4688c1ce4bb527a1a0386a6060" Namespace="calico-system" Pod="calico-kube-controllers-7479bd7d89-zgl8b" WorkloadEndpoint="ci--4081.3.6--n--7b570e9a3c-k8s-calico--kube--controllers--7479bd7d89--zgl8b-eth0" Apr 17 23:47:04.819923 containerd[1834]: 2026-04-17 23:47:04.768 [INFO][5928] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="f180c78efe211eeb00e5a8675073f51b52fc2f4688c1ce4bb527a1a0386a6060" Namespace="calico-system" 
Pod="calico-kube-controllers-7479bd7d89-zgl8b" WorkloadEndpoint="ci--4081.3.6--n--7b570e9a3c-k8s-calico--kube--controllers--7479bd7d89--zgl8b-eth0" Apr 17 23:47:04.819923 containerd[1834]: 2026-04-17 23:47:04.776 [INFO][5928] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="f180c78efe211eeb00e5a8675073f51b52fc2f4688c1ce4bb527a1a0386a6060" Namespace="calico-system" Pod="calico-kube-controllers-7479bd7d89-zgl8b" WorkloadEndpoint="ci--4081.3.6--n--7b570e9a3c-k8s-calico--kube--controllers--7479bd7d89--zgl8b-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081.3.6--n--7b570e9a3c-k8s-calico--kube--controllers--7479bd7d89--zgl8b-eth0", GenerateName:"calico-kube-controllers-7479bd7d89-", Namespace:"calico-system", SelfLink:"", UID:"c6a1e0ff-d681-42ba-8cbb-c8e3f8849e1f", ResourceVersion:"1126", Generation:0, CreationTimestamp:time.Date(2026, time.April, 17, 23, 46, 3, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"7479bd7d89", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081.3.6-n-7b570e9a3c", ContainerID:"f180c78efe211eeb00e5a8675073f51b52fc2f4688c1ce4bb527a1a0386a6060", Pod:"calico-kube-controllers-7479bd7d89-zgl8b", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.71.7/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", 
"ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calid92edb8d9a1", MAC:"7e:8d:d0:8c:32:c1", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 17 23:47:04.819923 containerd[1834]: 2026-04-17 23:47:04.812 [INFO][5928] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="f180c78efe211eeb00e5a8675073f51b52fc2f4688c1ce4bb527a1a0386a6060" Namespace="calico-system" Pod="calico-kube-controllers-7479bd7d89-zgl8b" WorkloadEndpoint="ci--4081.3.6--n--7b570e9a3c-k8s-calico--kube--controllers--7479bd7d89--zgl8b-eth0" Apr 17 23:47:04.884450 containerd[1834]: time="2026-04-17T23:47:04.884307088Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Apr 17 23:47:04.884450 containerd[1834]: time="2026-04-17T23:47:04.884402990Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Apr 17 23:47:04.885374 containerd[1834]: time="2026-04-17T23:47:04.884456090Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 17 23:47:04.885374 containerd[1834]: time="2026-04-17T23:47:04.885188801Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 17 23:47:04.975052 containerd[1834]: time="2026-04-17T23:47:04.974992605Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7479bd7d89-zgl8b,Uid:c6a1e0ff-d681-42ba-8cbb-c8e3f8849e1f,Namespace:calico-system,Attempt:1,} returns sandbox id \"f180c78efe211eeb00e5a8675073f51b52fc2f4688c1ce4bb527a1a0386a6060\"" Apr 17 23:47:05.535677 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3617636908.mount: Deactivated successfully. 
Apr 17 23:47:05.588185 containerd[1834]: time="2026-04-17T23:47:05.588130108Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 17 23:47:05.591389 containerd[1834]: time="2026-04-17T23:47:05.591211953Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.31.4: active requests=0, bytes read=17609475" Apr 17 23:47:05.594800 containerd[1834]: time="2026-04-17T23:47:05.594729804Z" level=info msg="ImageCreate event name:\"sha256:0749e3da0398e8402eb119f09acf145e5dd9759adb6eb3802ad6dc1b9bbedf1c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 17 23:47:05.642175 containerd[1834]: time="2026-04-17T23:47:05.642086892Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:d252061aa298c4b17cf092517b5126af97cf95e0f56b21281b95a5f8702f15fc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 17 23:47:05.643327 containerd[1834]: time="2026-04-17T23:47:05.642993505Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\" with image id \"sha256:0749e3da0398e8402eb119f09acf145e5dd9759adb6eb3802ad6dc1b9bbedf1c\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:d252061aa298c4b17cf092517b5126af97cf95e0f56b21281b95a5f8702f15fc\", size \"17609305\" in 2.437623234s" Apr 17 23:47:05.643327 containerd[1834]: time="2026-04-17T23:47:05.643041206Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\" returns image reference \"sha256:0749e3da0398e8402eb119f09acf145e5dd9759adb6eb3802ad6dc1b9bbedf1c\"" Apr 17 23:47:05.645318 containerd[1834]: time="2026-04-17T23:47:05.644621529Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\"" Apr 17 23:47:05.732172 containerd[1834]: time="2026-04-17T23:47:05.732121599Z" level=info msg="CreateContainer within sandbox 
\"72ed44d83c31d812706379b6831a38bd2fb0764bab07a0ca3c762182420c90f4\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Apr 17 23:47:06.040923 containerd[1834]: time="2026-04-17T23:47:06.040880983Z" level=info msg="CreateContainer within sandbox \"72ed44d83c31d812706379b6831a38bd2fb0764bab07a0ca3c762182420c90f4\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"d9a0e72bc72cb7c9ee939315268205e496fd32c2f84a1b1bc80184ffdd62936a\"" Apr 17 23:47:06.043049 containerd[1834]: time="2026-04-17T23:47:06.042769510Z" level=info msg="StartContainer for \"d9a0e72bc72cb7c9ee939315268205e496fd32c2f84a1b1bc80184ffdd62936a\"" Apr 17 23:47:06.165862 containerd[1834]: time="2026-04-17T23:47:06.165788896Z" level=info msg="StartContainer for \"d9a0e72bc72cb7c9ee939315268205e496fd32c2f84a1b1bc80184ffdd62936a\" returns successfully" Apr 17 23:47:06.298049 kubelet[3392]: I0417 23:47:06.297877 3392 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-69d4444588-wbw54" podStartSLOduration=1.743473136 podStartE2EDuration="11.297853214s" podCreationTimestamp="2026-04-17 23:46:55 +0000 UTC" firstStartedPulling="2026-04-17 23:46:56.090001147 +0000 UTC m=+69.815967089" lastFinishedPulling="2026-04-17 23:47:05.644381225 +0000 UTC m=+79.370347167" observedRunningTime="2026-04-17 23:47:06.297734312 +0000 UTC m=+80.023700354" watchObservedRunningTime="2026-04-17 23:47:06.297853214 +0000 UTC m=+80.023819256" Apr 17 23:47:06.373829 containerd[1834]: time="2026-04-17T23:47:06.373594414Z" level=info msg="StopPodSandbox for \"f9e066fdf0793900514dfc2314700c117fb535312d818f5c191e0c164756c75b\"" Apr 17 23:47:06.467340 containerd[1834]: 2026-04-17 23:47:06.426 [INFO][6061] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="f9e066fdf0793900514dfc2314700c117fb535312d818f5c191e0c164756c75b" Apr 17 23:47:06.467340 containerd[1834]: 2026-04-17 23:47:06.426 [INFO][6061] cni-plugin/dataplane_linux.go 559: Deleting workload's 
device in netns. ContainerID="f9e066fdf0793900514dfc2314700c117fb535312d818f5c191e0c164756c75b" iface="eth0" netns="/var/run/netns/cni-0114d2cb-8612-ca03-2285-8137b82ee424" Apr 17 23:47:06.467340 containerd[1834]: 2026-04-17 23:47:06.426 [INFO][6061] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="f9e066fdf0793900514dfc2314700c117fb535312d818f5c191e0c164756c75b" iface="eth0" netns="/var/run/netns/cni-0114d2cb-8612-ca03-2285-8137b82ee424" Apr 17 23:47:06.467340 containerd[1834]: 2026-04-17 23:47:06.427 [INFO][6061] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="f9e066fdf0793900514dfc2314700c117fb535312d818f5c191e0c164756c75b" iface="eth0" netns="/var/run/netns/cni-0114d2cb-8612-ca03-2285-8137b82ee424" Apr 17 23:47:06.467340 containerd[1834]: 2026-04-17 23:47:06.427 [INFO][6061] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="f9e066fdf0793900514dfc2314700c117fb535312d818f5c191e0c164756c75b" Apr 17 23:47:06.467340 containerd[1834]: 2026-04-17 23:47:06.427 [INFO][6061] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="f9e066fdf0793900514dfc2314700c117fb535312d818f5c191e0c164756c75b" Apr 17 23:47:06.467340 containerd[1834]: 2026-04-17 23:47:06.453 [INFO][6068] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="f9e066fdf0793900514dfc2314700c117fb535312d818f5c191e0c164756c75b" HandleID="k8s-pod-network.f9e066fdf0793900514dfc2314700c117fb535312d818f5c191e0c164756c75b" Workload="ci--4081.3.6--n--7b570e9a3c-k8s-csi--node--driver--kdzpd-eth0" Apr 17 23:47:06.467340 containerd[1834]: 2026-04-17 23:47:06.454 [INFO][6068] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 17 23:47:06.467340 containerd[1834]: 2026-04-17 23:47:06.454 [INFO][6068] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Apr 17 23:47:06.467340 containerd[1834]: 2026-04-17 23:47:06.463 [WARNING][6068] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. Ignoring ContainerID="f9e066fdf0793900514dfc2314700c117fb535312d818f5c191e0c164756c75b" HandleID="k8s-pod-network.f9e066fdf0793900514dfc2314700c117fb535312d818f5c191e0c164756c75b" Workload="ci--4081.3.6--n--7b570e9a3c-k8s-csi--node--driver--kdzpd-eth0" Apr 17 23:47:06.467340 containerd[1834]: 2026-04-17 23:47:06.463 [INFO][6068] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="f9e066fdf0793900514dfc2314700c117fb535312d818f5c191e0c164756c75b" HandleID="k8s-pod-network.f9e066fdf0793900514dfc2314700c117fb535312d818f5c191e0c164756c75b" Workload="ci--4081.3.6--n--7b570e9a3c-k8s-csi--node--driver--kdzpd-eth0" Apr 17 23:47:06.467340 containerd[1834]: 2026-04-17 23:47:06.464 [INFO][6068] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 17 23:47:06.467340 containerd[1834]: 2026-04-17 23:47:06.465 [INFO][6061] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="f9e066fdf0793900514dfc2314700c117fb535312d818f5c191e0c164756c75b" Apr 17 23:47:06.468217 containerd[1834]: time="2026-04-17T23:47:06.467542078Z" level=info msg="TearDown network for sandbox \"f9e066fdf0793900514dfc2314700c117fb535312d818f5c191e0c164756c75b\" successfully" Apr 17 23:47:06.468217 containerd[1834]: time="2026-04-17T23:47:06.467578078Z" level=info msg="StopPodSandbox for \"f9e066fdf0793900514dfc2314700c117fb535312d818f5c191e0c164756c75b\" returns successfully" Apr 17 23:47:06.468614 containerd[1834]: time="2026-04-17T23:47:06.468588193Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-kdzpd,Uid:4658a59c-dd8b-4cd9-bac7-09b6c58f7e83,Namespace:calico-system,Attempt:1,}" Apr 17 23:47:06.478078 systemd[1]: run-netns-cni\x2d0114d2cb\x2d8612\x2dca03\x2d2285\x2d8137b82ee424.mount: Deactivated successfully. 
Apr 17 23:47:06.771752 systemd-networkd[1395]: calid92edb8d9a1: Gained IPv6LL Apr 17 23:47:07.182120 systemd-networkd[1395]: calid0a0ee4ce27: Link UP Apr 17 23:47:07.183579 systemd-networkd[1395]: calid0a0ee4ce27: Gained carrier Apr 17 23:47:07.200911 containerd[1834]: 2026-04-17 23:47:07.114 [INFO][6079] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081.3.6--n--7b570e9a3c-k8s-csi--node--driver--kdzpd-eth0 csi-node-driver- calico-system 4658a59c-dd8b-4cd9-bac7-09b6c58f7e83 1153 0 2026-04-17 23:46:03 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:6d9d697c7c k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4081.3.6-n-7b570e9a3c csi-node-driver-kdzpd eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] calid0a0ee4ce27 [] [] }} ContainerID="2b068cd3dadaabf022a1676bb53392811e7fa09f28ba118d070f25caf3d6041f" Namespace="calico-system" Pod="csi-node-driver-kdzpd" WorkloadEndpoint="ci--4081.3.6--n--7b570e9a3c-k8s-csi--node--driver--kdzpd-" Apr 17 23:47:07.200911 containerd[1834]: 2026-04-17 23:47:07.114 [INFO][6079] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="2b068cd3dadaabf022a1676bb53392811e7fa09f28ba118d070f25caf3d6041f" Namespace="calico-system" Pod="csi-node-driver-kdzpd" WorkloadEndpoint="ci--4081.3.6--n--7b570e9a3c-k8s-csi--node--driver--kdzpd-eth0" Apr 17 23:47:07.200911 containerd[1834]: 2026-04-17 23:47:07.139 [INFO][6091] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="2b068cd3dadaabf022a1676bb53392811e7fa09f28ba118d070f25caf3d6041f" HandleID="k8s-pod-network.2b068cd3dadaabf022a1676bb53392811e7fa09f28ba118d070f25caf3d6041f" Workload="ci--4081.3.6--n--7b570e9a3c-k8s-csi--node--driver--kdzpd-eth0" Apr 17 
23:47:07.200911 containerd[1834]: 2026-04-17 23:47:07.147 [INFO][6091] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="2b068cd3dadaabf022a1676bb53392811e7fa09f28ba118d070f25caf3d6041f" HandleID="k8s-pod-network.2b068cd3dadaabf022a1676bb53392811e7fa09f28ba118d070f25caf3d6041f" Workload="ci--4081.3.6--n--7b570e9a3c-k8s-csi--node--driver--kdzpd-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000277ac0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4081.3.6-n-7b570e9a3c", "pod":"csi-node-driver-kdzpd", "timestamp":"2026-04-17 23:47:07.139750439 +0000 UTC"}, Hostname:"ci-4081.3.6-n-7b570e9a3c", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc0001122c0)} Apr 17 23:47:07.200911 containerd[1834]: 2026-04-17 23:47:07.147 [INFO][6091] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 17 23:47:07.200911 containerd[1834]: 2026-04-17 23:47:07.147 [INFO][6091] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Apr 17 23:47:07.200911 containerd[1834]: 2026-04-17 23:47:07.147 [INFO][6091] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081.3.6-n-7b570e9a3c' Apr 17 23:47:07.200911 containerd[1834]: 2026-04-17 23:47:07.149 [INFO][6091] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.2b068cd3dadaabf022a1676bb53392811e7fa09f28ba118d070f25caf3d6041f" host="ci-4081.3.6-n-7b570e9a3c" Apr 17 23:47:07.200911 containerd[1834]: 2026-04-17 23:47:07.153 [INFO][6091] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4081.3.6-n-7b570e9a3c" Apr 17 23:47:07.200911 containerd[1834]: 2026-04-17 23:47:07.156 [INFO][6091] ipam/ipam.go 526: Trying affinity for 192.168.71.0/26 host="ci-4081.3.6-n-7b570e9a3c" Apr 17 23:47:07.200911 containerd[1834]: 2026-04-17 23:47:07.158 [INFO][6091] ipam/ipam.go 160: Attempting to load block cidr=192.168.71.0/26 host="ci-4081.3.6-n-7b570e9a3c" Apr 17 23:47:07.200911 containerd[1834]: 2026-04-17 23:47:07.161 [INFO][6091] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.71.0/26 host="ci-4081.3.6-n-7b570e9a3c" Apr 17 23:47:07.200911 containerd[1834]: 2026-04-17 23:47:07.161 [INFO][6091] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.71.0/26 handle="k8s-pod-network.2b068cd3dadaabf022a1676bb53392811e7fa09f28ba118d070f25caf3d6041f" host="ci-4081.3.6-n-7b570e9a3c" Apr 17 23:47:07.200911 containerd[1834]: 2026-04-17 23:47:07.162 [INFO][6091] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.2b068cd3dadaabf022a1676bb53392811e7fa09f28ba118d070f25caf3d6041f Apr 17 23:47:07.200911 containerd[1834]: 2026-04-17 23:47:07.166 [INFO][6091] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.71.0/26 handle="k8s-pod-network.2b068cd3dadaabf022a1676bb53392811e7fa09f28ba118d070f25caf3d6041f" host="ci-4081.3.6-n-7b570e9a3c" Apr 17 23:47:07.200911 containerd[1834]: 2026-04-17 23:47:07.176 [INFO][6091] ipam/ipam.go 1288: Successfully 
claimed IPs: [192.168.71.8/26] block=192.168.71.0/26 handle="k8s-pod-network.2b068cd3dadaabf022a1676bb53392811e7fa09f28ba118d070f25caf3d6041f" host="ci-4081.3.6-n-7b570e9a3c" Apr 17 23:47:07.200911 containerd[1834]: 2026-04-17 23:47:07.176 [INFO][6091] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.71.8/26] handle="k8s-pod-network.2b068cd3dadaabf022a1676bb53392811e7fa09f28ba118d070f25caf3d6041f" host="ci-4081.3.6-n-7b570e9a3c" Apr 17 23:47:07.200911 containerd[1834]: 2026-04-17 23:47:07.176 [INFO][6091] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 17 23:47:07.200911 containerd[1834]: 2026-04-17 23:47:07.176 [INFO][6091] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.71.8/26] IPv6=[] ContainerID="2b068cd3dadaabf022a1676bb53392811e7fa09f28ba118d070f25caf3d6041f" HandleID="k8s-pod-network.2b068cd3dadaabf022a1676bb53392811e7fa09f28ba118d070f25caf3d6041f" Workload="ci--4081.3.6--n--7b570e9a3c-k8s-csi--node--driver--kdzpd-eth0" Apr 17 23:47:07.202211 containerd[1834]: 2026-04-17 23:47:07.178 [INFO][6079] cni-plugin/k8s.go 418: Populated endpoint ContainerID="2b068cd3dadaabf022a1676bb53392811e7fa09f28ba118d070f25caf3d6041f" Namespace="calico-system" Pod="csi-node-driver-kdzpd" WorkloadEndpoint="ci--4081.3.6--n--7b570e9a3c-k8s-csi--node--driver--kdzpd-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081.3.6--n--7b570e9a3c-k8s-csi--node--driver--kdzpd-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"4658a59c-dd8b-4cd9-bac7-09b6c58f7e83", ResourceVersion:"1153", Generation:0, CreationTimestamp:time.Date(2026, time.April, 17, 23, 46, 3, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6d9d697c7c", "k8s-app":"csi-node-driver", "name":"csi-node-driver", 
"pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081.3.6-n-7b570e9a3c", ContainerID:"", Pod:"csi-node-driver-kdzpd", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.71.8/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calid0a0ee4ce27", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 17 23:47:07.202211 containerd[1834]: 2026-04-17 23:47:07.178 [INFO][6079] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.71.8/32] ContainerID="2b068cd3dadaabf022a1676bb53392811e7fa09f28ba118d070f25caf3d6041f" Namespace="calico-system" Pod="csi-node-driver-kdzpd" WorkloadEndpoint="ci--4081.3.6--n--7b570e9a3c-k8s-csi--node--driver--kdzpd-eth0" Apr 17 23:47:07.202211 containerd[1834]: 2026-04-17 23:47:07.178 [INFO][6079] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calid0a0ee4ce27 ContainerID="2b068cd3dadaabf022a1676bb53392811e7fa09f28ba118d070f25caf3d6041f" Namespace="calico-system" Pod="csi-node-driver-kdzpd" WorkloadEndpoint="ci--4081.3.6--n--7b570e9a3c-k8s-csi--node--driver--kdzpd-eth0" Apr 17 23:47:07.202211 containerd[1834]: 2026-04-17 23:47:07.182 [INFO][6079] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="2b068cd3dadaabf022a1676bb53392811e7fa09f28ba118d070f25caf3d6041f" Namespace="calico-system" Pod="csi-node-driver-kdzpd" WorkloadEndpoint="ci--4081.3.6--n--7b570e9a3c-k8s-csi--node--driver--kdzpd-eth0" Apr 17 23:47:07.202211 containerd[1834]: 2026-04-17 23:47:07.183 
[INFO][6079] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="2b068cd3dadaabf022a1676bb53392811e7fa09f28ba118d070f25caf3d6041f" Namespace="calico-system" Pod="csi-node-driver-kdzpd" WorkloadEndpoint="ci--4081.3.6--n--7b570e9a3c-k8s-csi--node--driver--kdzpd-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081.3.6--n--7b570e9a3c-k8s-csi--node--driver--kdzpd-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"4658a59c-dd8b-4cd9-bac7-09b6c58f7e83", ResourceVersion:"1153", Generation:0, CreationTimestamp:time.Date(2026, time.April, 17, 23, 46, 3, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6d9d697c7c", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081.3.6-n-7b570e9a3c", ContainerID:"2b068cd3dadaabf022a1676bb53392811e7fa09f28ba118d070f25caf3d6041f", Pod:"csi-node-driver-kdzpd", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.71.8/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calid0a0ee4ce27", MAC:"c2:7c:6b:dc:44:0c", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 17 23:47:07.202211 containerd[1834]: 2026-04-17 23:47:07.197 [INFO][6079] 
cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="2b068cd3dadaabf022a1676bb53392811e7fa09f28ba118d070f25caf3d6041f" Namespace="calico-system" Pod="csi-node-driver-kdzpd" WorkloadEndpoint="ci--4081.3.6--n--7b570e9a3c-k8s-csi--node--driver--kdzpd-eth0" Apr 17 23:47:07.230867 containerd[1834]: time="2026-04-17T23:47:07.230516057Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Apr 17 23:47:07.230867 containerd[1834]: time="2026-04-17T23:47:07.230577558Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Apr 17 23:47:07.230867 containerd[1834]: time="2026-04-17T23:47:07.230613758Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 17 23:47:07.230867 containerd[1834]: time="2026-04-17T23:47:07.230712660Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 17 23:47:07.308329 containerd[1834]: time="2026-04-17T23:47:07.308272786Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-kdzpd,Uid:4658a59c-dd8b-4cd9-bac7-09b6c58f7e83,Namespace:calico-system,Attempt:1,} returns sandbox id \"2b068cd3dadaabf022a1676bb53392811e7fa09f28ba118d070f25caf3d6041f\"" Apr 17 23:47:08.947828 systemd-networkd[1395]: calid0a0ee4ce27: Gained IPv6LL Apr 17 23:47:11.411249 containerd[1834]: time="2026-04-17T23:47:11.411198137Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 17 23:47:11.414156 containerd[1834]: time="2026-04-17T23:47:11.414091467Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.31.4: active requests=0, bytes read=52406348" Apr 17 23:47:11.417731 containerd[1834]: time="2026-04-17T23:47:11.417672705Z" level=info msg="ImageCreate event name:\"sha256:ff033cc89dab51090bfa1b04e155a5ce1e3b1f324f74b7b2be0dd6f0b6b10e89\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 17 23:47:11.422476 containerd[1834]: time="2026-04-17T23:47:11.422417755Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:99b8bb50141ca55b4b6ddfcf2f2fbde838265508ab2ac96ed08e72cd39800713\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 17 23:47:11.423728 containerd[1834]: time="2026-04-17T23:47:11.423105162Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\" with image id \"sha256:ff033cc89dab51090bfa1b04e155a5ce1e3b1f324f74b7b2be0dd6f0b6b10e89\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:99b8bb50141ca55b4b6ddfcf2f2fbde838265508ab2ac96ed08e72cd39800713\", size \"53962361\" in 5.778446033s" Apr 17 23:47:11.423728 containerd[1834]: time="2026-04-17T23:47:11.423145762Z" 
level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\" returns image reference \"sha256:ff033cc89dab51090bfa1b04e155a5ce1e3b1f324f74b7b2be0dd6f0b6b10e89\"" Apr 17 23:47:11.424409 containerd[1834]: time="2026-04-17T23:47:11.424382475Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.31.4\"" Apr 17 23:47:11.456336 containerd[1834]: time="2026-04-17T23:47:11.456293909Z" level=info msg="CreateContainer within sandbox \"f180c78efe211eeb00e5a8675073f51b52fc2f4688c1ce4bb527a1a0386a6060\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Apr 17 23:47:11.497709 containerd[1834]: time="2026-04-17T23:47:11.497667543Z" level=info msg="CreateContainer within sandbox \"f180c78efe211eeb00e5a8675073f51b52fc2f4688c1ce4bb527a1a0386a6060\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"706140ee5d798cb285cc2d642fa46c011a0c35cbf56d53220b32eace02cc357b\"" Apr 17 23:47:11.499461 containerd[1834]: time="2026-04-17T23:47:11.498252349Z" level=info msg="StartContainer for \"706140ee5d798cb285cc2d642fa46c011a0c35cbf56d53220b32eace02cc357b\"" Apr 17 23:47:11.573248 containerd[1834]: time="2026-04-17T23:47:11.573192234Z" level=info msg="StartContainer for \"706140ee5d798cb285cc2d642fa46c011a0c35cbf56d53220b32eace02cc357b\" returns successfully" Apr 17 23:47:12.326090 kubelet[3392]: I0417 23:47:12.326017 3392 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-7479bd7d89-zgl8b" podStartSLOduration=62.87812097 podStartE2EDuration="1m9.325982219s" podCreationTimestamp="2026-04-17 23:46:03 +0000 UTC" firstStartedPulling="2026-04-17 23:47:04.976390625 +0000 UTC m=+78.702356567" lastFinishedPulling="2026-04-17 23:47:11.424251874 +0000 UTC m=+85.150217816" observedRunningTime="2026-04-17 23:47:12.322163579 +0000 UTC m=+86.048129621" watchObservedRunningTime="2026-04-17 23:47:12.325982219 +0000 UTC m=+86.051948161" Apr 17 23:47:12.999457 
containerd[1834]: time="2026-04-17T23:47:12.999398073Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 17 23:47:13.002157 containerd[1834]: time="2026-04-17T23:47:13.001990600Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.31.4: active requests=0, bytes read=8792502" Apr 17 23:47:13.004983 containerd[1834]: time="2026-04-17T23:47:13.004953131Z" level=info msg="ImageCreate event name:\"sha256:4c8cd7d0b10a4df64a5bd90e9845e9d1edbe0e37c2ebfc171bb28698e07abf72\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 17 23:47:13.009320 containerd[1834]: time="2026-04-17T23:47:13.009259676Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:ab57dd6f8423ef7b3ff382bf4ca5ace6063bdca77d441d852c75ec58847dd280\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 17 23:47:13.010472 containerd[1834]: time="2026-04-17T23:47:13.009917583Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.31.4\" with image id \"sha256:4c8cd7d0b10a4df64a5bd90e9845e9d1edbe0e37c2ebfc171bb28698e07abf72\", repo tag \"ghcr.io/flatcar/calico/csi:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:ab57dd6f8423ef7b3ff382bf4ca5ace6063bdca77d441d852c75ec58847dd280\", size \"10348547\" in 1.585496008s" Apr 17 23:47:13.010472 containerd[1834]: time="2026-04-17T23:47:13.009954683Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.31.4\" returns image reference \"sha256:4c8cd7d0b10a4df64a5bd90e9845e9d1edbe0e37c2ebfc171bb28698e07abf72\"" Apr 17 23:47:13.016856 containerd[1834]: time="2026-04-17T23:47:13.016828055Z" level=info msg="CreateContainer within sandbox \"2b068cd3dadaabf022a1676bb53392811e7fa09f28ba118d070f25caf3d6041f\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Apr 17 23:47:13.052284 containerd[1834]: time="2026-04-17T23:47:13.052239926Z" level=info msg="CreateContainer within sandbox 
\"2b068cd3dadaabf022a1676bb53392811e7fa09f28ba118d070f25caf3d6041f\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"0688928574d2714e30376193c427888170f61be36c02cb7ef7944eeb550d81a5\"" Apr 17 23:47:13.054357 containerd[1834]: time="2026-04-17T23:47:13.052966834Z" level=info msg="StartContainer for \"0688928574d2714e30376193c427888170f61be36c02cb7ef7944eeb550d81a5\"" Apr 17 23:47:13.134034 containerd[1834]: time="2026-04-17T23:47:13.134000182Z" level=info msg="StartContainer for \"0688928574d2714e30376193c427888170f61be36c02cb7ef7944eeb550d81a5\" returns successfully" Apr 17 23:47:13.135729 containerd[1834]: time="2026-04-17T23:47:13.135697100Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\"" Apr 17 23:47:14.913871 containerd[1834]: time="2026-04-17T23:47:14.913822825Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 17 23:47:14.918702 containerd[1834]: time="2026-04-17T23:47:14.918495074Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4: active requests=0, bytes read=14704317" Apr 17 23:47:14.923704 containerd[1834]: time="2026-04-17T23:47:14.923648228Z" level=info msg="ImageCreate event name:\"sha256:d7aeb99114cbb6499e9048f43d3faa5f199d1a05ed44165e5974d0368ac32771\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 17 23:47:14.929094 containerd[1834]: time="2026-04-17T23:47:14.929040884Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:e41c0d73bcd33ff28ae2f2983cf781a4509d212e102d53883dbbf436ab3cd97d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 17 23:47:14.929912 containerd[1834]: time="2026-04-17T23:47:14.929767492Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\" with image id 
\"sha256:d7aeb99114cbb6499e9048f43d3faa5f199d1a05ed44165e5974d0368ac32771\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:e41c0d73bcd33ff28ae2f2983cf781a4509d212e102d53883dbbf436ab3cd97d\", size \"16260314\" in 1.794028092s" Apr 17 23:47:14.929912 containerd[1834]: time="2026-04-17T23:47:14.929807492Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\" returns image reference \"sha256:d7aeb99114cbb6499e9048f43d3faa5f199d1a05ed44165e5974d0368ac32771\"" Apr 17 23:47:14.938014 containerd[1834]: time="2026-04-17T23:47:14.937964878Z" level=info msg="CreateContainer within sandbox \"2b068cd3dadaabf022a1676bb53392811e7fa09f28ba118d070f25caf3d6041f\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Apr 17 23:47:14.992185 containerd[1834]: time="2026-04-17T23:47:14.992134045Z" level=info msg="CreateContainer within sandbox \"2b068cd3dadaabf022a1676bb53392811e7fa09f28ba118d070f25caf3d6041f\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"ca3fb64c72069469254ef5580cedb7730c43ed11e0ebe02c4b9f59c7f1db38a7\"" Apr 17 23:47:14.992866 containerd[1834]: time="2026-04-17T23:47:14.992762252Z" level=info msg="StartContainer for \"ca3fb64c72069469254ef5580cedb7730c43ed11e0ebe02c4b9f59c7f1db38a7\"" Apr 17 23:47:15.059978 containerd[1834]: time="2026-04-17T23:47:15.059795654Z" level=info msg="StartContainer for \"ca3fb64c72069469254ef5580cedb7730c43ed11e0ebe02c4b9f59c7f1db38a7\" returns successfully" Apr 17 23:47:15.330855 kubelet[3392]: I0417 23:47:15.330677 3392 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-kdzpd" podStartSLOduration=64.709295492 podStartE2EDuration="1m12.33055409s" podCreationTimestamp="2026-04-17 23:46:03 +0000 UTC" firstStartedPulling="2026-04-17 23:47:07.309567005 +0000 UTC m=+81.035532947" lastFinishedPulling="2026-04-17 
23:47:14.930825603 +0000 UTC m=+88.656791545" observedRunningTime="2026-04-17 23:47:15.328726971 +0000 UTC m=+89.054693013" watchObservedRunningTime="2026-04-17 23:47:15.33055409 +0000 UTC m=+89.056520132" Apr 17 23:47:16.039552 kubelet[3392]: I0417 23:47:16.039514 3392 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Apr 17 23:47:16.039552 kubelet[3392]: I0417 23:47:16.039563 3392 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Apr 17 23:47:34.618327 systemd[1]: Started sshd@7-10.0.0.10:22-20.229.252.112:37110.service - OpenSSH per-connection server daemon (20.229.252.112:37110). Apr 17 23:47:34.750873 sshd[6408]: Accepted publickey for core from 20.229.252.112 port 37110 ssh2: RSA SHA256:sMtqA11TjzQIJRJw4PihEx7btYqNmsZRNCIArhmId48 Apr 17 23:47:34.759374 sshd[6408]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 17 23:47:34.767723 systemd-logind[1800]: New session 10 of user core. Apr 17 23:47:34.772821 systemd[1]: Started session-10.scope - Session 10 of User core. Apr 17 23:47:35.067501 sshd[6408]: pam_unix(sshd:session): session closed for user core Apr 17 23:47:35.071912 systemd[1]: sshd@7-10.0.0.10:22-20.229.252.112:37110.service: Deactivated successfully. Apr 17 23:47:35.079655 systemd[1]: session-10.scope: Deactivated successfully. Apr 17 23:47:35.081313 systemd-logind[1800]: Session 10 logged out. Waiting for processes to exit. Apr 17 23:47:35.086974 systemd-logind[1800]: Removed session 10. Apr 17 23:47:40.092798 systemd[1]: Started sshd@8-10.0.0.10:22-20.229.252.112:55288.service - OpenSSH per-connection server daemon (20.229.252.112:55288). 
Apr 17 23:47:40.206839 sshd[6435]: Accepted publickey for core from 20.229.252.112 port 55288 ssh2: RSA SHA256:sMtqA11TjzQIJRJw4PihEx7btYqNmsZRNCIArhmId48 Apr 17 23:47:40.209068 sshd[6435]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 17 23:47:40.214132 systemd-logind[1800]: New session 11 of user core. Apr 17 23:47:40.219918 systemd[1]: Started session-11.scope - Session 11 of User core. Apr 17 23:47:40.375346 sshd[6435]: pam_unix(sshd:session): session closed for user core Apr 17 23:47:40.380033 systemd[1]: sshd@8-10.0.0.10:22-20.229.252.112:55288.service: Deactivated successfully. Apr 17 23:47:40.385488 systemd[1]: session-11.scope: Deactivated successfully. Apr 17 23:47:40.386461 systemd-logind[1800]: Session 11 logged out. Waiting for processes to exit. Apr 17 23:47:40.387423 systemd-logind[1800]: Removed session 11. Apr 17 23:47:42.331841 systemd[1]: run-containerd-runc-k8s.io-706140ee5d798cb285cc2d642fa46c011a0c35cbf56d53220b32eace02cc357b-runc.QZSPIN.mount: Deactivated successfully. Apr 17 23:47:45.400829 systemd[1]: Started sshd@9-10.0.0.10:22-20.229.252.112:57376.service - OpenSSH per-connection server daemon (20.229.252.112:57376). Apr 17 23:47:45.515116 sshd[6468]: Accepted publickey for core from 20.229.252.112 port 57376 ssh2: RSA SHA256:sMtqA11TjzQIJRJw4PihEx7btYqNmsZRNCIArhmId48 Apr 17 23:47:45.516738 sshd[6468]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 17 23:47:45.521944 systemd-logind[1800]: New session 12 of user core. Apr 17 23:47:45.525101 systemd[1]: Started session-12.scope - Session 12 of User core. Apr 17 23:47:45.685968 sshd[6468]: pam_unix(sshd:session): session closed for user core Apr 17 23:47:45.689549 systemd[1]: sshd@9-10.0.0.10:22-20.229.252.112:57376.service: Deactivated successfully. Apr 17 23:47:45.695666 systemd[1]: session-12.scope: Deactivated successfully. Apr 17 23:47:45.696814 systemd-logind[1800]: Session 12 logged out. 
Waiting for processes to exit. Apr 17 23:47:45.697870 systemd-logind[1800]: Removed session 12. Apr 17 23:47:50.709971 systemd[1]: Started sshd@10-10.0.0.10:22-20.229.252.112:57380.service - OpenSSH per-connection server daemon (20.229.252.112:57380). Apr 17 23:47:50.823521 sshd[6491]: Accepted publickey for core from 20.229.252.112 port 57380 ssh2: RSA SHA256:sMtqA11TjzQIJRJw4PihEx7btYqNmsZRNCIArhmId48 Apr 17 23:47:50.824925 sshd[6491]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 17 23:47:50.830001 systemd-logind[1800]: New session 13 of user core. Apr 17 23:47:50.833978 systemd[1]: Started session-13.scope - Session 13 of User core. Apr 17 23:47:50.994298 sshd[6491]: pam_unix(sshd:session): session closed for user core Apr 17 23:47:50.998526 containerd[1834]: time="2026-04-17T23:47:50.998299264Z" level=info msg="StopPodSandbox for \"61f8c77427241eebf6cf9dcfcca2b8b9819197e31f16bf933a0aa3dced9b2ad6\"" Apr 17 23:47:50.999637 systemd[1]: sshd@10-10.0.0.10:22-20.229.252.112:57380.service: Deactivated successfully. Apr 17 23:47:51.008120 systemd-logind[1800]: Session 13 logged out. Waiting for processes to exit. Apr 17 23:47:51.008972 systemd[1]: session-13.scope: Deactivated successfully. Apr 17 23:47:51.011025 systemd-logind[1800]: Removed session 13. Apr 17 23:47:51.073780 containerd[1834]: 2026-04-17 23:47:51.040 [WARNING][6516] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="61f8c77427241eebf6cf9dcfcca2b8b9819197e31f16bf933a0aa3dced9b2ad6" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081.3.6--n--7b570e9a3c-k8s-calico--apiserver--568984f78--qbl55-eth0", GenerateName:"calico-apiserver-568984f78-", Namespace:"calico-system", SelfLink:"", UID:"499d7392-3858-4796-8ae5-2602f95a1e6c", ResourceVersion:"1101", Generation:0, CreationTimestamp:time.Date(2026, time.April, 17, 23, 46, 2, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"568984f78", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081.3.6-n-7b570e9a3c", ContainerID:"afdcccff0658b909b964d5e78e62e613398a84a1c78e31e6f222064f25c410e3", Pod:"calico-apiserver-568984f78-qbl55", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.71.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"caliadaf23cd2dd", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 17 23:47:51.073780 containerd[1834]: 2026-04-17 23:47:51.041 [INFO][6516] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="61f8c77427241eebf6cf9dcfcca2b8b9819197e31f16bf933a0aa3dced9b2ad6" Apr 17 23:47:51.073780 containerd[1834]: 2026-04-17 23:47:51.041 [INFO][6516] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with 
no netns name, ignoring. ContainerID="61f8c77427241eebf6cf9dcfcca2b8b9819197e31f16bf933a0aa3dced9b2ad6" iface="eth0" netns="" Apr 17 23:47:51.073780 containerd[1834]: 2026-04-17 23:47:51.041 [INFO][6516] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="61f8c77427241eebf6cf9dcfcca2b8b9819197e31f16bf933a0aa3dced9b2ad6" Apr 17 23:47:51.073780 containerd[1834]: 2026-04-17 23:47:51.041 [INFO][6516] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="61f8c77427241eebf6cf9dcfcca2b8b9819197e31f16bf933a0aa3dced9b2ad6" Apr 17 23:47:51.073780 containerd[1834]: 2026-04-17 23:47:51.063 [INFO][6523] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="61f8c77427241eebf6cf9dcfcca2b8b9819197e31f16bf933a0aa3dced9b2ad6" HandleID="k8s-pod-network.61f8c77427241eebf6cf9dcfcca2b8b9819197e31f16bf933a0aa3dced9b2ad6" Workload="ci--4081.3.6--n--7b570e9a3c-k8s-calico--apiserver--568984f78--qbl55-eth0" Apr 17 23:47:51.073780 containerd[1834]: 2026-04-17 23:47:51.064 [INFO][6523] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 17 23:47:51.073780 containerd[1834]: 2026-04-17 23:47:51.064 [INFO][6523] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 17 23:47:51.073780 containerd[1834]: 2026-04-17 23:47:51.070 [WARNING][6523] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="61f8c77427241eebf6cf9dcfcca2b8b9819197e31f16bf933a0aa3dced9b2ad6" HandleID="k8s-pod-network.61f8c77427241eebf6cf9dcfcca2b8b9819197e31f16bf933a0aa3dced9b2ad6" Workload="ci--4081.3.6--n--7b570e9a3c-k8s-calico--apiserver--568984f78--qbl55-eth0" Apr 17 23:47:51.073780 containerd[1834]: 2026-04-17 23:47:51.070 [INFO][6523] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="61f8c77427241eebf6cf9dcfcca2b8b9819197e31f16bf933a0aa3dced9b2ad6" HandleID="k8s-pod-network.61f8c77427241eebf6cf9dcfcca2b8b9819197e31f16bf933a0aa3dced9b2ad6" Workload="ci--4081.3.6--n--7b570e9a3c-k8s-calico--apiserver--568984f78--qbl55-eth0" Apr 17 23:47:51.073780 containerd[1834]: 2026-04-17 23:47:51.071 [INFO][6523] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 17 23:47:51.073780 containerd[1834]: 2026-04-17 23:47:51.072 [INFO][6516] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="61f8c77427241eebf6cf9dcfcca2b8b9819197e31f16bf933a0aa3dced9b2ad6" Apr 17 23:47:51.074385 containerd[1834]: time="2026-04-17T23:47:51.074350926Z" level=info msg="TearDown network for sandbox \"61f8c77427241eebf6cf9dcfcca2b8b9819197e31f16bf933a0aa3dced9b2ad6\" successfully" Apr 17 23:47:51.074449 containerd[1834]: time="2026-04-17T23:47:51.074423028Z" level=info msg="StopPodSandbox for \"61f8c77427241eebf6cf9dcfcca2b8b9819197e31f16bf933a0aa3dced9b2ad6\" returns successfully" Apr 17 23:47:51.075281 containerd[1834]: time="2026-04-17T23:47:51.075237755Z" level=info msg="RemovePodSandbox for \"61f8c77427241eebf6cf9dcfcca2b8b9819197e31f16bf933a0aa3dced9b2ad6\"" Apr 17 23:47:51.075395 containerd[1834]: time="2026-04-17T23:47:51.075330358Z" level=info msg="Forcibly stopping sandbox \"61f8c77427241eebf6cf9dcfcca2b8b9819197e31f16bf933a0aa3dced9b2ad6\"" Apr 17 23:47:51.143068 containerd[1834]: 2026-04-17 23:47:51.108 [WARNING][6539] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="61f8c77427241eebf6cf9dcfcca2b8b9819197e31f16bf933a0aa3dced9b2ad6" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081.3.6--n--7b570e9a3c-k8s-calico--apiserver--568984f78--qbl55-eth0", GenerateName:"calico-apiserver-568984f78-", Namespace:"calico-system", SelfLink:"", UID:"499d7392-3858-4796-8ae5-2602f95a1e6c", ResourceVersion:"1101", Generation:0, CreationTimestamp:time.Date(2026, time.April, 17, 23, 46, 2, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"568984f78", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081.3.6-n-7b570e9a3c", ContainerID:"afdcccff0658b909b964d5e78e62e613398a84a1c78e31e6f222064f25c410e3", Pod:"calico-apiserver-568984f78-qbl55", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.71.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"caliadaf23cd2dd", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 17 23:47:51.143068 containerd[1834]: 2026-04-17 23:47:51.108 [INFO][6539] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="61f8c77427241eebf6cf9dcfcca2b8b9819197e31f16bf933a0aa3dced9b2ad6" Apr 17 23:47:51.143068 containerd[1834]: 2026-04-17 23:47:51.108 [INFO][6539] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with 
no netns name, ignoring. ContainerID="61f8c77427241eebf6cf9dcfcca2b8b9819197e31f16bf933a0aa3dced9b2ad6" iface="eth0" netns="" Apr 17 23:47:51.143068 containerd[1834]: 2026-04-17 23:47:51.108 [INFO][6539] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="61f8c77427241eebf6cf9dcfcca2b8b9819197e31f16bf933a0aa3dced9b2ad6" Apr 17 23:47:51.143068 containerd[1834]: 2026-04-17 23:47:51.108 [INFO][6539] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="61f8c77427241eebf6cf9dcfcca2b8b9819197e31f16bf933a0aa3dced9b2ad6" Apr 17 23:47:51.143068 containerd[1834]: 2026-04-17 23:47:51.130 [INFO][6546] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="61f8c77427241eebf6cf9dcfcca2b8b9819197e31f16bf933a0aa3dced9b2ad6" HandleID="k8s-pod-network.61f8c77427241eebf6cf9dcfcca2b8b9819197e31f16bf933a0aa3dced9b2ad6" Workload="ci--4081.3.6--n--7b570e9a3c-k8s-calico--apiserver--568984f78--qbl55-eth0" Apr 17 23:47:51.143068 containerd[1834]: 2026-04-17 23:47:51.130 [INFO][6546] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 17 23:47:51.143068 containerd[1834]: 2026-04-17 23:47:51.130 [INFO][6546] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 17 23:47:51.143068 containerd[1834]: 2026-04-17 23:47:51.136 [WARNING][6546] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="61f8c77427241eebf6cf9dcfcca2b8b9819197e31f16bf933a0aa3dced9b2ad6" HandleID="k8s-pod-network.61f8c77427241eebf6cf9dcfcca2b8b9819197e31f16bf933a0aa3dced9b2ad6" Workload="ci--4081.3.6--n--7b570e9a3c-k8s-calico--apiserver--568984f78--qbl55-eth0" Apr 17 23:47:51.143068 containerd[1834]: 2026-04-17 23:47:51.136 [INFO][6546] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="61f8c77427241eebf6cf9dcfcca2b8b9819197e31f16bf933a0aa3dced9b2ad6" HandleID="k8s-pod-network.61f8c77427241eebf6cf9dcfcca2b8b9819197e31f16bf933a0aa3dced9b2ad6" Workload="ci--4081.3.6--n--7b570e9a3c-k8s-calico--apiserver--568984f78--qbl55-eth0" Apr 17 23:47:51.143068 containerd[1834]: 2026-04-17 23:47:51.139 [INFO][6546] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 17 23:47:51.143068 containerd[1834]: 2026-04-17 23:47:51.141 [INFO][6539] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="61f8c77427241eebf6cf9dcfcca2b8b9819197e31f16bf933a0aa3dced9b2ad6" Apr 17 23:47:51.143912 containerd[1834]: time="2026-04-17T23:47:51.143118952Z" level=info msg="TearDown network for sandbox \"61f8c77427241eebf6cf9dcfcca2b8b9819197e31f16bf933a0aa3dced9b2ad6\" successfully" Apr 17 23:47:51.170583 containerd[1834]: time="2026-04-17T23:47:51.170523140Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"61f8c77427241eebf6cf9dcfcca2b8b9819197e31f16bf933a0aa3dced9b2ad6\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Apr 17 23:47:51.170757 containerd[1834]: time="2026-04-17T23:47:51.170635043Z" level=info msg="RemovePodSandbox \"61f8c77427241eebf6cf9dcfcca2b8b9819197e31f16bf933a0aa3dced9b2ad6\" returns successfully" Apr 17 23:47:51.171316 containerd[1834]: time="2026-04-17T23:47:51.171273764Z" level=info msg="StopPodSandbox for \"f9e066fdf0793900514dfc2314700c117fb535312d818f5c191e0c164756c75b\"" Apr 17 23:47:51.237111 containerd[1834]: 2026-04-17 23:47:51.205 [WARNING][6560] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="f9e066fdf0793900514dfc2314700c117fb535312d818f5c191e0c164756c75b" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081.3.6--n--7b570e9a3c-k8s-csi--node--driver--kdzpd-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"4658a59c-dd8b-4cd9-bac7-09b6c58f7e83", ResourceVersion:"1192", Generation:0, CreationTimestamp:time.Date(2026, time.April, 17, 23, 46, 3, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6d9d697c7c", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081.3.6-n-7b570e9a3c", ContainerID:"2b068cd3dadaabf022a1676bb53392811e7fa09f28ba118d070f25caf3d6041f", Pod:"csi-node-driver-kdzpd", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.71.8/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", 
Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calid0a0ee4ce27", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 17 23:47:51.237111 containerd[1834]: 2026-04-17 23:47:51.205 [INFO][6560] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="f9e066fdf0793900514dfc2314700c117fb535312d818f5c191e0c164756c75b" Apr 17 23:47:51.237111 containerd[1834]: 2026-04-17 23:47:51.205 [INFO][6560] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="f9e066fdf0793900514dfc2314700c117fb535312d818f5c191e0c164756c75b" iface="eth0" netns="" Apr 17 23:47:51.237111 containerd[1834]: 2026-04-17 23:47:51.205 [INFO][6560] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="f9e066fdf0793900514dfc2314700c117fb535312d818f5c191e0c164756c75b" Apr 17 23:47:51.237111 containerd[1834]: 2026-04-17 23:47:51.205 [INFO][6560] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="f9e066fdf0793900514dfc2314700c117fb535312d818f5c191e0c164756c75b" Apr 17 23:47:51.237111 containerd[1834]: 2026-04-17 23:47:51.226 [INFO][6567] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="f9e066fdf0793900514dfc2314700c117fb535312d818f5c191e0c164756c75b" HandleID="k8s-pod-network.f9e066fdf0793900514dfc2314700c117fb535312d818f5c191e0c164756c75b" Workload="ci--4081.3.6--n--7b570e9a3c-k8s-csi--node--driver--kdzpd-eth0" Apr 17 23:47:51.237111 containerd[1834]: 2026-04-17 23:47:51.226 [INFO][6567] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 17 23:47:51.237111 containerd[1834]: 2026-04-17 23:47:51.226 [INFO][6567] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 17 23:47:51.237111 containerd[1834]: 2026-04-17 23:47:51.233 [WARNING][6567] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="f9e066fdf0793900514dfc2314700c117fb535312d818f5c191e0c164756c75b" HandleID="k8s-pod-network.f9e066fdf0793900514dfc2314700c117fb535312d818f5c191e0c164756c75b" Workload="ci--4081.3.6--n--7b570e9a3c-k8s-csi--node--driver--kdzpd-eth0" Apr 17 23:47:51.237111 containerd[1834]: 2026-04-17 23:47:51.233 [INFO][6567] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="f9e066fdf0793900514dfc2314700c117fb535312d818f5c191e0c164756c75b" HandleID="k8s-pod-network.f9e066fdf0793900514dfc2314700c117fb535312d818f5c191e0c164756c75b" Workload="ci--4081.3.6--n--7b570e9a3c-k8s-csi--node--driver--kdzpd-eth0" Apr 17 23:47:51.237111 containerd[1834]: 2026-04-17 23:47:51.234 [INFO][6567] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 17 23:47:51.237111 containerd[1834]: 2026-04-17 23:47:51.235 [INFO][6560] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="f9e066fdf0793900514dfc2314700c117fb535312d818f5c191e0c164756c75b" Apr 17 23:47:51.237845 containerd[1834]: time="2026-04-17T23:47:51.237157597Z" level=info msg="TearDown network for sandbox \"f9e066fdf0793900514dfc2314700c117fb535312d818f5c191e0c164756c75b\" successfully" Apr 17 23:47:51.237845 containerd[1834]: time="2026-04-17T23:47:51.237187898Z" level=info msg="StopPodSandbox for \"f9e066fdf0793900514dfc2314700c117fb535312d818f5c191e0c164756c75b\" returns successfully" Apr 17 23:47:51.238671 containerd[1834]: time="2026-04-17T23:47:51.238072227Z" level=info msg="RemovePodSandbox for \"f9e066fdf0793900514dfc2314700c117fb535312d818f5c191e0c164756c75b\"" Apr 17 23:47:51.238671 containerd[1834]: time="2026-04-17T23:47:51.238113428Z" level=info msg="Forcibly stopping sandbox \"f9e066fdf0793900514dfc2314700c117fb535312d818f5c191e0c164756c75b\"" Apr 17 23:47:51.318704 containerd[1834]: 2026-04-17 23:47:51.279 [WARNING][6582] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="f9e066fdf0793900514dfc2314700c117fb535312d818f5c191e0c164756c75b" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081.3.6--n--7b570e9a3c-k8s-csi--node--driver--kdzpd-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"4658a59c-dd8b-4cd9-bac7-09b6c58f7e83", ResourceVersion:"1192", Generation:0, CreationTimestamp:time.Date(2026, time.April, 17, 23, 46, 3, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6d9d697c7c", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081.3.6-n-7b570e9a3c", ContainerID:"2b068cd3dadaabf022a1676bb53392811e7fa09f28ba118d070f25caf3d6041f", Pod:"csi-node-driver-kdzpd", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.71.8/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calid0a0ee4ce27", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 17 23:47:51.318704 containerd[1834]: 2026-04-17 23:47:51.280 [INFO][6582] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="f9e066fdf0793900514dfc2314700c117fb535312d818f5c191e0c164756c75b" Apr 17 23:47:51.318704 containerd[1834]: 2026-04-17 23:47:51.280 [INFO][6582] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called 
with no netns name, ignoring. ContainerID="f9e066fdf0793900514dfc2314700c117fb535312d818f5c191e0c164756c75b" iface="eth0" netns="" Apr 17 23:47:51.318704 containerd[1834]: 2026-04-17 23:47:51.280 [INFO][6582] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="f9e066fdf0793900514dfc2314700c117fb535312d818f5c191e0c164756c75b" Apr 17 23:47:51.318704 containerd[1834]: 2026-04-17 23:47:51.280 [INFO][6582] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="f9e066fdf0793900514dfc2314700c117fb535312d818f5c191e0c164756c75b" Apr 17 23:47:51.318704 containerd[1834]: 2026-04-17 23:47:51.307 [INFO][6589] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="f9e066fdf0793900514dfc2314700c117fb535312d818f5c191e0c164756c75b" HandleID="k8s-pod-network.f9e066fdf0793900514dfc2314700c117fb535312d818f5c191e0c164756c75b" Workload="ci--4081.3.6--n--7b570e9a3c-k8s-csi--node--driver--kdzpd-eth0" Apr 17 23:47:51.318704 containerd[1834]: 2026-04-17 23:47:51.307 [INFO][6589] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 17 23:47:51.318704 containerd[1834]: 2026-04-17 23:47:51.307 [INFO][6589] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 17 23:47:51.318704 containerd[1834]: 2026-04-17 23:47:51.314 [WARNING][6589] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="f9e066fdf0793900514dfc2314700c117fb535312d818f5c191e0c164756c75b" HandleID="k8s-pod-network.f9e066fdf0793900514dfc2314700c117fb535312d818f5c191e0c164756c75b" Workload="ci--4081.3.6--n--7b570e9a3c-k8s-csi--node--driver--kdzpd-eth0" Apr 17 23:47:51.318704 containerd[1834]: 2026-04-17 23:47:51.314 [INFO][6589] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="f9e066fdf0793900514dfc2314700c117fb535312d818f5c191e0c164756c75b" HandleID="k8s-pod-network.f9e066fdf0793900514dfc2314700c117fb535312d818f5c191e0c164756c75b" Workload="ci--4081.3.6--n--7b570e9a3c-k8s-csi--node--driver--kdzpd-eth0" Apr 17 23:47:51.318704 containerd[1834]: 2026-04-17 23:47:51.316 [INFO][6589] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 17 23:47:51.318704 containerd[1834]: 2026-04-17 23:47:51.317 [INFO][6582] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="f9e066fdf0793900514dfc2314700c117fb535312d818f5c191e0c164756c75b" Apr 17 23:47:51.319389 containerd[1834]: time="2026-04-17T23:47:51.318753439Z" level=info msg="TearDown network for sandbox \"f9e066fdf0793900514dfc2314700c117fb535312d818f5c191e0c164756c75b\" successfully" Apr 17 23:47:51.333613 containerd[1834]: time="2026-04-17T23:47:51.333394713Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"f9e066fdf0793900514dfc2314700c117fb535312d818f5c191e0c164756c75b\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Apr 17 23:47:51.333613 containerd[1834]: time="2026-04-17T23:47:51.333508117Z" level=info msg="RemovePodSandbox \"f9e066fdf0793900514dfc2314700c117fb535312d818f5c191e0c164756c75b\" returns successfully" Apr 17 23:47:51.334144 containerd[1834]: time="2026-04-17T23:47:51.334115036Z" level=info msg="StopPodSandbox for \"83deb6bfb5c72f63a8fcb193d3bee315ea0a6af1ebd6a9fb9ae8c9bed2ea9316\"" Apr 17 23:47:51.404961 containerd[1834]: 2026-04-17 23:47:51.369 [WARNING][6603] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="83deb6bfb5c72f63a8fcb193d3bee315ea0a6af1ebd6a9fb9ae8c9bed2ea9316" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081.3.6--n--7b570e9a3c-k8s-calico--kube--controllers--7479bd7d89--zgl8b-eth0", GenerateName:"calico-kube-controllers-7479bd7d89-", Namespace:"calico-system", SelfLink:"", UID:"c6a1e0ff-d681-42ba-8cbb-c8e3f8849e1f", ResourceVersion:"1177", Generation:0, CreationTimestamp:time.Date(2026, time.April, 17, 23, 46, 3, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"7479bd7d89", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081.3.6-n-7b570e9a3c", ContainerID:"f180c78efe211eeb00e5a8675073f51b52fc2f4688c1ce4bb527a1a0386a6060", Pod:"calico-kube-controllers-7479bd7d89-zgl8b", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.71.7/32"}, IPNATs:[]v3.IPNAT(nil), 
IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calid92edb8d9a1", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 17 23:47:51.404961 containerd[1834]: 2026-04-17 23:47:51.369 [INFO][6603] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="83deb6bfb5c72f63a8fcb193d3bee315ea0a6af1ebd6a9fb9ae8c9bed2ea9316" Apr 17 23:47:51.404961 containerd[1834]: 2026-04-17 23:47:51.369 [INFO][6603] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="83deb6bfb5c72f63a8fcb193d3bee315ea0a6af1ebd6a9fb9ae8c9bed2ea9316" iface="eth0" netns="" Apr 17 23:47:51.404961 containerd[1834]: 2026-04-17 23:47:51.369 [INFO][6603] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="83deb6bfb5c72f63a8fcb193d3bee315ea0a6af1ebd6a9fb9ae8c9bed2ea9316" Apr 17 23:47:51.404961 containerd[1834]: 2026-04-17 23:47:51.369 [INFO][6603] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="83deb6bfb5c72f63a8fcb193d3bee315ea0a6af1ebd6a9fb9ae8c9bed2ea9316" Apr 17 23:47:51.404961 containerd[1834]: 2026-04-17 23:47:51.395 [INFO][6610] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="83deb6bfb5c72f63a8fcb193d3bee315ea0a6af1ebd6a9fb9ae8c9bed2ea9316" HandleID="k8s-pod-network.83deb6bfb5c72f63a8fcb193d3bee315ea0a6af1ebd6a9fb9ae8c9bed2ea9316" Workload="ci--4081.3.6--n--7b570e9a3c-k8s-calico--kube--controllers--7479bd7d89--zgl8b-eth0" Apr 17 23:47:51.404961 containerd[1834]: 2026-04-17 23:47:51.395 [INFO][6610] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 17 23:47:51.404961 containerd[1834]: 2026-04-17 23:47:51.395 [INFO][6610] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 17 23:47:51.404961 containerd[1834]: 2026-04-17 23:47:51.401 [WARNING][6610] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="83deb6bfb5c72f63a8fcb193d3bee315ea0a6af1ebd6a9fb9ae8c9bed2ea9316" HandleID="k8s-pod-network.83deb6bfb5c72f63a8fcb193d3bee315ea0a6af1ebd6a9fb9ae8c9bed2ea9316" Workload="ci--4081.3.6--n--7b570e9a3c-k8s-calico--kube--controllers--7479bd7d89--zgl8b-eth0" Apr 17 23:47:51.404961 containerd[1834]: 2026-04-17 23:47:51.401 [INFO][6610] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="83deb6bfb5c72f63a8fcb193d3bee315ea0a6af1ebd6a9fb9ae8c9bed2ea9316" HandleID="k8s-pod-network.83deb6bfb5c72f63a8fcb193d3bee315ea0a6af1ebd6a9fb9ae8c9bed2ea9316" Workload="ci--4081.3.6--n--7b570e9a3c-k8s-calico--kube--controllers--7479bd7d89--zgl8b-eth0" Apr 17 23:47:51.404961 containerd[1834]: 2026-04-17 23:47:51.402 [INFO][6610] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 17 23:47:51.404961 containerd[1834]: 2026-04-17 23:47:51.403 [INFO][6603] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="83deb6bfb5c72f63a8fcb193d3bee315ea0a6af1ebd6a9fb9ae8c9bed2ea9316" Apr 17 23:47:51.405965 containerd[1834]: time="2026-04-17T23:47:51.405011032Z" level=info msg="TearDown network for sandbox \"83deb6bfb5c72f63a8fcb193d3bee315ea0a6af1ebd6a9fb9ae8c9bed2ea9316\" successfully" Apr 17 23:47:51.405965 containerd[1834]: time="2026-04-17T23:47:51.405040633Z" level=info msg="StopPodSandbox for \"83deb6bfb5c72f63a8fcb193d3bee315ea0a6af1ebd6a9fb9ae8c9bed2ea9316\" returns successfully" Apr 17 23:47:51.405965 containerd[1834]: time="2026-04-17T23:47:51.405658753Z" level=info msg="RemovePodSandbox for \"83deb6bfb5c72f63a8fcb193d3bee315ea0a6af1ebd6a9fb9ae8c9bed2ea9316\"" Apr 17 23:47:51.405965 containerd[1834]: time="2026-04-17T23:47:51.405695154Z" level=info msg="Forcibly stopping sandbox \"83deb6bfb5c72f63a8fcb193d3bee315ea0a6af1ebd6a9fb9ae8c9bed2ea9316\"" Apr 17 23:47:51.515748 systemd[1]: run-containerd-runc-k8s.io-706140ee5d798cb285cc2d642fa46c011a0c35cbf56d53220b32eace02cc357b-runc.UjXITl.mount: Deactivated successfully. 
Apr 17 23:47:51.533879 containerd[1834]: 2026-04-17 23:47:51.451 [WARNING][6624] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="83deb6bfb5c72f63a8fcb193d3bee315ea0a6af1ebd6a9fb9ae8c9bed2ea9316" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081.3.6--n--7b570e9a3c-k8s-calico--kube--controllers--7479bd7d89--zgl8b-eth0", GenerateName:"calico-kube-controllers-7479bd7d89-", Namespace:"calico-system", SelfLink:"", UID:"c6a1e0ff-d681-42ba-8cbb-c8e3f8849e1f", ResourceVersion:"1177", Generation:0, CreationTimestamp:time.Date(2026, time.April, 17, 23, 46, 3, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"7479bd7d89", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081.3.6-n-7b570e9a3c", ContainerID:"f180c78efe211eeb00e5a8675073f51b52fc2f4688c1ce4bb527a1a0386a6060", Pod:"calico-kube-controllers-7479bd7d89-zgl8b", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.71.7/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calid92edb8d9a1", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 17 23:47:51.533879 containerd[1834]: 2026-04-17 23:47:51.452 [INFO][6624] cni-plugin/k8s.go 652: Cleaning up 
netns ContainerID="83deb6bfb5c72f63a8fcb193d3bee315ea0a6af1ebd6a9fb9ae8c9bed2ea9316" Apr 17 23:47:51.533879 containerd[1834]: 2026-04-17 23:47:51.452 [INFO][6624] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="83deb6bfb5c72f63a8fcb193d3bee315ea0a6af1ebd6a9fb9ae8c9bed2ea9316" iface="eth0" netns="" Apr 17 23:47:51.533879 containerd[1834]: 2026-04-17 23:47:51.452 [INFO][6624] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="83deb6bfb5c72f63a8fcb193d3bee315ea0a6af1ebd6a9fb9ae8c9bed2ea9316" Apr 17 23:47:51.533879 containerd[1834]: 2026-04-17 23:47:51.452 [INFO][6624] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="83deb6bfb5c72f63a8fcb193d3bee315ea0a6af1ebd6a9fb9ae8c9bed2ea9316" Apr 17 23:47:51.533879 containerd[1834]: 2026-04-17 23:47:51.516 [INFO][6631] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="83deb6bfb5c72f63a8fcb193d3bee315ea0a6af1ebd6a9fb9ae8c9bed2ea9316" HandleID="k8s-pod-network.83deb6bfb5c72f63a8fcb193d3bee315ea0a6af1ebd6a9fb9ae8c9bed2ea9316" Workload="ci--4081.3.6--n--7b570e9a3c-k8s-calico--kube--controllers--7479bd7d89--zgl8b-eth0" Apr 17 23:47:51.533879 containerd[1834]: 2026-04-17 23:47:51.516 [INFO][6631] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 17 23:47:51.533879 containerd[1834]: 2026-04-17 23:47:51.518 [INFO][6631] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 17 23:47:51.533879 containerd[1834]: 2026-04-17 23:47:51.528 [WARNING][6631] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="83deb6bfb5c72f63a8fcb193d3bee315ea0a6af1ebd6a9fb9ae8c9bed2ea9316" HandleID="k8s-pod-network.83deb6bfb5c72f63a8fcb193d3bee315ea0a6af1ebd6a9fb9ae8c9bed2ea9316" Workload="ci--4081.3.6--n--7b570e9a3c-k8s-calico--kube--controllers--7479bd7d89--zgl8b-eth0" Apr 17 23:47:51.533879 containerd[1834]: 2026-04-17 23:47:51.528 [INFO][6631] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="83deb6bfb5c72f63a8fcb193d3bee315ea0a6af1ebd6a9fb9ae8c9bed2ea9316" HandleID="k8s-pod-network.83deb6bfb5c72f63a8fcb193d3bee315ea0a6af1ebd6a9fb9ae8c9bed2ea9316" Workload="ci--4081.3.6--n--7b570e9a3c-k8s-calico--kube--controllers--7479bd7d89--zgl8b-eth0" Apr 17 23:47:51.533879 containerd[1834]: 2026-04-17 23:47:51.530 [INFO][6631] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 17 23:47:51.533879 containerd[1834]: 2026-04-17 23:47:51.532 [INFO][6624] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="83deb6bfb5c72f63a8fcb193d3bee315ea0a6af1ebd6a9fb9ae8c9bed2ea9316" Apr 17 23:47:51.537093 containerd[1834]: time="2026-04-17T23:47:51.533941876Z" level=info msg="TearDown network for sandbox \"83deb6bfb5c72f63a8fcb193d3bee315ea0a6af1ebd6a9fb9ae8c9bed2ea9316\" successfully" Apr 17 23:47:51.548557 containerd[1834]: time="2026-04-17T23:47:51.548509168Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"83deb6bfb5c72f63a8fcb193d3bee315ea0a6af1ebd6a9fb9ae8c9bed2ea9316\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Apr 17 23:47:51.549704 containerd[1834]: time="2026-04-17T23:47:51.549676400Z" level=info msg="RemovePodSandbox \"83deb6bfb5c72f63a8fcb193d3bee315ea0a6af1ebd6a9fb9ae8c9bed2ea9316\" returns successfully" Apr 17 23:47:51.550350 containerd[1834]: time="2026-04-17T23:47:51.550325017Z" level=info msg="StopPodSandbox for \"1302e55ca0e7dce85f13e4d756292413493ef50ee486bdc75287349d3425425b\"" Apr 17 23:47:51.661414 containerd[1834]: 2026-04-17 23:47:51.616 [WARNING][6672] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="1302e55ca0e7dce85f13e4d756292413493ef50ee486bdc75287349d3425425b" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081.3.6--n--7b570e9a3c-k8s-coredns--674b8bbfcf--dq7vb-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"4c94a196-84f1-4d7e-892d-1fd10da74241", ResourceVersion:"1036", Generation:0, CreationTimestamp:time.Date(2026, time.April, 17, 23, 45, 51, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081.3.6-n-7b570e9a3c", ContainerID:"9fb573d97ef657a73498b30439a7b9ce040564d8f4ab028a90397514be41f62d", Pod:"coredns-674b8bbfcf-dq7vb", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.71.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali8d2ca757d64", MAC:"", 
Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 17 23:47:51.661414 containerd[1834]: 2026-04-17 23:47:51.616 [INFO][6672] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="1302e55ca0e7dce85f13e4d756292413493ef50ee486bdc75287349d3425425b" Apr 17 23:47:51.661414 containerd[1834]: 2026-04-17 23:47:51.616 [INFO][6672] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="1302e55ca0e7dce85f13e4d756292413493ef50ee486bdc75287349d3425425b" iface="eth0" netns="" Apr 17 23:47:51.661414 containerd[1834]: 2026-04-17 23:47:51.616 [INFO][6672] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="1302e55ca0e7dce85f13e4d756292413493ef50ee486bdc75287349d3425425b" Apr 17 23:47:51.661414 containerd[1834]: 2026-04-17 23:47:51.616 [INFO][6672] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="1302e55ca0e7dce85f13e4d756292413493ef50ee486bdc75287349d3425425b" Apr 17 23:47:51.661414 containerd[1834]: 2026-04-17 23:47:51.651 [INFO][6690] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="1302e55ca0e7dce85f13e4d756292413493ef50ee486bdc75287349d3425425b" HandleID="k8s-pod-network.1302e55ca0e7dce85f13e4d756292413493ef50ee486bdc75287349d3425425b" Workload="ci--4081.3.6--n--7b570e9a3c-k8s-coredns--674b8bbfcf--dq7vb-eth0" Apr 17 23:47:51.661414 containerd[1834]: 2026-04-17 23:47:51.651 [INFO][6690] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. 
Apr 17 23:47:51.661414 containerd[1834]: 2026-04-17 23:47:51.651 [INFO][6690] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 17 23:47:51.661414 containerd[1834]: 2026-04-17 23:47:51.657 [WARNING][6690] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. Ignoring ContainerID="1302e55ca0e7dce85f13e4d756292413493ef50ee486bdc75287349d3425425b" HandleID="k8s-pod-network.1302e55ca0e7dce85f13e4d756292413493ef50ee486bdc75287349d3425425b" Workload="ci--4081.3.6--n--7b570e9a3c-k8s-coredns--674b8bbfcf--dq7vb-eth0" Apr 17 23:47:51.661414 containerd[1834]: 2026-04-17 23:47:51.657 [INFO][6690] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="1302e55ca0e7dce85f13e4d756292413493ef50ee486bdc75287349d3425425b" HandleID="k8s-pod-network.1302e55ca0e7dce85f13e4d756292413493ef50ee486bdc75287349d3425425b" Workload="ci--4081.3.6--n--7b570e9a3c-k8s-coredns--674b8bbfcf--dq7vb-eth0" Apr 17 23:47:51.661414 containerd[1834]: 2026-04-17 23:47:51.658 [INFO][6690] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 17 23:47:51.661414 containerd[1834]: 2026-04-17 23:47:51.660 [INFO][6672] cni-plugin/k8s.go 665: Teardown processing complete. 
ContainerID="1302e55ca0e7dce85f13e4d756292413493ef50ee486bdc75287349d3425425b" Apr 17 23:47:51.661414 containerd[1834]: time="2026-04-17T23:47:51.661377305Z" level=info msg="TearDown network for sandbox \"1302e55ca0e7dce85f13e4d756292413493ef50ee486bdc75287349d3425425b\" successfully" Apr 17 23:47:51.661414 containerd[1834]: time="2026-04-17T23:47:51.661414606Z" level=info msg="StopPodSandbox for \"1302e55ca0e7dce85f13e4d756292413493ef50ee486bdc75287349d3425425b\" returns successfully" Apr 17 23:47:51.664271 containerd[1834]: time="2026-04-17T23:47:51.663549264Z" level=info msg="RemovePodSandbox for \"1302e55ca0e7dce85f13e4d756292413493ef50ee486bdc75287349d3425425b\"" Apr 17 23:47:51.664271 containerd[1834]: time="2026-04-17T23:47:51.663596865Z" level=info msg="Forcibly stopping sandbox \"1302e55ca0e7dce85f13e4d756292413493ef50ee486bdc75287349d3425425b\"" Apr 17 23:47:51.732699 containerd[1834]: 2026-04-17 23:47:51.698 [WARNING][6705] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="1302e55ca0e7dce85f13e4d756292413493ef50ee486bdc75287349d3425425b" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081.3.6--n--7b570e9a3c-k8s-coredns--674b8bbfcf--dq7vb-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"4c94a196-84f1-4d7e-892d-1fd10da74241", ResourceVersion:"1036", Generation:0, CreationTimestamp:time.Date(2026, time.April, 17, 23, 45, 51, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081.3.6-n-7b570e9a3c", ContainerID:"9fb573d97ef657a73498b30439a7b9ce040564d8f4ab028a90397514be41f62d", Pod:"coredns-674b8bbfcf-dq7vb", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.71.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali8d2ca757d64", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 17 23:47:51.732699 containerd[1834]: 2026-04-17 
23:47:51.699 [INFO][6705] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="1302e55ca0e7dce85f13e4d756292413493ef50ee486bdc75287349d3425425b" Apr 17 23:47:51.732699 containerd[1834]: 2026-04-17 23:47:51.699 [INFO][6705] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="1302e55ca0e7dce85f13e4d756292413493ef50ee486bdc75287349d3425425b" iface="eth0" netns="" Apr 17 23:47:51.732699 containerd[1834]: 2026-04-17 23:47:51.699 [INFO][6705] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="1302e55ca0e7dce85f13e4d756292413493ef50ee486bdc75287349d3425425b" Apr 17 23:47:51.732699 containerd[1834]: 2026-04-17 23:47:51.699 [INFO][6705] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="1302e55ca0e7dce85f13e4d756292413493ef50ee486bdc75287349d3425425b" Apr 17 23:47:51.732699 containerd[1834]: 2026-04-17 23:47:51.722 [INFO][6712] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="1302e55ca0e7dce85f13e4d756292413493ef50ee486bdc75287349d3425425b" HandleID="k8s-pod-network.1302e55ca0e7dce85f13e4d756292413493ef50ee486bdc75287349d3425425b" Workload="ci--4081.3.6--n--7b570e9a3c-k8s-coredns--674b8bbfcf--dq7vb-eth0" Apr 17 23:47:51.732699 containerd[1834]: 2026-04-17 23:47:51.722 [INFO][6712] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 17 23:47:51.732699 containerd[1834]: 2026-04-17 23:47:51.722 [INFO][6712] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 17 23:47:51.732699 containerd[1834]: 2026-04-17 23:47:51.728 [WARNING][6712] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="1302e55ca0e7dce85f13e4d756292413493ef50ee486bdc75287349d3425425b" HandleID="k8s-pod-network.1302e55ca0e7dce85f13e4d756292413493ef50ee486bdc75287349d3425425b" Workload="ci--4081.3.6--n--7b570e9a3c-k8s-coredns--674b8bbfcf--dq7vb-eth0" Apr 17 23:47:51.732699 containerd[1834]: 2026-04-17 23:47:51.728 [INFO][6712] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="1302e55ca0e7dce85f13e4d756292413493ef50ee486bdc75287349d3425425b" HandleID="k8s-pod-network.1302e55ca0e7dce85f13e4d756292413493ef50ee486bdc75287349d3425425b" Workload="ci--4081.3.6--n--7b570e9a3c-k8s-coredns--674b8bbfcf--dq7vb-eth0" Apr 17 23:47:51.732699 containerd[1834]: 2026-04-17 23:47:51.730 [INFO][6712] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 17 23:47:51.732699 containerd[1834]: 2026-04-17 23:47:51.731 [INFO][6705] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="1302e55ca0e7dce85f13e4d756292413493ef50ee486bdc75287349d3425425b" Apr 17 23:47:51.732699 containerd[1834]: time="2026-04-17T23:47:51.732666224Z" level=info msg="TearDown network for sandbox \"1302e55ca0e7dce85f13e4d756292413493ef50ee486bdc75287349d3425425b\" successfully" Apr 17 23:47:51.740898 containerd[1834]: time="2026-04-17T23:47:51.740821443Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"1302e55ca0e7dce85f13e4d756292413493ef50ee486bdc75287349d3425425b\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Apr 17 23:47:51.741375 containerd[1834]: time="2026-04-17T23:47:51.740929946Z" level=info msg="RemovePodSandbox \"1302e55ca0e7dce85f13e4d756292413493ef50ee486bdc75287349d3425425b\" returns successfully" Apr 17 23:47:51.741564 containerd[1834]: time="2026-04-17T23:47:51.741533562Z" level=info msg="StopPodSandbox for \"e8ac919b2d998feb57a6ee2afdd1dafcb9bc7e91ea70046104d604b330ba71f6\"" Apr 17 23:47:51.818208 containerd[1834]: 2026-04-17 23:47:51.777 [WARNING][6726] cni-plugin/k8s.go 610: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="e8ac919b2d998feb57a6ee2afdd1dafcb9bc7e91ea70046104d604b330ba71f6" WorkloadEndpoint="ci--4081.3.6--n--7b570e9a3c-k8s-whisker--655df5d5f--wr76l-eth0" Apr 17 23:47:51.818208 containerd[1834]: 2026-04-17 23:47:51.777 [INFO][6726] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="e8ac919b2d998feb57a6ee2afdd1dafcb9bc7e91ea70046104d604b330ba71f6" Apr 17 23:47:51.818208 containerd[1834]: 2026-04-17 23:47:51.777 [INFO][6726] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="e8ac919b2d998feb57a6ee2afdd1dafcb9bc7e91ea70046104d604b330ba71f6" iface="eth0" netns="" Apr 17 23:47:51.818208 containerd[1834]: 2026-04-17 23:47:51.777 [INFO][6726] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="e8ac919b2d998feb57a6ee2afdd1dafcb9bc7e91ea70046104d604b330ba71f6" Apr 17 23:47:51.818208 containerd[1834]: 2026-04-17 23:47:51.777 [INFO][6726] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="e8ac919b2d998feb57a6ee2afdd1dafcb9bc7e91ea70046104d604b330ba71f6" Apr 17 23:47:51.818208 containerd[1834]: 2026-04-17 23:47:51.806 [INFO][6733] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="e8ac919b2d998feb57a6ee2afdd1dafcb9bc7e91ea70046104d604b330ba71f6" HandleID="k8s-pod-network.e8ac919b2d998feb57a6ee2afdd1dafcb9bc7e91ea70046104d604b330ba71f6" Workload="ci--4081.3.6--n--7b570e9a3c-k8s-whisker--655df5d5f--wr76l-eth0" Apr 17 23:47:51.818208 containerd[1834]: 2026-04-17 23:47:51.806 [INFO][6733] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 17 23:47:51.818208 containerd[1834]: 2026-04-17 23:47:51.806 [INFO][6733] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 17 23:47:51.818208 containerd[1834]: 2026-04-17 23:47:51.812 [WARNING][6733] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="e8ac919b2d998feb57a6ee2afdd1dafcb9bc7e91ea70046104d604b330ba71f6" HandleID="k8s-pod-network.e8ac919b2d998feb57a6ee2afdd1dafcb9bc7e91ea70046104d604b330ba71f6" Workload="ci--4081.3.6--n--7b570e9a3c-k8s-whisker--655df5d5f--wr76l-eth0" Apr 17 23:47:51.818208 containerd[1834]: 2026-04-17 23:47:51.812 [INFO][6733] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="e8ac919b2d998feb57a6ee2afdd1dafcb9bc7e91ea70046104d604b330ba71f6" HandleID="k8s-pod-network.e8ac919b2d998feb57a6ee2afdd1dafcb9bc7e91ea70046104d604b330ba71f6" Workload="ci--4081.3.6--n--7b570e9a3c-k8s-whisker--655df5d5f--wr76l-eth0" Apr 17 23:47:51.818208 containerd[1834]: 2026-04-17 23:47:51.815 [INFO][6733] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 17 23:47:51.818208 containerd[1834]: 2026-04-17 23:47:51.816 [INFO][6726] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="e8ac919b2d998feb57a6ee2afdd1dafcb9bc7e91ea70046104d604b330ba71f6" Apr 17 23:47:51.818208 containerd[1834]: time="2026-04-17T23:47:51.818046321Z" level=info msg="TearDown network for sandbox \"e8ac919b2d998feb57a6ee2afdd1dafcb9bc7e91ea70046104d604b330ba71f6\" successfully" Apr 17 23:47:51.818208 containerd[1834]: time="2026-04-17T23:47:51.818079822Z" level=info msg="StopPodSandbox for \"e8ac919b2d998feb57a6ee2afdd1dafcb9bc7e91ea70046104d604b330ba71f6\" returns successfully" Apr 17 23:47:51.820682 containerd[1834]: time="2026-04-17T23:47:51.819707066Z" level=info msg="RemovePodSandbox for \"e8ac919b2d998feb57a6ee2afdd1dafcb9bc7e91ea70046104d604b330ba71f6\"" Apr 17 23:47:51.820682 containerd[1834]: time="2026-04-17T23:47:51.819753767Z" level=info msg="Forcibly stopping sandbox \"e8ac919b2d998feb57a6ee2afdd1dafcb9bc7e91ea70046104d604b330ba71f6\"" Apr 17 23:47:51.914822 containerd[1834]: 2026-04-17 23:47:51.870 [WARNING][6748] cni-plugin/k8s.go 610: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up 
ContainerID="e8ac919b2d998feb57a6ee2afdd1dafcb9bc7e91ea70046104d604b330ba71f6" WorkloadEndpoint="ci--4081.3.6--n--7b570e9a3c-k8s-whisker--655df5d5f--wr76l-eth0" Apr 17 23:47:51.914822 containerd[1834]: 2026-04-17 23:47:51.871 [INFO][6748] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="e8ac919b2d998feb57a6ee2afdd1dafcb9bc7e91ea70046104d604b330ba71f6" Apr 17 23:47:51.914822 containerd[1834]: 2026-04-17 23:47:51.871 [INFO][6748] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="e8ac919b2d998feb57a6ee2afdd1dafcb9bc7e91ea70046104d604b330ba71f6" iface="eth0" netns="" Apr 17 23:47:51.914822 containerd[1834]: 2026-04-17 23:47:51.871 [INFO][6748] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="e8ac919b2d998feb57a6ee2afdd1dafcb9bc7e91ea70046104d604b330ba71f6" Apr 17 23:47:51.914822 containerd[1834]: 2026-04-17 23:47:51.871 [INFO][6748] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="e8ac919b2d998feb57a6ee2afdd1dafcb9bc7e91ea70046104d604b330ba71f6" Apr 17 23:47:51.914822 containerd[1834]: 2026-04-17 23:47:51.891 [INFO][6755] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="e8ac919b2d998feb57a6ee2afdd1dafcb9bc7e91ea70046104d604b330ba71f6" HandleID="k8s-pod-network.e8ac919b2d998feb57a6ee2afdd1dafcb9bc7e91ea70046104d604b330ba71f6" Workload="ci--4081.3.6--n--7b570e9a3c-k8s-whisker--655df5d5f--wr76l-eth0" Apr 17 23:47:51.914822 containerd[1834]: 2026-04-17 23:47:51.891 [INFO][6755] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 17 23:47:51.914822 containerd[1834]: 2026-04-17 23:47:51.891 [INFO][6755] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 17 23:47:51.914822 containerd[1834]: 2026-04-17 23:47:51.907 [WARNING][6755] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="e8ac919b2d998feb57a6ee2afdd1dafcb9bc7e91ea70046104d604b330ba71f6" HandleID="k8s-pod-network.e8ac919b2d998feb57a6ee2afdd1dafcb9bc7e91ea70046104d604b330ba71f6" Workload="ci--4081.3.6--n--7b570e9a3c-k8s-whisker--655df5d5f--wr76l-eth0" Apr 17 23:47:51.914822 containerd[1834]: 2026-04-17 23:47:51.908 [INFO][6755] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="e8ac919b2d998feb57a6ee2afdd1dafcb9bc7e91ea70046104d604b330ba71f6" HandleID="k8s-pod-network.e8ac919b2d998feb57a6ee2afdd1dafcb9bc7e91ea70046104d604b330ba71f6" Workload="ci--4081.3.6--n--7b570e9a3c-k8s-whisker--655df5d5f--wr76l-eth0" Apr 17 23:47:51.914822 containerd[1834]: 2026-04-17 23:47:51.911 [INFO][6755] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 17 23:47:51.914822 containerd[1834]: 2026-04-17 23:47:51.912 [INFO][6748] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="e8ac919b2d998feb57a6ee2afdd1dafcb9bc7e91ea70046104d604b330ba71f6" Apr 17 23:47:51.914822 containerd[1834]: time="2026-04-17T23:47:51.913477089Z" level=info msg="TearDown network for sandbox \"e8ac919b2d998feb57a6ee2afdd1dafcb9bc7e91ea70046104d604b330ba71f6\" successfully" Apr 17 23:47:51.922369 containerd[1834]: time="2026-04-17T23:47:51.922331328Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"e8ac919b2d998feb57a6ee2afdd1dafcb9bc7e91ea70046104d604b330ba71f6\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Apr 17 23:47:51.922517 containerd[1834]: time="2026-04-17T23:47:51.922425130Z" level=info msg="RemovePodSandbox \"e8ac919b2d998feb57a6ee2afdd1dafcb9bc7e91ea70046104d604b330ba71f6\" returns successfully" Apr 17 23:47:51.923104 containerd[1834]: time="2026-04-17T23:47:51.923073548Z" level=info msg="StopPodSandbox for \"ed9a494bb1acc5e5490c723caeef10f8b25979c86b59887629e09cba1fa1a6a3\"" Apr 17 23:47:51.993834 containerd[1834]: 2026-04-17 23:47:51.960 [WARNING][6769] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="ed9a494bb1acc5e5490c723caeef10f8b25979c86b59887629e09cba1fa1a6a3" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081.3.6--n--7b570e9a3c-k8s-coredns--674b8bbfcf--n7zwr-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"0607b5b7-e84f-4371-964e-db63a77e29d1", ResourceVersion:"1066", Generation:0, CreationTimestamp:time.Date(2026, time.April, 17, 23, 45, 51, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081.3.6-n-7b570e9a3c", ContainerID:"f7502ed510a935dd175170b6e61a722ab8b65dd9969607662e3ed0d3b9c43650", Pod:"coredns-674b8bbfcf-n7zwr", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.71.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali3e07900ecbf", MAC:"", 
Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 17 23:47:51.993834 containerd[1834]: 2026-04-17 23:47:51.960 [INFO][6769] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="ed9a494bb1acc5e5490c723caeef10f8b25979c86b59887629e09cba1fa1a6a3" Apr 17 23:47:51.993834 containerd[1834]: 2026-04-17 23:47:51.960 [INFO][6769] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="ed9a494bb1acc5e5490c723caeef10f8b25979c86b59887629e09cba1fa1a6a3" iface="eth0" netns="" Apr 17 23:47:51.993834 containerd[1834]: 2026-04-17 23:47:51.960 [INFO][6769] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="ed9a494bb1acc5e5490c723caeef10f8b25979c86b59887629e09cba1fa1a6a3" Apr 17 23:47:51.993834 containerd[1834]: 2026-04-17 23:47:51.960 [INFO][6769] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="ed9a494bb1acc5e5490c723caeef10f8b25979c86b59887629e09cba1fa1a6a3" Apr 17 23:47:51.993834 containerd[1834]: 2026-04-17 23:47:51.982 [INFO][6777] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="ed9a494bb1acc5e5490c723caeef10f8b25979c86b59887629e09cba1fa1a6a3" HandleID="k8s-pod-network.ed9a494bb1acc5e5490c723caeef10f8b25979c86b59887629e09cba1fa1a6a3" Workload="ci--4081.3.6--n--7b570e9a3c-k8s-coredns--674b8bbfcf--n7zwr-eth0" Apr 17 23:47:51.993834 containerd[1834]: 2026-04-17 23:47:51.982 [INFO][6777] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. 
Apr 17 23:47:51.993834 containerd[1834]: 2026-04-17 23:47:51.982 [INFO][6777] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 17 23:47:51.993834 containerd[1834]: 2026-04-17 23:47:51.990 [WARNING][6777] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. Ignoring ContainerID="ed9a494bb1acc5e5490c723caeef10f8b25979c86b59887629e09cba1fa1a6a3" HandleID="k8s-pod-network.ed9a494bb1acc5e5490c723caeef10f8b25979c86b59887629e09cba1fa1a6a3" Workload="ci--4081.3.6--n--7b570e9a3c-k8s-coredns--674b8bbfcf--n7zwr-eth0" Apr 17 23:47:51.993834 containerd[1834]: 2026-04-17 23:47:51.990 [INFO][6777] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="ed9a494bb1acc5e5490c723caeef10f8b25979c86b59887629e09cba1fa1a6a3" HandleID="k8s-pod-network.ed9a494bb1acc5e5490c723caeef10f8b25979c86b59887629e09cba1fa1a6a3" Workload="ci--4081.3.6--n--7b570e9a3c-k8s-coredns--674b8bbfcf--n7zwr-eth0" Apr 17 23:47:51.993834 containerd[1834]: 2026-04-17 23:47:51.991 [INFO][6777] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 17 23:47:51.993834 containerd[1834]: 2026-04-17 23:47:51.992 [INFO][6769] cni-plugin/k8s.go 665: Teardown processing complete. 
ContainerID="ed9a494bb1acc5e5490c723caeef10f8b25979c86b59887629e09cba1fa1a6a3" Apr 17 23:47:51.994907 containerd[1834]: time="2026-04-17T23:47:51.993883453Z" level=info msg="TearDown network for sandbox \"ed9a494bb1acc5e5490c723caeef10f8b25979c86b59887629e09cba1fa1a6a3\" successfully" Apr 17 23:47:51.994907 containerd[1834]: time="2026-04-17T23:47:51.993915154Z" level=info msg="StopPodSandbox for \"ed9a494bb1acc5e5490c723caeef10f8b25979c86b59887629e09cba1fa1a6a3\" returns successfully" Apr 17 23:47:51.994907 containerd[1834]: time="2026-04-17T23:47:51.994488469Z" level=info msg="RemovePodSandbox for \"ed9a494bb1acc5e5490c723caeef10f8b25979c86b59887629e09cba1fa1a6a3\"" Apr 17 23:47:51.994907 containerd[1834]: time="2026-04-17T23:47:51.994547671Z" level=info msg="Forcibly stopping sandbox \"ed9a494bb1acc5e5490c723caeef10f8b25979c86b59887629e09cba1fa1a6a3\"" Apr 17 23:47:52.061737 containerd[1834]: 2026-04-17 23:47:52.029 [WARNING][6792] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="ed9a494bb1acc5e5490c723caeef10f8b25979c86b59887629e09cba1fa1a6a3" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081.3.6--n--7b570e9a3c-k8s-coredns--674b8bbfcf--n7zwr-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"0607b5b7-e84f-4371-964e-db63a77e29d1", ResourceVersion:"1066", Generation:0, CreationTimestamp:time.Date(2026, time.April, 17, 23, 45, 51, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081.3.6-n-7b570e9a3c", ContainerID:"f7502ed510a935dd175170b6e61a722ab8b65dd9969607662e3ed0d3b9c43650", Pod:"coredns-674b8bbfcf-n7zwr", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.71.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali3e07900ecbf", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 17 23:47:52.061737 containerd[1834]: 2026-04-17 
23:47:52.029 [INFO][6792] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="ed9a494bb1acc5e5490c723caeef10f8b25979c86b59887629e09cba1fa1a6a3" Apr 17 23:47:52.061737 containerd[1834]: 2026-04-17 23:47:52.029 [INFO][6792] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="ed9a494bb1acc5e5490c723caeef10f8b25979c86b59887629e09cba1fa1a6a3" iface="eth0" netns="" Apr 17 23:47:52.061737 containerd[1834]: 2026-04-17 23:47:52.029 [INFO][6792] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="ed9a494bb1acc5e5490c723caeef10f8b25979c86b59887629e09cba1fa1a6a3" Apr 17 23:47:52.061737 containerd[1834]: 2026-04-17 23:47:52.029 [INFO][6792] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="ed9a494bb1acc5e5490c723caeef10f8b25979c86b59887629e09cba1fa1a6a3" Apr 17 23:47:52.061737 containerd[1834]: 2026-04-17 23:47:52.050 [INFO][6799] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="ed9a494bb1acc5e5490c723caeef10f8b25979c86b59887629e09cba1fa1a6a3" HandleID="k8s-pod-network.ed9a494bb1acc5e5490c723caeef10f8b25979c86b59887629e09cba1fa1a6a3" Workload="ci--4081.3.6--n--7b570e9a3c-k8s-coredns--674b8bbfcf--n7zwr-eth0" Apr 17 23:47:52.061737 containerd[1834]: 2026-04-17 23:47:52.050 [INFO][6799] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 17 23:47:52.061737 containerd[1834]: 2026-04-17 23:47:52.050 [INFO][6799] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 17 23:47:52.061737 containerd[1834]: 2026-04-17 23:47:52.057 [WARNING][6799] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="ed9a494bb1acc5e5490c723caeef10f8b25979c86b59887629e09cba1fa1a6a3" HandleID="k8s-pod-network.ed9a494bb1acc5e5490c723caeef10f8b25979c86b59887629e09cba1fa1a6a3" Workload="ci--4081.3.6--n--7b570e9a3c-k8s-coredns--674b8bbfcf--n7zwr-eth0" Apr 17 23:47:52.061737 containerd[1834]: 2026-04-17 23:47:52.057 [INFO][6799] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="ed9a494bb1acc5e5490c723caeef10f8b25979c86b59887629e09cba1fa1a6a3" HandleID="k8s-pod-network.ed9a494bb1acc5e5490c723caeef10f8b25979c86b59887629e09cba1fa1a6a3" Workload="ci--4081.3.6--n--7b570e9a3c-k8s-coredns--674b8bbfcf--n7zwr-eth0" Apr 17 23:47:52.061737 containerd[1834]: 2026-04-17 23:47:52.059 [INFO][6799] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 17 23:47:52.061737 containerd[1834]: 2026-04-17 23:47:52.060 [INFO][6792] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="ed9a494bb1acc5e5490c723caeef10f8b25979c86b59887629e09cba1fa1a6a3" Apr 17 23:47:52.061737 containerd[1834]: time="2026-04-17T23:47:52.061610976Z" level=info msg="TearDown network for sandbox \"ed9a494bb1acc5e5490c723caeef10f8b25979c86b59887629e09cba1fa1a6a3\" successfully" Apr 17 23:47:52.072323 containerd[1834]: time="2026-04-17T23:47:52.072257062Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"ed9a494bb1acc5e5490c723caeef10f8b25979c86b59887629e09cba1fa1a6a3\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Apr 17 23:47:52.072466 containerd[1834]: time="2026-04-17T23:47:52.072358565Z" level=info msg="RemovePodSandbox \"ed9a494bb1acc5e5490c723caeef10f8b25979c86b59887629e09cba1fa1a6a3\" returns successfully" Apr 17 23:47:52.072979 containerd[1834]: time="2026-04-17T23:47:52.072932380Z" level=info msg="StopPodSandbox for \"b4ee55b141d3f52465cbf786e12266afe8e7204bc4458d564ffdf3772af058ab\"" Apr 17 23:47:52.137813 containerd[1834]: 2026-04-17 23:47:52.105 [WARNING][6814] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="b4ee55b141d3f52465cbf786e12266afe8e7204bc4458d564ffdf3772af058ab" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081.3.6--n--7b570e9a3c-k8s-goldmane--5b85766d88--d6rv9-eth0", GenerateName:"goldmane-5b85766d88-", Namespace:"calico-system", SelfLink:"", UID:"4c2d4922-a2ca-483d-b7db-77418f884573", ResourceVersion:"1256", Generation:0, CreationTimestamp:time.Date(2026, time.April, 17, 23, 46, 2, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"5b85766d88", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081.3.6-n-7b570e9a3c", ContainerID:"361aac0d20bd0c295b71a8fbe3a13d98119670eb3c6182c0ca6e8ea61511338d", Pod:"goldmane-5b85766d88-d6rv9", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.71.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, 
InterfaceName:"cali5288e1165cd", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 17 23:47:52.137813 containerd[1834]: 2026-04-17 23:47:52.105 [INFO][6814] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="b4ee55b141d3f52465cbf786e12266afe8e7204bc4458d564ffdf3772af058ab" Apr 17 23:47:52.137813 containerd[1834]: 2026-04-17 23:47:52.105 [INFO][6814] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="b4ee55b141d3f52465cbf786e12266afe8e7204bc4458d564ffdf3772af058ab" iface="eth0" netns="" Apr 17 23:47:52.137813 containerd[1834]: 2026-04-17 23:47:52.105 [INFO][6814] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="b4ee55b141d3f52465cbf786e12266afe8e7204bc4458d564ffdf3772af058ab" Apr 17 23:47:52.137813 containerd[1834]: 2026-04-17 23:47:52.105 [INFO][6814] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="b4ee55b141d3f52465cbf786e12266afe8e7204bc4458d564ffdf3772af058ab" Apr 17 23:47:52.137813 containerd[1834]: 2026-04-17 23:47:52.127 [INFO][6821] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="b4ee55b141d3f52465cbf786e12266afe8e7204bc4458d564ffdf3772af058ab" HandleID="k8s-pod-network.b4ee55b141d3f52465cbf786e12266afe8e7204bc4458d564ffdf3772af058ab" Workload="ci--4081.3.6--n--7b570e9a3c-k8s-goldmane--5b85766d88--d6rv9-eth0" Apr 17 23:47:52.137813 containerd[1834]: 2026-04-17 23:47:52.127 [INFO][6821] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 17 23:47:52.137813 containerd[1834]: 2026-04-17 23:47:52.127 [INFO][6821] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 17 23:47:52.137813 containerd[1834]: 2026-04-17 23:47:52.133 [WARNING][6821] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="b4ee55b141d3f52465cbf786e12266afe8e7204bc4458d564ffdf3772af058ab" HandleID="k8s-pod-network.b4ee55b141d3f52465cbf786e12266afe8e7204bc4458d564ffdf3772af058ab" Workload="ci--4081.3.6--n--7b570e9a3c-k8s-goldmane--5b85766d88--d6rv9-eth0" Apr 17 23:47:52.137813 containerd[1834]: 2026-04-17 23:47:52.133 [INFO][6821] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="b4ee55b141d3f52465cbf786e12266afe8e7204bc4458d564ffdf3772af058ab" HandleID="k8s-pod-network.b4ee55b141d3f52465cbf786e12266afe8e7204bc4458d564ffdf3772af058ab" Workload="ci--4081.3.6--n--7b570e9a3c-k8s-goldmane--5b85766d88--d6rv9-eth0" Apr 17 23:47:52.137813 containerd[1834]: 2026-04-17 23:47:52.135 [INFO][6821] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 17 23:47:52.137813 containerd[1834]: 2026-04-17 23:47:52.136 [INFO][6814] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="b4ee55b141d3f52465cbf786e12266afe8e7204bc4458d564ffdf3772af058ab" Apr 17 23:47:52.138497 containerd[1834]: time="2026-04-17T23:47:52.137861628Z" level=info msg="TearDown network for sandbox \"b4ee55b141d3f52465cbf786e12266afe8e7204bc4458d564ffdf3772af058ab\" successfully" Apr 17 23:47:52.138497 containerd[1834]: time="2026-04-17T23:47:52.137892628Z" level=info msg="StopPodSandbox for \"b4ee55b141d3f52465cbf786e12266afe8e7204bc4458d564ffdf3772af058ab\" returns successfully" Apr 17 23:47:52.138593 containerd[1834]: time="2026-04-17T23:47:52.138541546Z" level=info msg="RemovePodSandbox for \"b4ee55b141d3f52465cbf786e12266afe8e7204bc4458d564ffdf3772af058ab\"" Apr 17 23:47:52.138593 containerd[1834]: time="2026-04-17T23:47:52.138577047Z" level=info msg="Forcibly stopping sandbox \"b4ee55b141d3f52465cbf786e12266afe8e7204bc4458d564ffdf3772af058ab\"" Apr 17 23:47:52.210777 containerd[1834]: 2026-04-17 23:47:52.174 [WARNING][6836] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="b4ee55b141d3f52465cbf786e12266afe8e7204bc4458d564ffdf3772af058ab" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081.3.6--n--7b570e9a3c-k8s-goldmane--5b85766d88--d6rv9-eth0", GenerateName:"goldmane-5b85766d88-", Namespace:"calico-system", SelfLink:"", UID:"4c2d4922-a2ca-483d-b7db-77418f884573", ResourceVersion:"1256", Generation:0, CreationTimestamp:time.Date(2026, time.April, 17, 23, 46, 2, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"5b85766d88", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081.3.6-n-7b570e9a3c", ContainerID:"361aac0d20bd0c295b71a8fbe3a13d98119670eb3c6182c0ca6e8ea61511338d", Pod:"goldmane-5b85766d88-d6rv9", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.71.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali5288e1165cd", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 17 23:47:52.210777 containerd[1834]: 2026-04-17 23:47:52.174 [INFO][6836] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="b4ee55b141d3f52465cbf786e12266afe8e7204bc4458d564ffdf3772af058ab" Apr 17 23:47:52.210777 containerd[1834]: 2026-04-17 23:47:52.174 [INFO][6836] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="b4ee55b141d3f52465cbf786e12266afe8e7204bc4458d564ffdf3772af058ab" iface="eth0" netns="" Apr 17 23:47:52.210777 containerd[1834]: 2026-04-17 23:47:52.174 [INFO][6836] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="b4ee55b141d3f52465cbf786e12266afe8e7204bc4458d564ffdf3772af058ab" Apr 17 23:47:52.210777 containerd[1834]: 2026-04-17 23:47:52.174 [INFO][6836] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="b4ee55b141d3f52465cbf786e12266afe8e7204bc4458d564ffdf3772af058ab" Apr 17 23:47:52.210777 containerd[1834]: 2026-04-17 23:47:52.196 [INFO][6843] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="b4ee55b141d3f52465cbf786e12266afe8e7204bc4458d564ffdf3772af058ab" HandleID="k8s-pod-network.b4ee55b141d3f52465cbf786e12266afe8e7204bc4458d564ffdf3772af058ab" Workload="ci--4081.3.6--n--7b570e9a3c-k8s-goldmane--5b85766d88--d6rv9-eth0" Apr 17 23:47:52.210777 containerd[1834]: 2026-04-17 23:47:52.196 [INFO][6843] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 17 23:47:52.210777 containerd[1834]: 2026-04-17 23:47:52.196 [INFO][6843] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 17 23:47:52.210777 containerd[1834]: 2026-04-17 23:47:52.202 [WARNING][6843] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="b4ee55b141d3f52465cbf786e12266afe8e7204bc4458d564ffdf3772af058ab" HandleID="k8s-pod-network.b4ee55b141d3f52465cbf786e12266afe8e7204bc4458d564ffdf3772af058ab" Workload="ci--4081.3.6--n--7b570e9a3c-k8s-goldmane--5b85766d88--d6rv9-eth0" Apr 17 23:47:52.210777 containerd[1834]: 2026-04-17 23:47:52.202 [INFO][6843] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="b4ee55b141d3f52465cbf786e12266afe8e7204bc4458d564ffdf3772af058ab" HandleID="k8s-pod-network.b4ee55b141d3f52465cbf786e12266afe8e7204bc4458d564ffdf3772af058ab" Workload="ci--4081.3.6--n--7b570e9a3c-k8s-goldmane--5b85766d88--d6rv9-eth0" Apr 17 23:47:52.210777 containerd[1834]: 2026-04-17 23:47:52.207 [INFO][6843] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 17 23:47:52.210777 containerd[1834]: 2026-04-17 23:47:52.209 [INFO][6836] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="b4ee55b141d3f52465cbf786e12266afe8e7204bc4458d564ffdf3772af058ab" Apr 17 23:47:52.210777 containerd[1834]: time="2026-04-17T23:47:52.210712488Z" level=info msg="TearDown network for sandbox \"b4ee55b141d3f52465cbf786e12266afe8e7204bc4458d564ffdf3772af058ab\" successfully" Apr 17 23:47:52.220591 containerd[1834]: time="2026-04-17T23:47:52.220530852Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"b4ee55b141d3f52465cbf786e12266afe8e7204bc4458d564ffdf3772af058ab\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Apr 17 23:47:52.220738 containerd[1834]: time="2026-04-17T23:47:52.220622155Z" level=info msg="RemovePodSandbox \"b4ee55b141d3f52465cbf786e12266afe8e7204bc4458d564ffdf3772af058ab\" returns successfully" Apr 17 23:47:52.221168 containerd[1834]: time="2026-04-17T23:47:52.221129668Z" level=info msg="StopPodSandbox for \"900076fec7103fe8baff240fe6ba81cd837129e30ccf09da414a570ede8274ae\"" Apr 17 23:47:52.287611 containerd[1834]: 2026-04-17 23:47:52.255 [WARNING][6857] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="900076fec7103fe8baff240fe6ba81cd837129e30ccf09da414a570ede8274ae" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081.3.6--n--7b570e9a3c-k8s-calico--apiserver--568984f78--z9wh9-eth0", GenerateName:"calico-apiserver-568984f78-", Namespace:"calico-system", SelfLink:"", UID:"8636a294-eef0-499d-a578-9b4ce7de9cb5", ResourceVersion:"1091", Generation:0, CreationTimestamp:time.Date(2026, time.April, 17, 23, 46, 2, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"568984f78", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081.3.6-n-7b570e9a3c", ContainerID:"20aacf146b6ce3011594b26d87e34caa39855eb963db2434494df845601c587e", Pod:"calico-apiserver-568984f78-z9wh9", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.71.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", 
Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali1bf13c3a279", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 17 23:47:52.287611 containerd[1834]: 2026-04-17 23:47:52.255 [INFO][6857] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="900076fec7103fe8baff240fe6ba81cd837129e30ccf09da414a570ede8274ae" Apr 17 23:47:52.287611 containerd[1834]: 2026-04-17 23:47:52.255 [INFO][6857] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="900076fec7103fe8baff240fe6ba81cd837129e30ccf09da414a570ede8274ae" iface="eth0" netns="" Apr 17 23:47:52.287611 containerd[1834]: 2026-04-17 23:47:52.255 [INFO][6857] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="900076fec7103fe8baff240fe6ba81cd837129e30ccf09da414a570ede8274ae" Apr 17 23:47:52.287611 containerd[1834]: 2026-04-17 23:47:52.255 [INFO][6857] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="900076fec7103fe8baff240fe6ba81cd837129e30ccf09da414a570ede8274ae" Apr 17 23:47:52.287611 containerd[1834]: 2026-04-17 23:47:52.277 [INFO][6865] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="900076fec7103fe8baff240fe6ba81cd837129e30ccf09da414a570ede8274ae" HandleID="k8s-pod-network.900076fec7103fe8baff240fe6ba81cd837129e30ccf09da414a570ede8274ae" Workload="ci--4081.3.6--n--7b570e9a3c-k8s-calico--apiserver--568984f78--z9wh9-eth0" Apr 17 23:47:52.287611 containerd[1834]: 2026-04-17 23:47:52.277 [INFO][6865] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 17 23:47:52.287611 containerd[1834]: 2026-04-17 23:47:52.277 [INFO][6865] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 17 23:47:52.287611 containerd[1834]: 2026-04-17 23:47:52.283 [WARNING][6865] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="900076fec7103fe8baff240fe6ba81cd837129e30ccf09da414a570ede8274ae" HandleID="k8s-pod-network.900076fec7103fe8baff240fe6ba81cd837129e30ccf09da414a570ede8274ae" Workload="ci--4081.3.6--n--7b570e9a3c-k8s-calico--apiserver--568984f78--z9wh9-eth0" Apr 17 23:47:52.287611 containerd[1834]: 2026-04-17 23:47:52.283 [INFO][6865] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="900076fec7103fe8baff240fe6ba81cd837129e30ccf09da414a570ede8274ae" HandleID="k8s-pod-network.900076fec7103fe8baff240fe6ba81cd837129e30ccf09da414a570ede8274ae" Workload="ci--4081.3.6--n--7b570e9a3c-k8s-calico--apiserver--568984f78--z9wh9-eth0" Apr 17 23:47:52.287611 containerd[1834]: 2026-04-17 23:47:52.285 [INFO][6865] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 17 23:47:52.287611 containerd[1834]: 2026-04-17 23:47:52.286 [INFO][6857] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="900076fec7103fe8baff240fe6ba81cd837129e30ccf09da414a570ede8274ae" Apr 17 23:47:52.288336 containerd[1834]: time="2026-04-17T23:47:52.287657859Z" level=info msg="TearDown network for sandbox \"900076fec7103fe8baff240fe6ba81cd837129e30ccf09da414a570ede8274ae\" successfully" Apr 17 23:47:52.288336 containerd[1834]: time="2026-04-17T23:47:52.287687259Z" level=info msg="StopPodSandbox for \"900076fec7103fe8baff240fe6ba81cd837129e30ccf09da414a570ede8274ae\" returns successfully" Apr 17 23:47:52.288336 containerd[1834]: time="2026-04-17T23:47:52.288252775Z" level=info msg="RemovePodSandbox for \"900076fec7103fe8baff240fe6ba81cd837129e30ccf09da414a570ede8274ae\"" Apr 17 23:47:52.288336 containerd[1834]: time="2026-04-17T23:47:52.288304476Z" level=info msg="Forcibly stopping sandbox \"900076fec7103fe8baff240fe6ba81cd837129e30ccf09da414a570ede8274ae\"" Apr 17 23:47:52.355593 containerd[1834]: 2026-04-17 23:47:52.322 [WARNING][6880] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="900076fec7103fe8baff240fe6ba81cd837129e30ccf09da414a570ede8274ae" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081.3.6--n--7b570e9a3c-k8s-calico--apiserver--568984f78--z9wh9-eth0", GenerateName:"calico-apiserver-568984f78-", Namespace:"calico-system", SelfLink:"", UID:"8636a294-eef0-499d-a578-9b4ce7de9cb5", ResourceVersion:"1091", Generation:0, CreationTimestamp:time.Date(2026, time.April, 17, 23, 46, 2, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"568984f78", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081.3.6-n-7b570e9a3c", ContainerID:"20aacf146b6ce3011594b26d87e34caa39855eb963db2434494df845601c587e", Pod:"calico-apiserver-568984f78-z9wh9", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.71.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali1bf13c3a279", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 17 23:47:52.355593 containerd[1834]: 2026-04-17 23:47:52.322 [INFO][6880] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="900076fec7103fe8baff240fe6ba81cd837129e30ccf09da414a570ede8274ae" Apr 17 23:47:52.355593 containerd[1834]: 2026-04-17 23:47:52.322 [INFO][6880] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with 
no netns name, ignoring. ContainerID="900076fec7103fe8baff240fe6ba81cd837129e30ccf09da414a570ede8274ae" iface="eth0" netns="" Apr 17 23:47:52.355593 containerd[1834]: 2026-04-17 23:47:52.322 [INFO][6880] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="900076fec7103fe8baff240fe6ba81cd837129e30ccf09da414a570ede8274ae" Apr 17 23:47:52.355593 containerd[1834]: 2026-04-17 23:47:52.322 [INFO][6880] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="900076fec7103fe8baff240fe6ba81cd837129e30ccf09da414a570ede8274ae" Apr 17 23:47:52.355593 containerd[1834]: 2026-04-17 23:47:52.343 [INFO][6887] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="900076fec7103fe8baff240fe6ba81cd837129e30ccf09da414a570ede8274ae" HandleID="k8s-pod-network.900076fec7103fe8baff240fe6ba81cd837129e30ccf09da414a570ede8274ae" Workload="ci--4081.3.6--n--7b570e9a3c-k8s-calico--apiserver--568984f78--z9wh9-eth0" Apr 17 23:47:52.355593 containerd[1834]: 2026-04-17 23:47:52.343 [INFO][6887] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 17 23:47:52.355593 containerd[1834]: 2026-04-17 23:47:52.343 [INFO][6887] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 17 23:47:52.355593 containerd[1834]: 2026-04-17 23:47:52.351 [WARNING][6887] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="900076fec7103fe8baff240fe6ba81cd837129e30ccf09da414a570ede8274ae" HandleID="k8s-pod-network.900076fec7103fe8baff240fe6ba81cd837129e30ccf09da414a570ede8274ae" Workload="ci--4081.3.6--n--7b570e9a3c-k8s-calico--apiserver--568984f78--z9wh9-eth0" Apr 17 23:47:52.355593 containerd[1834]: 2026-04-17 23:47:52.351 [INFO][6887] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="900076fec7103fe8baff240fe6ba81cd837129e30ccf09da414a570ede8274ae" HandleID="k8s-pod-network.900076fec7103fe8baff240fe6ba81cd837129e30ccf09da414a570ede8274ae" Workload="ci--4081.3.6--n--7b570e9a3c-k8s-calico--apiserver--568984f78--z9wh9-eth0" Apr 17 23:47:52.355593 containerd[1834]: 2026-04-17 23:47:52.353 [INFO][6887] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 17 23:47:52.355593 containerd[1834]: 2026-04-17 23:47:52.354 [INFO][6880] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="900076fec7103fe8baff240fe6ba81cd837129e30ccf09da414a570ede8274ae" Apr 17 23:47:52.356259 containerd[1834]: time="2026-04-17T23:47:52.355646688Z" level=info msg="TearDown network for sandbox \"900076fec7103fe8baff240fe6ba81cd837129e30ccf09da414a570ede8274ae\" successfully" Apr 17 23:47:52.363334 containerd[1834]: time="2026-04-17T23:47:52.363278593Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"900076fec7103fe8baff240fe6ba81cd837129e30ccf09da414a570ede8274ae\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Apr 17 23:47:52.363513 containerd[1834]: time="2026-04-17T23:47:52.363375996Z" level=info msg="RemovePodSandbox \"900076fec7103fe8baff240fe6ba81cd837129e30ccf09da414a570ede8274ae\" returns successfully" Apr 17 23:47:56.018879 systemd[1]: Started sshd@11-10.0.0.10:22-20.229.252.112:55308.service - OpenSSH per-connection server daemon (20.229.252.112:55308). 
Apr 17 23:47:56.132710 sshd[6914]: Accepted publickey for core from 20.229.252.112 port 55308 ssh2: RSA SHA256:sMtqA11TjzQIJRJw4PihEx7btYqNmsZRNCIArhmId48 Apr 17 23:47:56.134183 sshd[6914]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 17 23:47:56.139086 systemd-logind[1800]: New session 14 of user core. Apr 17 23:47:56.146810 systemd[1]: Started session-14.scope - Session 14 of User core. Apr 17 23:47:56.306411 sshd[6914]: pam_unix(sshd:session): session closed for user core Apr 17 23:47:56.309709 systemd[1]: sshd@11-10.0.0.10:22-20.229.252.112:55308.service: Deactivated successfully. Apr 17 23:47:56.315593 systemd-logind[1800]: Session 14 logged out. Waiting for processes to exit. Apr 17 23:47:56.316229 systemd[1]: session-14.scope: Deactivated successfully. Apr 17 23:47:56.317733 systemd-logind[1800]: Removed session 14. Apr 17 23:48:01.331091 systemd[1]: Started sshd@12-10.0.0.10:22-20.229.252.112:55318.service - OpenSSH per-connection server daemon (20.229.252.112:55318). Apr 17 23:48:01.446392 sshd[6928]: Accepted publickey for core from 20.229.252.112 port 55318 ssh2: RSA SHA256:sMtqA11TjzQIJRJw4PihEx7btYqNmsZRNCIArhmId48 Apr 17 23:48:01.447921 sshd[6928]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 17 23:48:01.452377 systemd-logind[1800]: New session 15 of user core. Apr 17 23:48:01.459144 systemd[1]: Started session-15.scope - Session 15 of User core. Apr 17 23:48:01.625694 sshd[6928]: pam_unix(sshd:session): session closed for user core Apr 17 23:48:01.628786 systemd[1]: sshd@12-10.0.0.10:22-20.229.252.112:55318.service: Deactivated successfully. Apr 17 23:48:01.634705 systemd-logind[1800]: Session 15 logged out. Waiting for processes to exit. Apr 17 23:48:01.635399 systemd[1]: session-15.scope: Deactivated successfully. Apr 17 23:48:01.636591 systemd-logind[1800]: Removed session 15. 
Apr 17 23:48:06.648793 systemd[1]: Started sshd@13-10.0.0.10:22-20.229.252.112:48570.service - OpenSSH per-connection server daemon (20.229.252.112:48570). Apr 17 23:48:06.762032 sshd[6963]: Accepted publickey for core from 20.229.252.112 port 48570 ssh2: RSA SHA256:sMtqA11TjzQIJRJw4PihEx7btYqNmsZRNCIArhmId48 Apr 17 23:48:06.763647 sshd[6963]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 17 23:48:06.768330 systemd-logind[1800]: New session 16 of user core. Apr 17 23:48:06.777870 systemd[1]: Started session-16.scope - Session 16 of User core. Apr 17 23:48:06.938090 sshd[6963]: pam_unix(sshd:session): session closed for user core Apr 17 23:48:06.943011 systemd[1]: sshd@13-10.0.0.10:22-20.229.252.112:48570.service: Deactivated successfully. Apr 17 23:48:06.948013 systemd[1]: session-16.scope: Deactivated successfully. Apr 17 23:48:06.948969 systemd-logind[1800]: Session 16 logged out. Waiting for processes to exit. Apr 17 23:48:06.950068 systemd-logind[1800]: Removed session 16. Apr 17 23:48:11.961819 systemd[1]: Started sshd@14-10.0.0.10:22-20.229.252.112:48586.service - OpenSSH per-connection server daemon (20.229.252.112:48586). Apr 17 23:48:12.085032 sshd[6978]: Accepted publickey for core from 20.229.252.112 port 48586 ssh2: RSA SHA256:sMtqA11TjzQIJRJw4PihEx7btYqNmsZRNCIArhmId48 Apr 17 23:48:12.085748 sshd[6978]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 17 23:48:12.090786 systemd-logind[1800]: New session 17 of user core. Apr 17 23:48:12.097125 systemd[1]: Started session-17.scope - Session 17 of User core. Apr 17 23:48:12.259911 sshd[6978]: pam_unix(sshd:session): session closed for user core Apr 17 23:48:12.263672 systemd[1]: sshd@14-10.0.0.10:22-20.229.252.112:48586.service: Deactivated successfully. Apr 17 23:48:12.269915 systemd-logind[1800]: Session 17 logged out. Waiting for processes to exit. Apr 17 23:48:12.270379 systemd[1]: session-17.scope: Deactivated successfully. 
Apr 17 23:48:12.273953 systemd-logind[1800]: Removed session 17. Apr 17 23:48:17.288413 systemd[1]: Started sshd@15-10.0.0.10:22-20.229.252.112:45358.service - OpenSSH per-connection server daemon (20.229.252.112:45358). Apr 17 23:48:17.401382 sshd[7021]: Accepted publickey for core from 20.229.252.112 port 45358 ssh2: RSA SHA256:sMtqA11TjzQIJRJw4PihEx7btYqNmsZRNCIArhmId48 Apr 17 23:48:17.402945 sshd[7021]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 17 23:48:17.407390 systemd-logind[1800]: New session 18 of user core. Apr 17 23:48:17.414154 systemd[1]: Started session-18.scope - Session 18 of User core. Apr 17 23:48:17.571177 sshd[7021]: pam_unix(sshd:session): session closed for user core Apr 17 23:48:17.576308 systemd[1]: sshd@15-10.0.0.10:22-20.229.252.112:45358.service: Deactivated successfully. Apr 17 23:48:17.580757 systemd[1]: session-18.scope: Deactivated successfully. Apr 17 23:48:17.581910 systemd-logind[1800]: Session 18 logged out. Waiting for processes to exit. Apr 17 23:48:17.583039 systemd-logind[1800]: Removed session 18. Apr 17 23:48:17.593982 systemd[1]: Started sshd@16-10.0.0.10:22-20.229.252.112:45368.service - OpenSSH per-connection server daemon (20.229.252.112:45368). Apr 17 23:48:17.707910 sshd[7036]: Accepted publickey for core from 20.229.252.112 port 45368 ssh2: RSA SHA256:sMtqA11TjzQIJRJw4PihEx7btYqNmsZRNCIArhmId48 Apr 17 23:48:17.709721 sshd[7036]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 17 23:48:17.714476 systemd-logind[1800]: New session 19 of user core. Apr 17 23:48:17.716777 systemd[1]: Started session-19.scope - Session 19 of User core. Apr 17 23:48:17.906724 sshd[7036]: pam_unix(sshd:session): session closed for user core Apr 17 23:48:17.915851 systemd[1]: sshd@16-10.0.0.10:22-20.229.252.112:45368.service: Deactivated successfully. Apr 17 23:48:17.928327 systemd[1]: session-19.scope: Deactivated successfully. 
Apr 17 23:48:17.929806 systemd-logind[1800]: Session 19 logged out. Waiting for processes to exit. Apr 17 23:48:17.935883 systemd[1]: Started sshd@17-10.0.0.10:22-20.229.252.112:45372.service - OpenSSH per-connection server daemon (20.229.252.112:45372). Apr 17 23:48:17.937695 systemd-logind[1800]: Removed session 19. Apr 17 23:48:18.057365 sshd[7048]: Accepted publickey for core from 20.229.252.112 port 45372 ssh2: RSA SHA256:sMtqA11TjzQIJRJw4PihEx7btYqNmsZRNCIArhmId48 Apr 17 23:48:18.059005 sshd[7048]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 17 23:48:18.064272 systemd-logind[1800]: New session 20 of user core. Apr 17 23:48:18.066773 systemd[1]: Started session-20.scope - Session 20 of User core. Apr 17 23:48:18.227419 sshd[7048]: pam_unix(sshd:session): session closed for user core Apr 17 23:48:18.232620 systemd[1]: sshd@17-10.0.0.10:22-20.229.252.112:45372.service: Deactivated successfully. Apr 17 23:48:18.236437 systemd[1]: session-20.scope: Deactivated successfully. Apr 17 23:48:18.237985 systemd-logind[1800]: Session 20 logged out. Waiting for processes to exit. Apr 17 23:48:18.239067 systemd-logind[1800]: Removed session 20. Apr 17 23:48:23.256230 systemd[1]: Started sshd@18-10.0.0.10:22-20.229.252.112:45376.service - OpenSSH per-connection server daemon (20.229.252.112:45376). Apr 17 23:48:23.373184 sshd[7064]: Accepted publickey for core from 20.229.252.112 port 45376 ssh2: RSA SHA256:sMtqA11TjzQIJRJw4PihEx7btYqNmsZRNCIArhmId48 Apr 17 23:48:23.374824 sshd[7064]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 17 23:48:23.379772 systemd-logind[1800]: New session 21 of user core. Apr 17 23:48:23.386774 systemd[1]: Started session-21.scope - Session 21 of User core. Apr 17 23:48:23.549867 sshd[7064]: pam_unix(sshd:session): session closed for user core Apr 17 23:48:23.555707 systemd[1]: sshd@18-10.0.0.10:22-20.229.252.112:45376.service: Deactivated successfully. 
Apr 17 23:48:23.559781 systemd[1]: session-21.scope: Deactivated successfully. Apr 17 23:48:23.560814 systemd-logind[1800]: Session 21 logged out. Waiting for processes to exit. Apr 17 23:48:23.561895 systemd-logind[1800]: Removed session 21. Apr 17 23:48:28.574196 systemd[1]: Started sshd@19-10.0.0.10:22-20.229.252.112:34478.service - OpenSSH per-connection server daemon (20.229.252.112:34478). Apr 17 23:48:28.689447 sshd[7121]: Accepted publickey for core from 20.229.252.112 port 34478 ssh2: RSA SHA256:sMtqA11TjzQIJRJw4PihEx7btYqNmsZRNCIArhmId48 Apr 17 23:48:28.690523 sshd[7121]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 17 23:48:28.694544 systemd-logind[1800]: New session 22 of user core. Apr 17 23:48:28.697755 systemd[1]: Started session-22.scope - Session 22 of User core. Apr 17 23:48:28.868269 sshd[7121]: pam_unix(sshd:session): session closed for user core Apr 17 23:48:28.871760 systemd[1]: sshd@19-10.0.0.10:22-20.229.252.112:34478.service: Deactivated successfully. Apr 17 23:48:28.877582 systemd[1]: session-22.scope: Deactivated successfully. Apr 17 23:48:28.878807 systemd-logind[1800]: Session 22 logged out. Waiting for processes to exit. Apr 17 23:48:28.879812 systemd-logind[1800]: Removed session 22. Apr 17 23:48:33.894811 systemd[1]: Started sshd@20-10.0.0.10:22-20.229.252.112:34494.service - OpenSSH per-connection server daemon (20.229.252.112:34494). Apr 17 23:48:34.007724 sshd[7176]: Accepted publickey for core from 20.229.252.112 port 34494 ssh2: RSA SHA256:sMtqA11TjzQIJRJw4PihEx7btYqNmsZRNCIArhmId48 Apr 17 23:48:34.009269 sshd[7176]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 17 23:48:34.013966 systemd-logind[1800]: New session 23 of user core. Apr 17 23:48:34.017249 systemd[1]: Started session-23.scope - Session 23 of User core. 
Apr 17 23:48:34.178507 sshd[7176]: pam_unix(sshd:session): session closed for user core Apr 17 23:48:34.183941 systemd[1]: sshd@20-10.0.0.10:22-20.229.252.112:34494.service: Deactivated successfully. Apr 17 23:48:34.187843 systemd[1]: session-23.scope: Deactivated successfully. Apr 17 23:48:34.188839 systemd-logind[1800]: Session 23 logged out. Waiting for processes to exit. Apr 17 23:48:34.190038 systemd-logind[1800]: Removed session 23. Apr 17 23:48:39.202378 systemd[1]: Started sshd@21-10.0.0.10:22-20.229.252.112:45136.service - OpenSSH per-connection server daemon (20.229.252.112:45136). Apr 17 23:48:39.318749 sshd[7202]: Accepted publickey for core from 20.229.252.112 port 45136 ssh2: RSA SHA256:sMtqA11TjzQIJRJw4PihEx7btYqNmsZRNCIArhmId48 Apr 17 23:48:39.320278 sshd[7202]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 17 23:48:39.324542 systemd-logind[1800]: New session 24 of user core. Apr 17 23:48:39.333062 systemd[1]: Started session-24.scope - Session 24 of User core. Apr 17 23:48:39.500180 sshd[7202]: pam_unix(sshd:session): session closed for user core Apr 17 23:48:39.507173 systemd[1]: sshd@21-10.0.0.10:22-20.229.252.112:45136.service: Deactivated successfully. Apr 17 23:48:39.511235 systemd[1]: session-24.scope: Deactivated successfully. Apr 17 23:48:39.512602 systemd-logind[1800]: Session 24 logged out. Waiting for processes to exit. Apr 17 23:48:39.513612 systemd-logind[1800]: Removed session 24. Apr 17 23:48:39.523115 systemd[1]: Started sshd@22-10.0.0.10:22-20.229.252.112:45144.service - OpenSSH per-connection server daemon (20.229.252.112:45144). Apr 17 23:48:39.639269 sshd[7216]: Accepted publickey for core from 20.229.252.112 port 45144 ssh2: RSA SHA256:sMtqA11TjzQIJRJw4PihEx7btYqNmsZRNCIArhmId48 Apr 17 23:48:39.639930 sshd[7216]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 17 23:48:39.644647 systemd-logind[1800]: New session 25 of user core. 
Apr 17 23:48:39.648911 systemd[1]: Started session-25.scope - Session 25 of User core. Apr 17 23:48:39.871621 sshd[7216]: pam_unix(sshd:session): session closed for user core Apr 17 23:48:39.875562 systemd[1]: sshd@22-10.0.0.10:22-20.229.252.112:45144.service: Deactivated successfully. Apr 17 23:48:39.881138 systemd-logind[1800]: Session 25 logged out. Waiting for processes to exit. Apr 17 23:48:39.881804 systemd[1]: session-25.scope: Deactivated successfully. Apr 17 23:48:39.883500 systemd-logind[1800]: Removed session 25. Apr 17 23:48:39.893808 systemd[1]: Started sshd@23-10.0.0.10:22-20.229.252.112:45154.service - OpenSSH per-connection server daemon (20.229.252.112:45154). Apr 17 23:48:40.010546 sshd[7228]: Accepted publickey for core from 20.229.252.112 port 45154 ssh2: RSA SHA256:sMtqA11TjzQIJRJw4PihEx7btYqNmsZRNCIArhmId48 Apr 17 23:48:40.011221 sshd[7228]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 17 23:48:40.015540 systemd-logind[1800]: New session 26 of user core. Apr 17 23:48:40.020169 systemd[1]: Started session-26.scope - Session 26 of User core. Apr 17 23:48:40.755419 sshd[7228]: pam_unix(sshd:session): session closed for user core Apr 17 23:48:40.765062 systemd[1]: sshd@23-10.0.0.10:22-20.229.252.112:45154.service: Deactivated successfully. Apr 17 23:48:40.775739 systemd-logind[1800]: Session 26 logged out. Waiting for processes to exit. Apr 17 23:48:40.781472 systemd[1]: session-26.scope: Deactivated successfully. Apr 17 23:48:40.790878 systemd[1]: Started sshd@24-10.0.0.10:22-20.229.252.112:45162.service - OpenSSH per-connection server daemon (20.229.252.112:45162). Apr 17 23:48:40.792062 systemd-logind[1800]: Removed session 26. 
Apr 17 23:48:40.907858 sshd[7255]: Accepted publickey for core from 20.229.252.112 port 45162 ssh2: RSA SHA256:sMtqA11TjzQIJRJw4PihEx7btYqNmsZRNCIArhmId48 Apr 17 23:48:40.909397 sshd[7255]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 17 23:48:40.913540 systemd-logind[1800]: New session 27 of user core. Apr 17 23:48:40.923766 systemd[1]: Started session-27.scope - Session 27 of User core. Apr 17 23:48:41.201875 sshd[7255]: pam_unix(sshd:session): session closed for user core Apr 17 23:48:41.209292 systemd[1]: sshd@24-10.0.0.10:22-20.229.252.112:45162.service: Deactivated successfully. Apr 17 23:48:41.214510 systemd[1]: session-27.scope: Deactivated successfully. Apr 17 23:48:41.216086 systemd-logind[1800]: Session 27 logged out. Waiting for processes to exit. Apr 17 23:48:41.227813 systemd[1]: Started sshd@25-10.0.0.10:22-20.229.252.112:45172.service - OpenSSH per-connection server daemon (20.229.252.112:45172). Apr 17 23:48:41.229833 systemd-logind[1800]: Removed session 27. Apr 17 23:48:41.346540 sshd[7267]: Accepted publickey for core from 20.229.252.112 port 45172 ssh2: RSA SHA256:sMtqA11TjzQIJRJw4PihEx7btYqNmsZRNCIArhmId48 Apr 17 23:48:41.348102 sshd[7267]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 17 23:48:41.352538 systemd-logind[1800]: New session 28 of user core. Apr 17 23:48:41.358975 systemd[1]: Started session-28.scope - Session 28 of User core. Apr 17 23:48:41.521083 sshd[7267]: pam_unix(sshd:session): session closed for user core Apr 17 23:48:41.524373 systemd[1]: sshd@25-10.0.0.10:22-20.229.252.112:45172.service: Deactivated successfully. Apr 17 23:48:41.530013 systemd-logind[1800]: Session 28 logged out. Waiting for processes to exit. Apr 17 23:48:41.530607 systemd[1]: session-28.scope: Deactivated successfully. Apr 17 23:48:41.532238 systemd-logind[1800]: Removed session 28. 
Apr 17 23:48:46.544161 systemd[1]: Started sshd@26-10.0.0.10:22-20.229.252.112:40358.service - OpenSSH per-connection server daemon (20.229.252.112:40358). Apr 17 23:48:46.658519 sshd[7303]: Accepted publickey for core from 20.229.252.112 port 40358 ssh2: RSA SHA256:sMtqA11TjzQIJRJw4PihEx7btYqNmsZRNCIArhmId48 Apr 17 23:48:46.660070 sshd[7303]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 17 23:48:46.664842 systemd-logind[1800]: New session 29 of user core. Apr 17 23:48:46.668736 systemd[1]: Started session-29.scope - Session 29 of User core. Apr 17 23:48:46.828349 sshd[7303]: pam_unix(sshd:session): session closed for user core Apr 17 23:48:46.831880 systemd[1]: sshd@26-10.0.0.10:22-20.229.252.112:40358.service: Deactivated successfully. Apr 17 23:48:46.836952 systemd[1]: session-29.scope: Deactivated successfully. Apr 17 23:48:46.838996 systemd-logind[1800]: Session 29 logged out. Waiting for processes to exit. Apr 17 23:48:46.839969 systemd-logind[1800]: Removed session 29. Apr 17 23:48:48.260141 update_engine[1805]: I20260417 23:48:48.260082 1805 prefs.cc:52] certificate-report-to-send-update not present in /var/lib/update_engine/prefs Apr 17 23:48:48.260141 update_engine[1805]: I20260417 23:48:48.260135 1805 prefs.cc:52] certificate-report-to-send-download not present in /var/lib/update_engine/prefs Apr 17 23:48:48.260759 update_engine[1805]: I20260417 23:48:48.260358 1805 prefs.cc:52] aleph-version not present in /var/lib/update_engine/prefs Apr 17 23:48:48.260939 update_engine[1805]: I20260417 23:48:48.260902 1805 omaha_request_params.cc:62] Current group set to lts Apr 17 23:48:48.261199 update_engine[1805]: I20260417 23:48:48.261049 1805 update_attempter.cc:499] Already updated boot flags. Skipping. Apr 17 23:48:48.261199 update_engine[1805]: I20260417 23:48:48.261070 1805 update_attempter.cc:643] Scheduling an action processor start. 
Apr 17 23:48:48.261199 update_engine[1805]: I20260417 23:48:48.261089 1805 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction Apr 17 23:48:48.261199 update_engine[1805]: I20260417 23:48:48.261126 1805 prefs.cc:52] previous-version not present in /var/lib/update_engine/prefs Apr 17 23:48:48.261392 update_engine[1805]: I20260417 23:48:48.261212 1805 omaha_request_action.cc:271] Posting an Omaha request to disabled Apr 17 23:48:48.261392 update_engine[1805]: I20260417 23:48:48.261225 1805 omaha_request_action.cc:272] Request: Apr 17 23:48:48.261392 update_engine[1805]: I20260417 23:48:48.261234 1805 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Apr 17 23:48:48.262060 locksmithd[1872]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_CHECKING_FOR_UPDATE" NewVersion=0.0.0 NewSize=0 Apr 17 23:48:48.263043 update_engine[1805]: I20260417 23:48:48.263010 1805 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Apr 17 23:48:48.263368 update_engine[1805]: I20260417 23:48:48.263334 1805 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Apr 17 23:48:48.284932 update_engine[1805]: E20260417 23:48:48.284881 1805 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Apr 17 23:48:48.285043 update_engine[1805]: I20260417 23:48:48.284981 1805 libcurl_http_fetcher.cc:283] No HTTP response, retry 1 Apr 17 23:48:51.853108 systemd[1]: Started sshd@27-10.0.0.10:22-20.229.252.112:40362.service - OpenSSH per-connection server daemon (20.229.252.112:40362).
Apr 17 23:48:51.968035 sshd[7360]: Accepted publickey for core from 20.229.252.112 port 40362 ssh2: RSA SHA256:sMtqA11TjzQIJRJw4PihEx7btYqNmsZRNCIArhmId48 Apr 17 23:48:51.969687 sshd[7360]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 17 23:48:51.977829 systemd-logind[1800]: New session 30 of user core. Apr 17 23:48:51.985285 systemd[1]: Started session-30.scope - Session 30 of User core. Apr 17 23:48:52.144970 sshd[7360]: pam_unix(sshd:session): session closed for user core Apr 17 23:48:52.148597 systemd[1]: sshd@27-10.0.0.10:22-20.229.252.112:40362.service: Deactivated successfully. Apr 17 23:48:52.153916 systemd[1]: session-30.scope: Deactivated successfully. Apr 17 23:48:52.155973 systemd-logind[1800]: Session 30 logged out. Waiting for processes to exit. Apr 17 23:48:52.157119 systemd-logind[1800]: Removed session 30. Apr 17 23:48:54.185220 systemd[1]: run-containerd-runc-k8s.io-08aac514bf7f80b4095867869b4608810d89a76cb0fe897f0421c363f5d91ba3-runc.EJuzeQ.mount: Deactivated successfully. Apr 17 23:48:57.168807 systemd[1]: Started sshd@28-10.0.0.10:22-20.229.252.112:56990.service - OpenSSH per-connection server daemon (20.229.252.112:56990). Apr 17 23:48:57.284938 sshd[7396]: Accepted publickey for core from 20.229.252.112 port 56990 ssh2: RSA SHA256:sMtqA11TjzQIJRJw4PihEx7btYqNmsZRNCIArhmId48 Apr 17 23:48:57.286428 sshd[7396]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 17 23:48:57.291871 systemd-logind[1800]: New session 31 of user core. Apr 17 23:48:57.297841 systemd[1]: Started session-31.scope - Session 31 of User core. Apr 17 23:48:57.460743 sshd[7396]: pam_unix(sshd:session): session closed for user core Apr 17 23:48:57.464824 systemd[1]: sshd@28-10.0.0.10:22-20.229.252.112:56990.service: Deactivated successfully. Apr 17 23:48:57.469603 systemd-logind[1800]: Session 31 logged out. Waiting for processes to exit. 
Apr 17 23:48:57.470390 systemd[1]: session-31.scope: Deactivated successfully. Apr 17 23:48:57.472586 systemd-logind[1800]: Removed session 31. Apr 17 23:48:58.260669 update_engine[1805]: I20260417 23:48:58.260584 1805 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Apr 17 23:48:58.261222 update_engine[1805]: I20260417 23:48:58.260890 1805 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Apr 17 23:48:58.261222 update_engine[1805]: I20260417 23:48:58.261202 1805 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Apr 17 23:48:58.282710 update_engine[1805]: E20260417 23:48:58.282635 1805 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Apr 17 23:48:58.282857 update_engine[1805]: I20260417 23:48:58.282737 1805 libcurl_http_fetcher.cc:283] No HTTP response, retry 2 Apr 17 23:49:02.482780 systemd[1]: Started sshd@29-10.0.0.10:22-20.229.252.112:57004.service - OpenSSH per-connection server daemon (20.229.252.112:57004). Apr 17 23:49:02.599862 sshd[7411]: Accepted publickey for core from 20.229.252.112 port 57004 ssh2: RSA SHA256:sMtqA11TjzQIJRJw4PihEx7btYqNmsZRNCIArhmId48 Apr 17 23:49:02.600471 sshd[7411]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 17 23:49:02.606512 systemd-logind[1800]: New session 32 of user core. Apr 17 23:49:02.612189 systemd[1]: Started session-32.scope - Session 32 of User core. Apr 17 23:49:02.768382 sshd[7411]: pam_unix(sshd:session): session closed for user core Apr 17 23:49:02.772654 systemd[1]: sshd@29-10.0.0.10:22-20.229.252.112:57004.service: Deactivated successfully. Apr 17 23:49:02.777283 systemd[1]: session-32.scope: Deactivated successfully. Apr 17 23:49:02.778377 systemd-logind[1800]: Session 32 logged out. Waiting for processes to exit. Apr 17 23:49:02.779704 systemd-logind[1800]: Removed session 32. 
Apr 17 23:49:07.791777 systemd[1]: Started sshd@30-10.0.0.10:22-20.229.252.112:51990.service - OpenSSH per-connection server daemon (20.229.252.112:51990). Apr 17 23:49:07.906772 sshd[7447]: Accepted publickey for core from 20.229.252.112 port 51990 ssh2: RSA SHA256:sMtqA11TjzQIJRJw4PihEx7btYqNmsZRNCIArhmId48 Apr 17 23:49:07.907347 sshd[7447]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 17 23:49:07.911699 systemd-logind[1800]: New session 33 of user core. Apr 17 23:49:07.916894 systemd[1]: Started session-33.scope - Session 33 of User core. Apr 17 23:49:08.073408 sshd[7447]: pam_unix(sshd:session): session closed for user core Apr 17 23:49:08.078851 systemd[1]: sshd@30-10.0.0.10:22-20.229.252.112:51990.service: Deactivated successfully. Apr 17 23:49:08.083463 systemd[1]: session-33.scope: Deactivated successfully. Apr 17 23:49:08.084359 systemd-logind[1800]: Session 33 logged out. Waiting for processes to exit. Apr 17 23:49:08.085880 systemd-logind[1800]: Removed session 33. Apr 17 23:49:08.260606 update_engine[1805]: I20260417 23:49:08.260526 1805 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Apr 17 23:49:08.261130 update_engine[1805]: I20260417 23:49:08.260824 1805 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Apr 17 23:49:08.261130 update_engine[1805]: I20260417 23:49:08.261078 1805 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Apr 17 23:49:08.277111 update_engine[1805]: E20260417 23:49:08.277035 1805 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Apr 17 23:49:08.277267 update_engine[1805]: I20260417 23:49:08.277153 1805 libcurl_http_fetcher.cc:283] No HTTP response, retry 3 Apr 17 23:49:13.097138 systemd[1]: Started sshd@31-10.0.0.10:22-20.229.252.112:51998.service - OpenSSH per-connection server daemon (20.229.252.112:51998). 
Apr 17 23:49:13.212598 sshd[7480]: Accepted publickey for core from 20.229.252.112 port 51998 ssh2: RSA SHA256:sMtqA11TjzQIJRJw4PihEx7btYqNmsZRNCIArhmId48 Apr 17 23:49:13.214110 sshd[7480]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 17 23:49:13.218577 systemd-logind[1800]: New session 34 of user core. Apr 17 23:49:13.224760 systemd[1]: Started session-34.scope - Session 34 of User core. Apr 17 23:49:13.386265 sshd[7480]: pam_unix(sshd:session): session closed for user core Apr 17 23:49:13.391197 systemd[1]: sshd@31-10.0.0.10:22-20.229.252.112:51998.service: Deactivated successfully. Apr 17 23:49:13.396519 systemd[1]: session-34.scope: Deactivated successfully. Apr 17 23:49:13.397513 systemd-logind[1800]: Session 34 logged out. Waiting for processes to exit. Apr 17 23:49:13.398575 systemd-logind[1800]: Removed session 34.