Apr 24 23:36:08.985340 kernel: Linux version 6.6.127-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 13.3.1_p20240614 p17) 13.3.1 20240614, GNU ld (Gentoo 2.42 p3) 2.42.0) #1 SMP PREEMPT_DYNAMIC Fri Apr 24 22:11:38 -00 2026
Apr 24 23:36:08.985358 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=hetzner verity.usrhash=c8442747465ed99a522e07b8746f6a7817fb39c2025d7438698e3b90e9c0defb
Apr 24 23:36:08.985367 kernel: BIOS-provided physical RAM map:
Apr 24 23:36:08.985372 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009ffff] usable
Apr 24 23:36:08.985376 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000007ed3efff] usable
Apr 24 23:36:08.985381 kernel: BIOS-e820: [mem 0x000000007ed3f000-0x000000007edfffff] reserved
Apr 24 23:36:08.985386 kernel: BIOS-e820: [mem 0x000000007ee00000-0x000000007f8ecfff] usable
Apr 24 23:36:08.985390 kernel: BIOS-e820: [mem 0x000000007f8ed000-0x000000007f9ecfff] reserved
Apr 24 23:36:08.985395 kernel: BIOS-e820: [mem 0x000000007f9ed000-0x000000007faecfff] type 20
Apr 24 23:36:08.985399 kernel: BIOS-e820: [mem 0x000000007faed000-0x000000007fb6cfff] reserved
Apr 24 23:36:08.985403 kernel: BIOS-e820: [mem 0x000000007fb6d000-0x000000007fb7efff] ACPI data
Apr 24 23:36:08.985423 kernel: BIOS-e820: [mem 0x000000007fb7f000-0x000000007fbfefff] ACPI NVS
Apr 24 23:36:08.985427 kernel: BIOS-e820: [mem 0x000000007fbff000-0x000000007ff7bfff] usable
Apr 24 23:36:08.985432 kernel: BIOS-e820: [mem 0x000000007ff7c000-0x000000007fffffff] reserved
Apr 24 23:36:08.985437 kernel: BIOS-e820: [mem 0x00000000e0000000-0x00000000efffffff] reserved
Apr 24 23:36:08.985442 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Apr 24 23:36:08.985449 kernel: BIOS-e820: [mem 0x00000000ffc00000-0x00000000ffffffff] reserved
Apr 24 23:36:08.985453 kernel: BIOS-e820: [mem 0x0000000100000000-0x0000000179ffffff] usable
Apr 24 23:36:08.985458 kernel: BIOS-e820: [mem 0x000000fd00000000-0x000000ffffffffff] reserved
Apr 24 23:36:08.985463 kernel: NX (Execute Disable) protection: active
Apr 24 23:36:08.985467 kernel: APIC: Static calls initialized
Apr 24 23:36:08.985472 kernel: efi: EFI v2.7 by Ubuntu distribution of EDK II
Apr 24 23:36:08.985476 kernel: efi: SMBIOS=0x7f988000 SMBIOS 3.0=0x7f986000 ACPI=0x7fb7e000 ACPI 2.0=0x7fb7e014 MEMATTR=0x7e01b198
Apr 24 23:36:08.985481 kernel: efi: Remove mem135: MMIO range=[0xffc00000-0xffffffff] (4MB) from e820 map
Apr 24 23:36:08.985486 kernel: e820: remove [mem 0xffc00000-0xffffffff] reserved
Apr 24 23:36:08.985491 kernel: SMBIOS 3.0.0 present.
Apr 24 23:36:08.985495 kernel: DMI: Hetzner vServer/Standard PC (Q35 + ICH9, 2009), BIOS 20171111 11/11/2017
Apr 24 23:36:08.985500 kernel: Hypervisor detected: KVM
Apr 24 23:36:08.985507 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Apr 24 23:36:08.985512 kernel: kvm-clock: using sched offset of 12783901547 cycles
Apr 24 23:36:08.985524 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Apr 24 23:36:08.985540 kernel: tsc: Detected 2399.998 MHz processor
Apr 24 23:36:08.985555 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Apr 24 23:36:08.985570 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Apr 24 23:36:08.985585 kernel: last_pfn = 0x17a000 max_arch_pfn = 0x10000000000
Apr 24 23:36:08.985600 kernel: MTRR map: 4 entries (2 fixed + 2 variable; max 18), built from 8 variable MTRRs
Apr 24 23:36:08.985615 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Apr 24 23:36:08.985638 kernel: last_pfn = 0x7ff7c max_arch_pfn = 0x10000000000
Apr 24 23:36:08.985653 kernel: Using GB pages for direct mapping
Apr 24 23:36:08.985658 kernel: Secure boot disabled
Apr 24 23:36:08.985666 kernel: ACPI: Early table checksum verification disabled
Apr 24 23:36:08.985671 kernel: ACPI: RSDP 0x000000007FB7E014 000024 (v02 BOCHS )
Apr 24 23:36:08.985676 kernel: ACPI: XSDT 0x000000007FB7D0E8 000054 (v01 BOCHS BXPC 00000001 01000013)
Apr 24 23:36:08.985681 kernel: ACPI: FACP 0x000000007FB79000 0000F4 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Apr 24 23:36:08.985689 kernel: ACPI: DSDT 0x000000007FB7A000 002443 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Apr 24 23:36:08.985694 kernel: ACPI: FACS 0x000000007FBDD000 000040
Apr 24 23:36:08.985698 kernel: ACPI: APIC 0x000000007FB78000 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Apr 24 23:36:08.985703 kernel: ACPI: HPET 0x000000007FB77000 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Apr 24 23:36:08.985708 kernel: ACPI: MCFG 0x000000007FB76000 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Apr 24 23:36:08.985713 kernel: ACPI: WAET 0x000000007FB75000 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Apr 24 23:36:08.985719 kernel: ACPI: BGRT 0x000000007FB74000 000038 (v01 INTEL EDK2 00000002 01000013)
Apr 24 23:36:08.985726 kernel: ACPI: Reserving FACP table memory at [mem 0x7fb79000-0x7fb790f3]
Apr 24 23:36:08.985731 kernel: ACPI: Reserving DSDT table memory at [mem 0x7fb7a000-0x7fb7c442]
Apr 24 23:36:08.985736 kernel: ACPI: Reserving FACS table memory at [mem 0x7fbdd000-0x7fbdd03f]
Apr 24 23:36:08.985741 kernel: ACPI: Reserving APIC table memory at [mem 0x7fb78000-0x7fb7807f]
Apr 24 23:36:08.985746 kernel: ACPI: Reserving HPET table memory at [mem 0x7fb77000-0x7fb77037]
Apr 24 23:36:08.985751 kernel: ACPI: Reserving MCFG table memory at [mem 0x7fb76000-0x7fb7603b]
Apr 24 23:36:08.985756 kernel: ACPI: Reserving WAET table memory at [mem 0x7fb75000-0x7fb75027]
Apr 24 23:36:08.985761 kernel: ACPI: Reserving BGRT table memory at [mem 0x7fb74000-0x7fb74037]
Apr 24 23:36:08.985766 kernel: No NUMA configuration found
Apr 24 23:36:08.985773 kernel: Faking a node at [mem 0x0000000000000000-0x0000000179ffffff]
Apr 24 23:36:08.985779 kernel: NODE_DATA(0) allocated [mem 0x179ff8000-0x179ffdfff]
Apr 24 23:36:08.985784 kernel: Zone ranges:
Apr 24 23:36:08.985789 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Apr 24 23:36:08.985793 kernel: DMA32 [mem 0x0000000001000000-0x00000000ffffffff]
Apr 24 23:36:08.985798 kernel: Normal [mem 0x0000000100000000-0x0000000179ffffff]
Apr 24 23:36:08.985803 kernel: Movable zone start for each node
Apr 24 23:36:08.985814 kernel: Early memory node ranges
Apr 24 23:36:08.985825 kernel: node 0: [mem 0x0000000000001000-0x000000000009ffff]
Apr 24 23:36:08.985835 kernel: node 0: [mem 0x0000000000100000-0x000000007ed3efff]
Apr 24 23:36:08.985843 kernel: node 0: [mem 0x000000007ee00000-0x000000007f8ecfff]
Apr 24 23:36:08.985847 kernel: node 0: [mem 0x000000007fbff000-0x000000007ff7bfff]
Apr 24 23:36:08.985852 kernel: node 0: [mem 0x0000000100000000-0x0000000179ffffff]
Apr 24 23:36:08.985857 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x0000000179ffffff]
Apr 24 23:36:08.985862 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Apr 24 23:36:08.985867 kernel: On node 0, zone DMA: 96 pages in unavailable ranges
Apr 24 23:36:08.985872 kernel: On node 0, zone DMA32: 193 pages in unavailable ranges
Apr 24 23:36:08.985877 kernel: On node 0, zone DMA32: 786 pages in unavailable ranges
Apr 24 23:36:08.985882 kernel: On node 0, zone Normal: 132 pages in unavailable ranges
Apr 24 23:36:08.985890 kernel: On node 0, zone Normal: 24576 pages in unavailable ranges
Apr 24 23:36:08.985895 kernel: ACPI: PM-Timer IO Port: 0x608
Apr 24 23:36:08.985900 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Apr 24 23:36:08.985905 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Apr 24 23:36:08.985910 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Apr 24 23:36:08.985914 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Apr 24 23:36:08.985919 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Apr 24 23:36:08.985924 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Apr 24 23:36:08.985929 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Apr 24 23:36:08.985936 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Apr 24 23:36:08.985941 kernel: ACPI: HPET id: 0x8086a201 base: 0xfed00000
Apr 24 23:36:08.985946 kernel: smpboot: Allowing 2 CPUs, 0 hotplug CPUs
Apr 24 23:36:08.985951 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Apr 24 23:36:08.985956 kernel: [mem 0x80000000-0xdfffffff] available for PCI devices
Apr 24 23:36:08.985961 kernel: Booting paravirtualized kernel on KVM
Apr 24 23:36:08.985967 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Apr 24 23:36:08.985972 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:2 nr_cpu_ids:2 nr_node_ids:1
Apr 24 23:36:08.985977 kernel: percpu: Embedded 57 pages/cpu s196328 r8192 d28952 u1048576
Apr 24 23:36:08.985985 kernel: pcpu-alloc: s196328 r8192 d28952 u1048576 alloc=1*2097152
Apr 24 23:36:08.985990 kernel: pcpu-alloc: [0] 0 1
Apr 24 23:36:08.985995 kernel: kvm-guest: PV spinlocks disabled, no host support
Apr 24 23:36:08.986000 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=hetzner verity.usrhash=c8442747465ed99a522e07b8746f6a7817fb39c2025d7438698e3b90e9c0defb
Apr 24 23:36:08.986006 kernel: random: crng init done
Apr 24 23:36:08.986011 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Apr 24 23:36:08.986016 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Apr 24 23:36:08.986020 kernel: Fallback order for Node 0: 0
Apr 24 23:36:08.986025 kernel: Built 1 zonelists, mobility grouping on. Total pages: 1004632
Apr 24 23:36:08.986033 kernel: Policy zone: Normal
Apr 24 23:36:08.986038 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Apr 24 23:36:08.986043 kernel: software IO TLB: area num 2.
Apr 24 23:36:08.986048 kernel: Memory: 3827764K/4091168K available (12288K kernel code, 2288K rwdata, 22752K rodata, 42896K init, 2300K bss, 263200K reserved, 0K cma-reserved)
Apr 24 23:36:08.986053 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1
Apr 24 23:36:08.986058 kernel: ftrace: allocating 37996 entries in 149 pages
Apr 24 23:36:08.986063 kernel: ftrace: allocated 149 pages with 4 groups
Apr 24 23:36:08.986068 kernel: Dynamic Preempt: voluntary
Apr 24 23:36:08.986073 kernel: rcu: Preemptible hierarchical RCU implementation.
Apr 24 23:36:08.986081 kernel: rcu: RCU event tracing is enabled.
Apr 24 23:36:08.986086 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2.
Apr 24 23:36:08.986092 kernel: Trampoline variant of Tasks RCU enabled.
Apr 24 23:36:08.986104 kernel: Rude variant of Tasks RCU enabled.
Apr 24 23:36:08.986111 kernel: Tracing variant of Tasks RCU enabled.
Apr 24 23:36:08.986116 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Apr 24 23:36:08.986122 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2
Apr 24 23:36:08.986127 kernel: NR_IRQS: 33024, nr_irqs: 440, preallocated irqs: 16
Apr 24 23:36:08.986132 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Apr 24 23:36:08.986137 kernel: Console: colour dummy device 80x25
Apr 24 23:36:08.986142 kernel: printk: console [tty0] enabled
Apr 24 23:36:08.986148 kernel: printk: console [ttyS0] enabled
Apr 24 23:36:08.986155 kernel: ACPI: Core revision 20230628
Apr 24 23:36:08.986160 kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 19112604467 ns
Apr 24 23:36:08.986166 kernel: APIC: Switch to symmetric I/O mode setup
Apr 24 23:36:08.986171 kernel: x2apic enabled
Apr 24 23:36:08.986176 kernel: APIC: Switched APIC routing to: physical x2apic
Apr 24 23:36:08.986183 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1
Apr 24 23:36:08.986188 kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Apr 24 23:36:08.986194 kernel: Calibrating delay loop (skipped) preset value.. 4799.99 BogoMIPS (lpj=2399998)
Apr 24 23:36:08.986199 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Apr 24 23:36:08.986204 kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Apr 24 23:36:08.986209 kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Apr 24 23:36:08.986214 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Apr 24 23:36:08.986220 kernel: Spectre V2 : Mitigation: Enhanced / Automatic IBRS
Apr 24 23:36:08.986225 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Apr 24 23:36:08.986233 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Apr 24 23:36:08.986238 kernel: active return thunk: srso_alias_return_thunk
Apr 24 23:36:08.986243 kernel: Speculative Return Stack Overflow: Mitigation: Safe RET
Apr 24 23:36:08.986248 kernel: Transient Scheduler Attacks: Forcing mitigation on in a VM
Apr 24 23:36:08.986253 kernel: Transient Scheduler Attacks: Vulnerable: Clear CPU buffers attempted, no microcode
Apr 24 23:36:08.986259 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Apr 24 23:36:08.986264 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Apr 24 23:36:08.986269 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Apr 24 23:36:08.986277 kernel: x86/fpu: Supporting XSAVE feature 0x020: 'AVX-512 opmask'
Apr 24 23:36:08.986282 kernel: x86/fpu: Supporting XSAVE feature 0x040: 'AVX-512 Hi256'
Apr 24 23:36:08.986287 kernel: x86/fpu: Supporting XSAVE feature 0x080: 'AVX-512 ZMM_Hi256'
Apr 24 23:36:08.986292 kernel: x86/fpu: Supporting XSAVE feature 0x200: 'Protection Keys User registers'
Apr 24 23:36:08.986297 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
Apr 24 23:36:08.986302 kernel: x86/fpu: xstate_offset[5]: 832, xstate_sizes[5]: 64
Apr 24 23:36:08.986307 kernel: x86/fpu: xstate_offset[6]: 896, xstate_sizes[6]: 512
Apr 24 23:36:08.986313 kernel: x86/fpu: xstate_offset[7]: 1408, xstate_sizes[7]: 1024
Apr 24 23:36:08.986318 kernel: x86/fpu: xstate_offset[9]: 2432, xstate_sizes[9]: 8
Apr 24 23:36:08.986326 kernel: x86/fpu: Enabled xstate features 0x2e7, context size is 2440 bytes, using 'compacted' format.
Apr 24 23:36:08.986331 kernel: Freeing SMP alternatives memory: 32K
Apr 24 23:36:08.986336 kernel: pid_max: default: 32768 minimum: 301
Apr 24 23:36:08.986341 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,integrity
Apr 24 23:36:08.986347 kernel: landlock: Up and running.
Apr 24 23:36:08.986352 kernel: SELinux: Initializing.
Apr 24 23:36:08.986357 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Apr 24 23:36:08.986362 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Apr 24 23:36:08.986367 kernel: smpboot: CPU0: AMD EPYC-Genoa Processor (family: 0x19, model: 0x11, stepping: 0x0)
Apr 24 23:36:08.986375 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Apr 24 23:36:08.986380 kernel: RCU Tasks Rude: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Apr 24 23:36:08.986385 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Apr 24 23:36:08.986390 kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Apr 24 23:36:08.986396 kernel: ... version: 0
Apr 24 23:36:08.986401 kernel: ... bit width: 48
Apr 24 23:36:08.986825 kernel: ... generic registers: 6
Apr 24 23:36:08.986835 kernel: ... value mask: 0000ffffffffffff
Apr 24 23:36:08.986841 kernel: ... max period: 00007fffffffffff
Apr 24 23:36:08.986850 kernel: ... fixed-purpose events: 0
Apr 24 23:36:08.986856 kernel: ... event mask: 000000000000003f
Apr 24 23:36:08.986861 kernel: signal: max sigframe size: 3376
Apr 24 23:36:08.986866 kernel: rcu: Hierarchical SRCU implementation.
Apr 24 23:36:08.986872 kernel: rcu: Max phase no-delay instances is 400.
Apr 24 23:36:08.986877 kernel: smp: Bringing up secondary CPUs ...
Apr 24 23:36:08.986883 kernel: smpboot: x86: Booting SMP configuration:
Apr 24 23:36:08.986888 kernel: .... node #0, CPUs: #1
Apr 24 23:36:08.986893 kernel: smp: Brought up 1 node, 2 CPUs
Apr 24 23:36:08.986901 kernel: smpboot: Max logical packages: 1
Apr 24 23:36:08.986906 kernel: smpboot: Total of 2 processors activated (9599.99 BogoMIPS)
Apr 24 23:36:08.986912 kernel: devtmpfs: initialized
Apr 24 23:36:08.986917 kernel: x86/mm: Memory block size: 128MB
Apr 24 23:36:08.986922 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x7fb7f000-0x7fbfefff] (524288 bytes)
Apr 24 23:36:08.986928 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Apr 24 23:36:08.986933 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear)
Apr 24 23:36:08.986938 kernel: pinctrl core: initialized pinctrl subsystem
Apr 24 23:36:08.986943 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Apr 24 23:36:08.986951 kernel: audit: initializing netlink subsys (disabled)
Apr 24 23:36:08.986956 kernel: audit: type=2000 audit(1777073767.418:1): state=initialized audit_enabled=0 res=1
Apr 24 23:36:08.986961 kernel: thermal_sys: Registered thermal governor 'step_wise'
Apr 24 23:36:08.986966 kernel: thermal_sys: Registered thermal governor 'user_space'
Apr 24 23:36:08.986972 kernel: cpuidle: using governor menu
Apr 24 23:36:08.986977 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Apr 24 23:36:08.986982 kernel: dca service started, version 1.12.1
Apr 24 23:36:08.986987 kernel: PCI: MMCONFIG for domain 0000 [bus 00-ff] at [mem 0xe0000000-0xefffffff] (base 0xe0000000)
Apr 24 23:36:08.986993 kernel: PCI: Using configuration type 1 for base access
Apr 24 23:36:08.987000 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Apr 24 23:36:08.987005 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Apr 24 23:36:08.987011 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Apr 24 23:36:08.987016 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Apr 24 23:36:08.987021 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Apr 24 23:36:08.987026 kernel: ACPI: Added _OSI(Module Device)
Apr 24 23:36:08.987031 kernel: ACPI: Added _OSI(Processor Device)
Apr 24 23:36:08.987037 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Apr 24 23:36:08.987042 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Apr 24 23:36:08.987049 kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC
Apr 24 23:36:08.987054 kernel: ACPI: Interpreter enabled
Apr 24 23:36:08.987060 kernel: ACPI: PM: (supports S0 S5)
Apr 24 23:36:08.987065 kernel: ACPI: Using IOAPIC for interrupt routing
Apr 24 23:36:08.987070 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Apr 24 23:36:08.987075 kernel: PCI: Using E820 reservations for host bridge windows
Apr 24 23:36:08.987081 kernel: ACPI: Enabled 2 GPEs in block 00 to 3F
Apr 24 23:36:08.987086 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Apr 24 23:36:08.987238 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Apr 24 23:36:08.987348 kernel: acpi PNP0A08:00: _OSC: platform does not support [PCIeHotplug LTR]
Apr 24 23:36:08.990492 kernel: acpi PNP0A08:00: _OSC: OS now controls [PME AER PCIeCapability]
Apr 24 23:36:08.990505 kernel: PCI host bridge to bus 0000:00
Apr 24 23:36:08.990634 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
Apr 24 23:36:08.990727 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window]
Apr 24 23:36:08.990834 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Apr 24 23:36:08.990928 kernel: pci_bus 0000:00: root bus resource [mem 0x80000000-0xdfffffff window]
Apr 24 23:36:08.991015 kernel: pci_bus 0000:00: root bus resource [mem 0xf0000000-0xfebfffff window]
Apr 24 23:36:08.991102 kernel: pci_bus 0000:00: root bus resource [mem 0xc000000000-0xc7ffffffff window]
Apr 24 23:36:08.991191 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Apr 24 23:36:08.991300 kernel: pci 0000:00:00.0: [8086:29c0] type 00 class 0x060000
Apr 24 23:36:08.991405 kernel: pci 0000:00:01.0: [1af4:1050] type 00 class 0x030000
Apr 24 23:36:08.991767 kernel: pci 0000:00:01.0: reg 0x10: [mem 0x80000000-0x807fffff pref]
Apr 24 23:36:08.991882 kernel: pci 0000:00:01.0: reg 0x18: [mem 0xc060500000-0xc060503fff 64bit pref]
Apr 24 23:36:08.991980 kernel: pci 0000:00:01.0: reg 0x20: [mem 0x8138a000-0x8138afff]
Apr 24 23:36:08.992077 kernel: pci 0000:00:01.0: reg 0x30: [mem 0xffff0000-0xffffffff pref]
Apr 24 23:36:08.992175 kernel: pci 0000:00:01.0: BAR 0: assigned to efifb
Apr 24 23:36:08.992272 kernel: pci 0000:00:01.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Apr 24 23:36:08.992399 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400
Apr 24 23:36:08.992547 kernel: pci 0000:00:02.0: reg 0x10: [mem 0x81389000-0x81389fff]
Apr 24 23:36:08.992655 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400
Apr 24 23:36:08.992751 kernel: pci 0000:00:02.1: reg 0x10: [mem 0x81388000-0x81388fff]
Apr 24 23:36:08.992897 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400
Apr 24 23:36:08.992996 kernel: pci 0000:00:02.2: reg 0x10: [mem 0x81387000-0x81387fff]
Apr 24 23:36:08.993098 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400
Apr 24 23:36:08.993197 kernel: pci 0000:00:02.3: reg 0x10: [mem 0x81386000-0x81386fff]
Apr 24 23:36:08.993299 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400
Apr 24 23:36:08.993395 kernel: pci 0000:00:02.4: reg 0x10: [mem 0x81385000-0x81385fff]
Apr 24 23:36:08.995871 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400
Apr 24 23:36:08.995977 kernel: pci 0000:00:02.5: reg 0x10: [mem 0x81384000-0x81384fff]
Apr 24 23:36:08.996084 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400
Apr 24 23:36:08.996180 kernel: pci 0000:00:02.6: reg 0x10: [mem 0x81383000-0x81383fff]
Apr 24 23:36:08.996286 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400
Apr 24 23:36:08.996382 kernel: pci 0000:00:02.7: reg 0x10: [mem 0x81382000-0x81382fff]
Apr 24 23:36:08.996510 kernel: pci 0000:00:03.0: [1b36:000c] type 01 class 0x060400
Apr 24 23:36:08.996606 kernel: pci 0000:00:03.0: reg 0x10: [mem 0x81381000-0x81381fff]
Apr 24 23:36:08.996707 kernel: pci 0000:00:1f.0: [8086:2918] type 00 class 0x060100
Apr 24 23:36:08.996801 kernel: pci 0000:00:1f.0: quirk: [io 0x0600-0x067f] claimed by ICH6 ACPI/GPIO/TCO
Apr 24 23:36:08.996914 kernel: pci 0000:00:1f.2: [8086:2922] type 00 class 0x010601
Apr 24 23:36:08.997011 kernel: pci 0000:00:1f.2: reg 0x20: [io 0x6040-0x605f]
Apr 24 23:36:08.997112 kernel: pci 0000:00:1f.2: reg 0x24: [mem 0x81380000-0x81380fff]
Apr 24 23:36:08.997213 kernel: pci 0000:00:1f.3: [8086:2930] type 00 class 0x0c0500
Apr 24 23:36:08.997310 kernel: pci 0000:00:1f.3: reg 0x20: [io 0x6000-0x603f]
Apr 24 23:36:08.998278 kernel: pci 0000:01:00.0: [1af4:1041] type 00 class 0x020000
Apr 24 23:36:08.998402 kernel: pci 0000:01:00.0: reg 0x14: [mem 0x81200000-0x81200fff]
Apr 24 23:36:08.998521 kernel: pci 0000:01:00.0: reg 0x20: [mem 0xc060000000-0xc060003fff 64bit pref]
Apr 24 23:36:08.998623 kernel: pci 0000:01:00.0: reg 0x30: [mem 0xfff80000-0xffffffff pref]
Apr 24 23:36:08.998718 kernel: pci 0000:00:02.0: PCI bridge to [bus 01]
Apr 24 23:36:08.998814 kernel: pci 0000:00:02.0: bridge window [mem 0x81200000-0x812fffff]
Apr 24 23:36:08.998919 kernel: pci 0000:00:02.0: bridge window [mem 0xc060000000-0xc0600fffff 64bit pref]
Apr 24 23:36:08.999029 kernel: pci 0000:02:00.0: [1b36:000d] type 00 class 0x0c0330
Apr 24 23:36:08.999133 kernel: pci 0000:02:00.0: reg 0x10: [mem 0x81100000-0x81103fff 64bit]
Apr 24 23:36:08.999228 kernel: pci 0000:00:02.1: PCI bridge to [bus 02]
Apr 24 23:36:08.999323 kernel: pci 0000:00:02.1: bridge window [mem 0x81100000-0x811fffff]
Apr 24 23:36:09.000483 kernel: pci 0000:03:00.0: [1af4:1043] type 00 class 0x078000
Apr 24 23:36:09.000643 kernel: pci 0000:03:00.0: reg 0x14: [mem 0x81000000-0x81000fff]
Apr 24 23:36:09.000792 kernel: pci 0000:03:00.0: reg 0x20: [mem 0xc060100000-0xc060103fff 64bit pref]
Apr 24 23:36:09.000946 kernel: pci 0000:00:02.2: PCI bridge to [bus 03]
Apr 24 23:36:09.001089 kernel: pci 0000:00:02.2: bridge window [mem 0x81000000-0x810fffff]
Apr 24 23:36:09.001221 kernel: pci 0000:00:02.2: bridge window [mem 0xc060100000-0xc0601fffff 64bit pref]
Apr 24 23:36:09.001338 kernel: pci 0000:04:00.0: [1af4:1045] type 00 class 0x00ff00
Apr 24 23:36:09.002574 kernel: pci 0000:04:00.0: reg 0x20: [mem 0xc060200000-0xc060203fff 64bit pref]
Apr 24 23:36:09.002724 kernel: pci 0000:00:02.3: PCI bridge to [bus 04]
Apr 24 23:36:09.002871 kernel: pci 0000:00:02.3: bridge window [mem 0xc060200000-0xc0602fffff 64bit pref]
Apr 24 23:36:09.003022 kernel: pci 0000:05:00.0: [1af4:1044] type 00 class 0x00ff00
Apr 24 23:36:09.003167 kernel: pci 0000:05:00.0: reg 0x14: [mem 0x80f00000-0x80f00fff]
Apr 24 23:36:09.003309 kernel: pci 0000:05:00.0: reg 0x20: [mem 0xc060300000-0xc060303fff 64bit pref]
Apr 24 23:36:09.005186 kernel: pci 0000:00:02.4: PCI bridge to [bus 05]
Apr 24 23:36:09.005333 kernel: pci 0000:00:02.4: bridge window [mem 0x80f00000-0x80ffffff]
Apr 24 23:36:09.006578 kernel: pci 0000:00:02.4: bridge window [mem 0xc060300000-0xc0603fffff 64bit pref]
Apr 24 23:36:09.006740 kernel: pci 0000:06:00.0: [1af4:1048] type 00 class 0x010000
Apr 24 23:36:09.006898 kernel: pci 0000:06:00.0: reg 0x14: [mem 0x80e00000-0x80e00fff]
Apr 24 23:36:09.007058 kernel: pci 0000:06:00.0: reg 0x20: [mem 0xc060400000-0xc060403fff 64bit pref]
Apr 24 23:36:09.007195 kernel: pci 0000:00:02.5: PCI bridge to [bus 06]
Apr 24 23:36:09.007308 kernel: pci 0000:00:02.5: bridge window [mem 0x80e00000-0x80efffff]
Apr 24 23:36:09.011244 kernel: pci 0000:00:02.5: bridge window [mem 0xc060400000-0xc0604fffff 64bit pref]
Apr 24 23:36:09.011320 kernel: acpiphp: Slot [0] registered
Apr 24 23:36:09.011891 kernel: pci 0000:07:00.0: [1af4:1041] type 00 class 0x020000
Apr 24 23:36:09.012479 kernel: pci 0000:07:00.0: reg 0x14: [mem 0x80c00000-0x80c00fff]
Apr 24 23:36:09.012602 kernel: pci 0000:07:00.0: reg 0x20: [mem 0xc000000000-0xc000003fff 64bit pref]
Apr 24 23:36:09.012715 kernel: pci 0000:07:00.0: reg 0x30: [mem 0xfff80000-0xffffffff pref]
Apr 24 23:36:09.012829 kernel: pci 0000:00:02.6: PCI bridge to [bus 07]
Apr 24 23:36:09.012932 kernel: pci 0000:00:02.6: bridge window [mem 0x80c00000-0x80dfffff]
Apr 24 23:36:09.013031 kernel: pci 0000:00:02.6: bridge window [mem 0xc000000000-0xc01fffffff 64bit pref]
Apr 24 23:36:09.013038 kernel: acpiphp: Slot [0-2] registered
Apr 24 23:36:09.013137 kernel: pci 0000:00:02.7: PCI bridge to [bus 08]
Apr 24 23:36:09.013236 kernel: pci 0000:00:02.7: bridge window [mem 0x80a00000-0x80bfffff]
Apr 24 23:36:09.013335 kernel: pci 0000:00:02.7: bridge window [mem 0xc020000000-0xc03fffffff 64bit pref]
Apr 24 23:36:09.013346 kernel: acpiphp: Slot [0-3] registered
Apr 24 23:36:09.013485 kernel: pci 0000:00:03.0: PCI bridge to [bus 09]
Apr 24 23:36:09.013586 kernel: pci 0000:00:03.0: bridge window [mem 0x80800000-0x809fffff]
Apr 24 23:36:09.013686 kernel: pci 0000:00:03.0: bridge window [mem 0xc040000000-0xc05fffffff 64bit pref]
Apr 24 23:36:09.013692 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Apr 24 23:36:09.013698 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Apr 24 23:36:09.013703 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Apr 24 23:36:09.013708 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Apr 24 23:36:09.013714 kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 10
Apr 24 23:36:09.013723 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 10
Apr 24 23:36:09.013728 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 11
Apr 24 23:36:09.013733 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 11
Apr 24 23:36:09.013738 kernel: ACPI: PCI: Interrupt link GSIA configured for IRQ 16
Apr 24 23:36:09.013743 kernel: ACPI: PCI: Interrupt link GSIB configured for IRQ 17
Apr 24 23:36:09.013749 kernel: ACPI: PCI: Interrupt link GSIC configured for IRQ 18
Apr 24 23:36:09.013754 kernel: ACPI: PCI: Interrupt link GSID configured for IRQ 19
Apr 24 23:36:09.013759 kernel: ACPI: PCI: Interrupt link GSIE configured for IRQ 20
Apr 24 23:36:09.013764 kernel: ACPI: PCI: Interrupt link GSIF configured for IRQ 21
Apr 24 23:36:09.013772 kernel: ACPI: PCI: Interrupt link GSIG configured for IRQ 22
Apr 24 23:36:09.013777 kernel: ACPI: PCI: Interrupt link GSIH configured for IRQ 23
Apr 24 23:36:09.013782 kernel: iommu: Default domain type: Translated
Apr 24 23:36:09.013787 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Apr 24 23:36:09.013792 kernel: efivars: Registered efivars operations
Apr 24 23:36:09.013797 kernel: PCI: Using ACPI for IRQ routing
Apr 24 23:36:09.013803 kernel: PCI: pci_cache_line_size set to 64 bytes
Apr 24 23:36:09.013808 kernel: e820: reserve RAM buffer [mem 0x7ed3f000-0x7fffffff]
Apr 24 23:36:09.013826 kernel: e820: reserve RAM buffer [mem 0x7f8ed000-0x7fffffff]
Apr 24 23:36:09.013831 kernel: e820: reserve RAM buffer [mem 0x7ff7c000-0x7fffffff]
Apr 24 23:36:09.013836 kernel: e820: reserve RAM buffer [mem 0x17a000000-0x17bffffff]
Apr 24 23:36:09.013938 kernel: pci 0000:00:01.0: vgaarb: setting as boot VGA device
Apr 24 23:36:09.014995 kernel: pci 0000:00:01.0: vgaarb: bridge control possible
Apr 24 23:36:09.015161 kernel: pci 0000:00:01.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Apr 24 23:36:09.015174 kernel: vgaarb: loaded
Apr 24 23:36:09.015183 kernel: hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0
Apr 24 23:36:09.015191 kernel: hpet0: 3 comparators, 64-bit 100.000000 MHz counter
Apr 24 23:36:09.015207 kernel: clocksource: Switched to clocksource kvm-clock
Apr 24 23:36:09.015216 kernel: VFS: Disk quotas dquot_6.6.0
Apr 24 23:36:09.015224 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Apr 24 23:36:09.015229 kernel: pnp: PnP ACPI init
Apr 24 23:36:09.015351 kernel: system 00:04: [mem 0xe0000000-0xefffffff window] has been reserved
Apr 24 23:36:09.015360 kernel: pnp: PnP ACPI: found 5 devices
Apr 24 23:36:09.015366 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Apr 24 23:36:09.015372 kernel: NET: Registered PF_INET protocol family
Apr 24 23:36:09.015400 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear)
Apr 24 23:36:09.015430 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear)
Apr 24 23:36:09.015437 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Apr 24 23:36:09.015443 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear)
Apr 24 23:36:09.015448 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear)
Apr 24 23:36:09.015454 kernel: TCP: Hash tables configured (established 32768 bind 32768)
Apr 24 23:36:09.015460 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear)
Apr 24 23:36:09.015465 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear)
Apr 24 23:36:09.015471 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Apr 24 23:36:09.015481 kernel: NET: Registered PF_XDP protocol family
Apr 24 23:36:09.015604 kernel: pci 0000:01:00.0: can't claim BAR 6 [mem 0xfff80000-0xffffffff pref]: no compatible bridge window
Apr 24 23:36:09.015732 kernel: pci 0000:07:00.0: can't claim BAR 6 [mem 0xfff80000-0xffffffff pref]: no compatible bridge window
Apr 24 23:36:09.015862 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000
Apr 24 23:36:09.015983 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000
Apr 24 23:36:09.016104 kernel: pci 0000:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000
Apr 24 23:36:09.016209 kernel: pci 0000:00:02.6: BAR 13: assigned [io 0x1000-0x1fff]
Apr 24 23:36:09.016316 kernel: pci 0000:00:02.7: BAR 13: assigned [io 0x2000-0x2fff]
Apr 24 23:36:09.020235 kernel: pci 0000:00:03.0: BAR 13: assigned [io 0x3000-0x3fff]
Apr 24 23:36:09.020358 kernel: pci 0000:01:00.0: BAR 6: assigned [mem 0x81280000-0x812fffff pref]
Apr 24 23:36:09.020477 kernel: pci 0000:00:02.0: PCI bridge to [bus 01]
Apr 24 23:36:09.020608 kernel: pci 0000:00:02.0: bridge window [mem 0x81200000-0x812fffff]
Apr 24 23:36:09.020720 kernel: pci 0000:00:02.0: bridge window [mem 0xc060000000-0xc0600fffff 64bit pref]
Apr 24 23:36:09.020848 kernel: pci 0000:00:02.1: PCI bridge to [bus 02]
Apr 24 23:36:09.020963 kernel: pci 0000:00:02.1: bridge window [mem 0x81100000-0x811fffff]
Apr 24 23:36:09.021077 kernel: pci 0000:00:02.2: PCI bridge to [bus 03]
Apr 24 23:36:09.021190 kernel: pci 0000:00:02.2: bridge window [mem 0x81000000-0x810fffff]
Apr 24 23:36:09.021288 kernel: pci 0000:00:02.2: bridge window [mem 0xc060100000-0xc0601fffff 64bit pref]
Apr 24 23:36:09.021385 kernel: pci 0000:00:02.3: PCI bridge to [bus 04]
Apr 24 23:36:09.021494 kernel: pci 0000:00:02.3: bridge window [mem 0xc060200000-0xc0602fffff 64bit pref]
Apr 24 23:36:09.021612 kernel: pci 0000:00:02.4: PCI bridge to [bus 05]
Apr 24 23:36:09.021710 kernel: pci 0000:00:02.4: bridge window [mem 0x80f00000-0x80ffffff]
Apr 24 23:36:09.021804 kernel: pci 0000:00:02.4: bridge window [mem 0xc060300000-0xc0603fffff 64bit pref]
Apr 24 23:36:09.021925 kernel: pci 0000:00:02.5: PCI bridge to [bus 06]
Apr 24 23:36:09.022025 kernel: pci 0000:00:02.5: bridge window [mem 0x80e00000-0x80efffff]
Apr 24 23:36:09.022138 kernel: pci 0000:00:02.5: bridge window [mem 0xc060400000-0xc0604fffff 64bit pref]
Apr 24 23:36:09.022261 kernel: pci 0000:07:00.0: BAR 6: assigned [mem 0x80c80000-0x80cfffff pref]
Apr 24 23:36:09.022380 kernel: pci 0000:00:02.6: PCI bridge to [bus 07]
Apr 24 23:36:09.022501 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x1fff]
Apr 24 23:36:09.022663 kernel: pci 0000:00:02.6: bridge window [mem 0x80c00000-0x80dfffff]
Apr 24 23:36:09.022762 kernel: pci 0000:00:02.6: bridge window [mem 0xc000000000-0xc01fffffff 64bit pref]
Apr 24 23:36:09.022867 kernel: pci 0000:00:02.7: PCI bridge to [bus 08]
Apr 24 23:36:09.022979 kernel: pci 0000:00:02.7: bridge window [io 0x2000-0x2fff]
Apr 24 23:36:09.023086 kernel: pci 0000:00:02.7: bridge window [mem 0x80a00000-0x80bfffff]
Apr 24 23:36:09.023182 kernel: pci 0000:00:02.7: bridge window [mem 0xc020000000-0xc03fffffff 64bit pref]
Apr 24 23:36:09.023296 kernel: pci 0000:00:03.0: PCI bridge to [bus 09]
Apr 24 23:36:09.023402 kernel: pci 0000:00:03.0: bridge window [io 0x3000-0x3fff]
Apr 24 23:36:09.023573 kernel: pci 0000:00:03.0: bridge window [mem 0x80800000-0x809fffff]
Apr 24 23:36:09.023681 kernel: pci 0000:00:03.0: bridge window [mem 0xc040000000-0xc05fffffff 64bit pref]
Apr 24 23:36:09.023782 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window] Apr
24 23:36:09.023904 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window] Apr 24 23:36:09.024002 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window] Apr 24 23:36:09.024103 kernel: pci_bus 0000:00: resource 7 [mem 0x80000000-0xdfffffff window] Apr 24 23:36:09.024196 kernel: pci_bus 0000:00: resource 8 [mem 0xf0000000-0xfebfffff window] Apr 24 23:36:09.024298 kernel: pci_bus 0000:00: resource 9 [mem 0xc000000000-0xc7ffffffff window] Apr 24 23:36:09.027471 kernel: pci_bus 0000:01: resource 1 [mem 0x81200000-0x812fffff] Apr 24 23:36:09.027593 kernel: pci_bus 0000:01: resource 2 [mem 0xc060000000-0xc0600fffff 64bit pref] Apr 24 23:36:09.027701 kernel: pci_bus 0000:02: resource 1 [mem 0x81100000-0x811fffff] Apr 24 23:36:09.027805 kernel: pci_bus 0000:03: resource 1 [mem 0x81000000-0x810fffff] Apr 24 23:36:09.027908 kernel: pci_bus 0000:03: resource 2 [mem 0xc060100000-0xc0601fffff 64bit pref] Apr 24 23:36:09.028006 kernel: pci_bus 0000:04: resource 2 [mem 0xc060200000-0xc0602fffff 64bit pref] Apr 24 23:36:09.028129 kernel: pci_bus 0000:05: resource 1 [mem 0x80f00000-0x80ffffff] Apr 24 23:36:09.028224 kernel: pci_bus 0000:05: resource 2 [mem 0xc060300000-0xc0603fffff 64bit pref] Apr 24 23:36:09.028335 kernel: pci_bus 0000:06: resource 1 [mem 0x80e00000-0x80efffff] Apr 24 23:36:09.028464 kernel: pci_bus 0000:06: resource 2 [mem 0xc060400000-0xc0604fffff 64bit pref] Apr 24 23:36:09.028588 kernel: pci_bus 0000:07: resource 0 [io 0x1000-0x1fff] Apr 24 23:36:09.028684 kernel: pci_bus 0000:07: resource 1 [mem 0x80c00000-0x80dfffff] Apr 24 23:36:09.028788 kernel: pci_bus 0000:07: resource 2 [mem 0xc000000000-0xc01fffffff 64bit pref] Apr 24 23:36:09.028907 kernel: pci_bus 0000:08: resource 0 [io 0x2000-0x2fff] Apr 24 23:36:09.029014 kernel: pci_bus 0000:08: resource 1 [mem 0x80a00000-0x80bfffff] Apr 24 23:36:09.029108 kernel: pci_bus 0000:08: resource 2 [mem 0xc020000000-0xc03fffffff 64bit pref] Apr 24 23:36:09.029228 kernel: pci_bus 0000:09: resource 0 
[io 0x3000-0x3fff] Apr 24 23:36:09.029336 kernel: pci_bus 0000:09: resource 1 [mem 0x80800000-0x809fffff] Apr 24 23:36:09.031274 kernel: pci_bus 0000:09: resource 2 [mem 0xc040000000-0xc05fffffff 64bit pref] Apr 24 23:36:09.031293 kernel: ACPI: \_SB_.GSIG: Enabled at IRQ 22 Apr 24 23:36:09.031300 kernel: PCI: CLS 0 bytes, default 64 Apr 24 23:36:09.031306 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB) Apr 24 23:36:09.031312 kernel: software IO TLB: mapped [mem 0x0000000077ffd000-0x000000007bffd000] (64MB) Apr 24 23:36:09.031318 kernel: Initialise system trusted keyrings Apr 24 23:36:09.031328 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0 Apr 24 23:36:09.031333 kernel: Key type asymmetric registered Apr 24 23:36:09.031339 kernel: Asymmetric key parser 'x509' registered Apr 24 23:36:09.031345 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 251) Apr 24 23:36:09.031351 kernel: io scheduler mq-deadline registered Apr 24 23:36:09.031356 kernel: io scheduler kyber registered Apr 24 23:36:09.031362 kernel: io scheduler bfq registered Apr 24 23:36:09.031555 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 24 Apr 24 23:36:09.031659 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 24 Apr 24 23:36:09.031778 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 25 Apr 24 23:36:09.031892 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 25 Apr 24 23:36:09.031993 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 26 Apr 24 23:36:09.032105 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 26 Apr 24 23:36:09.032239 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 27 Apr 24 23:36:09.032354 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 27 Apr 24 23:36:09.032468 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 28 Apr 24 23:36:09.032567 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 28 Apr 24 23:36:09.032671 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 29 Apr 24 
23:36:09.032767 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 29 Apr 24 23:36:09.032879 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 30 Apr 24 23:36:09.032976 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 30 Apr 24 23:36:09.033071 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 31 Apr 24 23:36:09.033166 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 31 Apr 24 23:36:09.033177 kernel: ACPI: \_SB_.GSIH: Enabled at IRQ 23 Apr 24 23:36:09.033317 kernel: pcieport 0000:00:03.0: PME: Signaling with IRQ 32 Apr 24 23:36:09.033572 kernel: pcieport 0000:00:03.0: AER: enabled with IRQ 32 Apr 24 23:36:09.033589 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00 Apr 24 23:36:09.033598 kernel: ACPI: \_SB_.GSIF: Enabled at IRQ 21 Apr 24 23:36:09.033607 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Apr 24 23:36:09.033615 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Apr 24 23:36:09.033624 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12 Apr 24 23:36:09.033633 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1 Apr 24 23:36:09.033642 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12 Apr 24 23:36:09.033786 kernel: rtc_cmos 00:03: RTC can wake from S4 Apr 24 23:36:09.033805 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0 Apr 24 23:36:09.033943 kernel: rtc_cmos 00:03: registered as rtc0 Apr 24 23:36:09.034059 kernel: rtc_cmos 00:03: setting system clock to 2026-04-24T23:36:08 UTC (1777073768) Apr 24 23:36:09.034175 kernel: rtc_cmos 00:03: alarms up to one day, y3k, 242 bytes nvram, hpet irqs Apr 24 23:36:09.034184 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled Apr 24 23:36:09.034193 kernel: efifb: probing for efifb Apr 24 23:36:09.034202 kernel: efifb: framebuffer at 0x80000000, using 4032k, total 4032k Apr 24 23:36:09.034210 kernel: efifb: mode is 1280x800x32, linelength=5120, pages=1 Apr 24 
23:36:09.034222 kernel: efifb: scrolling: redraw Apr 24 23:36:09.034231 kernel: efifb: Truecolor: size=8:8:8:8, shift=24:16:8:0 Apr 24 23:36:09.034242 kernel: Console: switching to colour frame buffer device 160x50 Apr 24 23:36:09.034250 kernel: fb0: EFI VGA frame buffer device Apr 24 23:36:09.034258 kernel: pstore: Using crash dump compression: deflate Apr 24 23:36:09.034266 kernel: pstore: Registered efi_pstore as persistent store backend Apr 24 23:36:09.034275 kernel: NET: Registered PF_INET6 protocol family Apr 24 23:36:09.034283 kernel: Segment Routing with IPv6 Apr 24 23:36:09.034291 kernel: In-situ OAM (IOAM) with IPv6 Apr 24 23:36:09.034302 kernel: NET: Registered PF_PACKET protocol family Apr 24 23:36:09.034311 kernel: Key type dns_resolver registered Apr 24 23:36:09.034320 kernel: IPI shorthand broadcast: enabled Apr 24 23:36:09.034328 kernel: sched_clock: Marking stable (1396009000, 222194219)->(1670297262, -52094043) Apr 24 23:36:09.034337 kernel: registered taskstats version 1 Apr 24 23:36:09.034346 kernel: Loading compiled-in X.509 certificates Apr 24 23:36:09.034355 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.6.127-flatcar: 507f116e6718ec7535b55c873de10edf9b6fe124' Apr 24 23:36:09.034363 kernel: Key type .fscrypt registered Apr 24 23:36:09.034371 kernel: Key type fscrypt-provisioning registered Apr 24 23:36:09.034382 kernel: ima: No TPM chip found, activating TPM-bypass! 
Apr 24 23:36:09.034391 kernel: ima: Allocated hash algorithm: sha1 Apr 24 23:36:09.034400 kernel: ima: No architecture policies found Apr 24 23:36:09.034422 kernel: clk: Disabling unused clocks Apr 24 23:36:09.034431 kernel: Freeing unused kernel image (initmem) memory: 42896K Apr 24 23:36:09.034440 kernel: Write protecting the kernel read-only data: 36864k Apr 24 23:36:09.034448 kernel: Freeing unused kernel image (rodata/data gap) memory: 1824K Apr 24 23:36:09.034457 kernel: Run /init as init process Apr 24 23:36:09.034465 kernel: with arguments: Apr 24 23:36:09.034478 kernel: /init Apr 24 23:36:09.034486 kernel: with environment: Apr 24 23:36:09.034495 kernel: HOME=/ Apr 24 23:36:09.034503 kernel: TERM=linux Apr 24 23:36:09.034514 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified) Apr 24 23:36:09.034524 systemd[1]: Detected virtualization kvm. Apr 24 23:36:09.034534 systemd[1]: Detected architecture x86-64. Apr 24 23:36:09.034546 systemd[1]: Running in initrd. Apr 24 23:36:09.034555 systemd[1]: No hostname configured, using default hostname. Apr 24 23:36:09.034563 systemd[1]: Hostname set to . Apr 24 23:36:09.034571 systemd[1]: Initializing machine ID from VM UUID. Apr 24 23:36:09.034579 systemd[1]: Queued start job for default target initrd.target. Apr 24 23:36:09.034587 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Apr 24 23:36:09.034595 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Apr 24 23:36:09.034603 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... 
Apr 24 23:36:09.034615 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Apr 24 23:36:09.034623 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Apr 24 23:36:09.034632 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Apr 24 23:36:09.034641 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Apr 24 23:36:09.034649 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Apr 24 23:36:09.034657 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Apr 24 23:36:09.034666 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Apr 24 23:36:09.034678 systemd[1]: Reached target paths.target - Path Units. Apr 24 23:36:09.034686 systemd[1]: Reached target slices.target - Slice Units. Apr 24 23:36:09.034695 systemd[1]: Reached target swap.target - Swaps. Apr 24 23:36:09.034703 systemd[1]: Reached target timers.target - Timer Units. Apr 24 23:36:09.034710 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Apr 24 23:36:09.034719 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Apr 24 23:36:09.034727 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Apr 24 23:36:09.034735 systemd[1]: Listening on systemd-journald.socket - Journal Socket. Apr 24 23:36:09.034746 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Apr 24 23:36:09.034753 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Apr 24 23:36:09.034762 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Apr 24 23:36:09.034769 systemd[1]: Reached target sockets.target - Socket Units. 
Apr 24 23:36:09.034781 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Apr 24 23:36:09.034789 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Apr 24 23:36:09.034797 systemd[1]: Finished network-cleanup.service - Network Cleanup. Apr 24 23:36:09.034805 systemd[1]: Starting systemd-fsck-usr.service... Apr 24 23:36:09.034814 systemd[1]: Starting systemd-journald.service - Journal Service... Apr 24 23:36:09.034835 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Apr 24 23:36:09.034844 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Apr 24 23:36:09.034875 systemd-journald[188]: Collecting audit messages is disabled. Apr 24 23:36:09.034895 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Apr 24 23:36:09.034906 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Apr 24 23:36:09.034914 systemd[1]: Finished systemd-fsck-usr.service. Apr 24 23:36:09.034922 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Apr 24 23:36:09.034931 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Apr 24 23:36:09.034942 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Apr 24 23:36:09.034949 kernel: Bridge firewalling registered Apr 24 23:36:09.034957 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Apr 24 23:36:09.034966 systemd-journald[188]: Journal started Apr 24 23:36:09.034987 systemd-journald[188]: Runtime Journal (/run/log/journal/77bb39b3f7464047bc97ad12e8ae1e04) is 8.0M, max 76.3M, 68.3M free. Apr 24 23:36:08.985853 systemd-modules-load[189]: Inserted module 'overlay' Apr 24 23:36:09.038761 systemd[1]: Started systemd-journald.service - Journal Service. 
Apr 24 23:36:09.030574 systemd-modules-load[189]: Inserted module 'br_netfilter' Apr 24 23:36:09.039741 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Apr 24 23:36:09.048795 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Apr 24 23:36:09.055585 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Apr 24 23:36:09.059565 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Apr 24 23:36:09.064052 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Apr 24 23:36:09.069332 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Apr 24 23:36:09.077750 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Apr 24 23:36:09.078380 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Apr 24 23:36:09.087593 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Apr 24 23:36:09.090286 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Apr 24 23:36:09.094338 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Apr 24 23:36:09.098458 dracut-cmdline[221]: dracut-dracut-053 Apr 24 23:36:09.101528 dracut-cmdline[221]: Using kernel command line parameters: rd.driver.pre=btrfs rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=hetzner verity.usrhash=c8442747465ed99a522e07b8746f6a7817fb39c2025d7438698e3b90e9c0defb Apr 24 23:36:09.128910 systemd-resolved[228]: Positive Trust Anchors: Apr 24 23:36:09.128921 systemd-resolved[228]: . 
IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Apr 24 23:36:09.128944 systemd-resolved[228]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Apr 24 23:36:09.133152 systemd-resolved[228]: Defaulting to hostname 'linux'. Apr 24 23:36:09.135432 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Apr 24 23:36:09.136430 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Apr 24 23:36:09.173429 kernel: SCSI subsystem initialized Apr 24 23:36:09.180437 kernel: Loading iSCSI transport class v2.0-870. Apr 24 23:36:09.189438 kernel: iscsi: registered transport (tcp) Apr 24 23:36:09.207231 kernel: iscsi: registered transport (qla4xxx) Apr 24 23:36:09.207299 kernel: QLogic iSCSI HBA Driver Apr 24 23:36:09.250401 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Apr 24 23:36:09.255539 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Apr 24 23:36:09.280097 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. 
Apr 24 23:36:09.280156 kernel: device-mapper: uevent: version 1.0.3 Apr 24 23:36:09.280166 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@redhat.com Apr 24 23:36:09.320445 kernel: raid6: avx512x4 gen() 48008 MB/s Apr 24 23:36:09.338448 kernel: raid6: avx512x2 gen() 51955 MB/s Apr 24 23:36:09.356446 kernel: raid6: avx512x1 gen() 48262 MB/s Apr 24 23:36:09.374438 kernel: raid6: avx2x4 gen() 53400 MB/s Apr 24 23:36:09.392434 kernel: raid6: avx2x2 gen() 55880 MB/s Apr 24 23:36:09.411591 kernel: raid6: avx2x1 gen() 46943 MB/s Apr 24 23:36:09.411650 kernel: raid6: using algorithm avx2x2 gen() 55880 MB/s Apr 24 23:36:09.431519 kernel: raid6: .... xor() 37226 MB/s, rmw enabled Apr 24 23:36:09.431565 kernel: raid6: using avx512x2 recovery algorithm Apr 24 23:36:09.448446 kernel: xor: automatically using best checksumming function avx Apr 24 23:36:09.560442 kernel: Btrfs loaded, zoned=no, fsverity=no Apr 24 23:36:09.577551 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Apr 24 23:36:09.585657 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Apr 24 23:36:09.599267 systemd-udevd[407]: Using default interface naming scheme 'v255'. Apr 24 23:36:09.603122 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Apr 24 23:36:09.611667 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Apr 24 23:36:09.641469 dracut-pre-trigger[415]: rd.md=0: removing MD RAID activation Apr 24 23:36:09.683399 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Apr 24 23:36:09.688534 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Apr 24 23:36:09.774561 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Apr 24 23:36:09.786376 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... 
Apr 24 23:36:09.810485 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Apr 24 23:36:09.811401 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Apr 24 23:36:09.813204 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Apr 24 23:36:09.814074 systemd[1]: Reached target remote-fs.target - Remote File Systems. Apr 24 23:36:09.819551 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Apr 24 23:36:09.829852 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Apr 24 23:36:09.869429 kernel: cryptd: max_cpu_qlen set to 1000 Apr 24 23:36:09.877425 kernel: scsi host0: Virtio SCSI HBA Apr 24 23:36:09.877016 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Apr 24 23:36:09.877114 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Apr 24 23:36:09.877675 systemd[1]: Stopping dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Apr 24 23:36:09.878985 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Apr 24 23:36:09.879119 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Apr 24 23:36:09.879486 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Apr 24 23:36:09.890528 kernel: scsi 0:0:0:0: Direct-Access QEMU QEMU HARDDISK 2.5+ PQ: 0 ANSI: 5 Apr 24 23:36:09.889584 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Apr 24 23:36:09.902427 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Apr 24 23:36:09.910243 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Apr 24 23:36:09.920426 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Apr 24 23:36:09.921083 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. 
Apr 24 23:36:09.921742 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Apr 24 23:36:09.921816 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Apr 24 23:36:09.922219 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Apr 24 23:36:09.931926 kernel: AVX2 version of gcm_enc/dec engaged. Apr 24 23:36:09.931136 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Apr 24 23:36:09.940423 kernel: ACPI: bus type USB registered Apr 24 23:36:09.943725 kernel: AES CTR mode by8 optimization enabled Apr 24 23:36:09.943745 kernel: usbcore: registered new interface driver usbfs Apr 24 23:36:09.946650 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Apr 24 23:36:09.951943 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Apr 24 23:36:09.965485 kernel: usbcore: registered new interface driver hub Apr 24 23:36:09.965520 kernel: usbcore: registered new device driver usb Apr 24 23:36:09.966436 kernel: libata version 3.00 loaded. Apr 24 23:36:09.989149 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. 
Apr 24 23:36:09.992445 kernel: ahci 0000:00:1f.2: version 3.0 Apr 24 23:36:09.992620 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller Apr 24 23:36:09.996107 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 1 Apr 24 23:36:09.996272 kernel: ACPI: \_SB_.GSIA: Enabled at IRQ 16 Apr 24 23:36:09.996282 kernel: xhci_hcd 0000:02:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010 Apr 24 23:36:10.002484 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller Apr 24 23:36:10.002691 kernel: ahci 0000:00:1f.2: AHCI 0001.0000 32 slots 6 ports 1.5 Gbps 0x3f impl SATA mode Apr 24 23:36:10.002815 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 2 Apr 24 23:36:10.002964 kernel: ahci 0000:00:1f.2: flags: 64bit ncq only Apr 24 23:36:10.003109 kernel: xhci_hcd 0000:02:00.0: Host supports USB 3.0 SuperSpeed Apr 24 23:36:10.009447 kernel: sd 0:0:0:0: Power-on or device reset occurred Apr 24 23:36:10.010276 kernel: hub 1-0:1.0: USB hub found Apr 24 23:36:10.010459 kernel: sd 0:0:0:0: [sda] 160006144 512-byte logical blocks: (81.9 GB/76.3 GiB) Apr 24 23:36:10.012690 kernel: hub 1-0:1.0: 4 ports detected Apr 24 23:36:10.012851 kernel: sd 0:0:0:0: [sda] Write Protect is off Apr 24 23:36:10.014573 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM. Apr 24 23:36:10.014608 kernel: sd 0:0:0:0: [sda] Mode Sense: 63 00 00 08 Apr 24 23:36:10.015816 kernel: hub 2-0:1.0: USB hub found Apr 24 23:36:10.015975 kernel: sd 0:0:0:0: [sda] Write cache: enabled, read cache: enabled, doesn't support DPO or FUA Apr 24 23:36:10.018523 kernel: hub 2-0:1.0: 4 ports detected Apr 24 23:36:10.018681 kernel: scsi host1: ahci Apr 24 23:36:10.024482 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. 
Apr 24 23:36:10.025431 kernel: scsi host2: ahci Apr 24 23:36:10.025633 kernel: GPT:17805311 != 160006143 Apr 24 23:36:10.028589 kernel: scsi host3: ahci Apr 24 23:36:10.028757 kernel: GPT:Alternate GPT header not at the end of the disk. Apr 24 23:36:10.028767 kernel: GPT:17805311 != 160006143 Apr 24 23:36:10.029706 kernel: scsi host4: ahci Apr 24 23:36:10.029867 kernel: GPT: Use GNU Parted to correct GPT errors. Apr 24 23:36:10.033428 kernel: scsi host5: ahci Apr 24 23:36:10.033592 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Apr 24 23:36:10.036098 kernel: scsi host6: ahci Apr 24 23:36:10.036139 kernel: sd 0:0:0:0: [sda] Attached SCSI disk Apr 24 23:36:10.047434 kernel: ata1: SATA max UDMA/133 abar m4096@0x81380000 port 0x81380100 irq 51 Apr 24 23:36:10.047460 kernel: ata2: SATA max UDMA/133 abar m4096@0x81380000 port 0x81380180 irq 51 Apr 24 23:36:10.049430 kernel: ata3: SATA max UDMA/133 abar m4096@0x81380000 port 0x81380200 irq 51 Apr 24 23:36:10.054353 kernel: ata4: SATA max UDMA/133 abar m4096@0x81380000 port 0x81380280 irq 51 Apr 24 23:36:10.054379 kernel: ata5: SATA max UDMA/133 abar m4096@0x81380000 port 0x81380300 irq 51 Apr 24 23:36:10.057704 kernel: ata6: SATA max UDMA/133 abar m4096@0x81380000 port 0x81380380 irq 51 Apr 24 23:36:10.076639 kernel: BTRFS: device label OEM devid 1 transid 9 /dev/sda6 scanned by (udev-worker) (460) Apr 24 23:36:10.076674 kernel: BTRFS: device fsid 077bb4ac-fe88-409a-8f61-fdf28cadf681 devid 1 transid 31 /dev/sda3 scanned by (udev-worker) (474) Apr 24 23:36:10.088808 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - QEMU_HARDDISK EFI-SYSTEM. Apr 24 23:36:10.092874 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - QEMU_HARDDISK ROOT. Apr 24 23:36:10.096319 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - QEMU_HARDDISK USR-A. Apr 24 23:36:10.096997 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - QEMU_HARDDISK USR-A. 
Apr 24 23:36:10.100867 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM. Apr 24 23:36:10.110535 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Apr 24 23:36:10.115680 disk-uuid[594]: Primary Header is updated. Apr 24 23:36:10.115680 disk-uuid[594]: Secondary Entries is updated. Apr 24 23:36:10.115680 disk-uuid[594]: Secondary Header is updated. Apr 24 23:36:10.120437 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Apr 24 23:36:10.126439 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Apr 24 23:36:10.253432 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd Apr 24 23:36:10.374442 kernel: ata6: SATA link down (SStatus 0 SControl 300) Apr 24 23:36:10.374532 kernel: ata3: SATA link down (SStatus 0 SControl 300) Apr 24 23:36:10.388771 kernel: ata1: SATA link up 1.5 Gbps (SStatus 113 SControl 300) Apr 24 23:36:10.388825 kernel: ata5: SATA link down (SStatus 0 SControl 300) Apr 24 23:36:10.394463 kernel: ata2: SATA link down (SStatus 0 SControl 300) Apr 24 23:36:10.400510 kernel: ata4: SATA link down (SStatus 0 SControl 300) Apr 24 23:36:10.406454 kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100 Apr 24 23:36:10.406498 kernel: ata1.00: applying bridge limits Apr 24 23:36:10.409545 kernel: ata1.00: configured for UDMA/100 Apr 24 23:36:10.419617 kernel: scsi 1:0:0:0: CD-ROM QEMU QEMU DVD-ROM 2.5+ PQ: 0 ANSI: 5 Apr 24 23:36:10.425468 kernel: hid: raw HID events driver (C) Jiri Kosina Apr 24 23:36:10.456752 kernel: usbcore: registered new interface driver usbhid Apr 24 23:36:10.456829 kernel: usbhid: USB HID core driver Apr 24 23:36:10.475376 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:02.1/0000:02:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input2 Apr 24 23:36:10.475463 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:02:00.0-1/input0 Apr 24 23:36:10.475823 kernel: sr 1:0:0:0: [sr0] scsi3-mmc 
drive: 4x/4x cd/rw xa/form2 tray Apr 24 23:36:10.477538 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20 Apr 24 23:36:10.490511 kernel: sr 1:0:0:0: Attached scsi CD-ROM sr0 Apr 24 23:36:11.133498 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Apr 24 23:36:11.137016 disk-uuid[595]: The operation has completed successfully. Apr 24 23:36:11.201861 systemd[1]: disk-uuid.service: Deactivated successfully. Apr 24 23:36:11.201958 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Apr 24 23:36:11.221549 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Apr 24 23:36:11.224647 sh[614]: Success Apr 24 23:36:11.236448 kernel: device-mapper: verity: sha256 using implementation "sha256-ni" Apr 24 23:36:11.271791 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Apr 24 23:36:11.280404 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Apr 24 23:36:11.280981 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. Apr 24 23:36:11.298350 kernel: BTRFS info (device dm-0): first mount of filesystem 077bb4ac-fe88-409a-8f61-fdf28cadf681 Apr 24 23:36:11.298393 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Apr 24 23:36:11.298402 kernel: BTRFS warning (device dm-0): 'nologreplay' is deprecated, use 'rescue=nologreplay' instead Apr 24 23:36:11.302879 kernel: BTRFS info (device dm-0): disabling log replay at mount time Apr 24 23:36:11.302906 kernel: BTRFS info (device dm-0): using free space tree Apr 24 23:36:11.313425 kernel: BTRFS info (device dm-0): enabling ssd optimizations Apr 24 23:36:11.315648 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Apr 24 23:36:11.316530 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Apr 24 23:36:11.320542 systemd[1]: Starting ignition-setup.service - Ignition (setup)... 
Apr 24 23:36:11.321437 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Apr 24 23:36:11.336432 kernel: BTRFS info (device sda6): first mount of filesystem 926930fb-88b5-4cf4-bdd1-3374ab036b7b Apr 24 23:36:11.336464 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Apr 24 23:36:11.338694 kernel: BTRFS info (device sda6): using free space tree Apr 24 23:36:11.346562 kernel: BTRFS info (device sda6): enabling ssd optimizations Apr 24 23:36:11.346586 kernel: BTRFS info (device sda6): auto enabling async discard Apr 24 23:36:11.358891 kernel: BTRFS info (device sda6): last unmount of filesystem 926930fb-88b5-4cf4-bdd1-3374ab036b7b Apr 24 23:36:11.358617 systemd[1]: mnt-oem.mount: Deactivated successfully. Apr 24 23:36:11.365883 systemd[1]: Finished ignition-setup.service - Ignition (setup). Apr 24 23:36:11.371549 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Apr 24 23:36:11.426364 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Apr 24 23:36:11.434540 systemd[1]: Starting systemd-networkd.service - Network Configuration... 
Apr 24 23:36:11.435009 ignition[730]: Ignition 2.19.0
Apr 24 23:36:11.435018 ignition[730]: Stage: fetch-offline
Apr 24 23:36:11.435051 ignition[730]: no configs at "/usr/lib/ignition/base.d"
Apr 24 23:36:11.435059 ignition[730]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Apr 24 23:36:11.435156 ignition[730]: parsed url from cmdline: ""
Apr 24 23:36:11.435160 ignition[730]: no config URL provided
Apr 24 23:36:11.435167 ignition[730]: reading system config file "/usr/lib/ignition/user.ign"
Apr 24 23:36:11.435175 ignition[730]: no config at "/usr/lib/ignition/user.ign"
Apr 24 23:36:11.435179 ignition[730]: failed to fetch config: resource requires networking
Apr 24 23:36:11.436292 ignition[730]: Ignition finished successfully
Apr 24 23:36:11.438653 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline).
Apr 24 23:36:11.455762 systemd-networkd[799]: lo: Link UP
Apr 24 23:36:11.455772 systemd-networkd[799]: lo: Gained carrier
Apr 24 23:36:11.458115 systemd-networkd[799]: Enumeration completed
Apr 24 23:36:11.458293 systemd[1]: Started systemd-networkd.service - Network Configuration.
Apr 24 23:36:11.458760 systemd[1]: Reached target network.target - Network.
Apr 24 23:36:11.458950 systemd-networkd[799]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Apr 24 23:36:11.458953 systemd-networkd[799]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Apr 24 23:36:11.459910 systemd-networkd[799]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Apr 24 23:36:11.459914 systemd-networkd[799]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network.
Apr 24 23:36:11.460776 systemd-networkd[799]: eth0: Link UP
Apr 24 23:36:11.460780 systemd-networkd[799]: eth0: Gained carrier
Apr 24 23:36:11.460786 systemd-networkd[799]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Apr 24 23:36:11.463868 systemd-networkd[799]: eth1: Link UP
Apr 24 23:36:11.463872 systemd-networkd[799]: eth1: Gained carrier
Apr 24 23:36:11.463878 systemd-networkd[799]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Apr 24 23:36:11.470544 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)...
Apr 24 23:36:11.480674 ignition[803]: Ignition 2.19.0
Apr 24 23:36:11.480683 ignition[803]: Stage: fetch
Apr 24 23:36:11.480810 ignition[803]: no configs at "/usr/lib/ignition/base.d"
Apr 24 23:36:11.480823 ignition[803]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Apr 24 23:36:11.480911 ignition[803]: parsed url from cmdline: ""
Apr 24 23:36:11.480915 ignition[803]: no config URL provided
Apr 24 23:36:11.480920 ignition[803]: reading system config file "/usr/lib/ignition/user.ign"
Apr 24 23:36:11.480927 ignition[803]: no config at "/usr/lib/ignition/user.ign"
Apr 24 23:36:11.480942 ignition[803]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #1
Apr 24 23:36:11.481075 ignition[803]: GET error: Get "http://169.254.169.254/hetzner/v1/userdata": dial tcp 169.254.169.254:80: connect: network is unreachable
Apr 24 23:36:11.502463 systemd-networkd[799]: eth1: DHCPv4 address 10.0.0.3/32 acquired from 10.0.0.1
Apr 24 23:36:11.531460 systemd-networkd[799]: eth0: DHCPv4 address 65.21.181.31/32, gateway 172.31.1.1 acquired from 172.31.1.1
Apr 24 23:36:11.681225 ignition[803]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #2
Apr 24 23:36:11.688688 ignition[803]: GET result: OK
Apr 24 23:36:11.688871 ignition[803]: parsing config with SHA512: 5b58f4ad8ee845669f9cfa75519afb0a379754718319a32663f00dc0ba2278a63601e5c5dd9888b36b440377c881ef970ca523d32ba318d68940fdc7d016d45c
Apr 24 23:36:11.695610 unknown[803]: fetched base config from "system"
Apr 24 23:36:11.696288 unknown[803]: fetched base config from "system"
Apr 24 23:36:11.696303 unknown[803]: fetched user config from "hetzner"
Apr 24 23:36:11.697042 ignition[803]: fetch: fetch complete
Apr 24 23:36:11.697058 ignition[803]: fetch: fetch passed
Apr 24 23:36:11.697172 ignition[803]: Ignition finished successfully
Apr 24 23:36:11.702025 systemd[1]: Finished ignition-fetch.service - Ignition (fetch).
Apr 24 23:36:11.710689 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)...
Apr 24 23:36:11.727791 ignition[810]: Ignition 2.19.0
Apr 24 23:36:11.728455 ignition[810]: Stage: kargs
Apr 24 23:36:11.728629 ignition[810]: no configs at "/usr/lib/ignition/base.d"
Apr 24 23:36:11.728640 ignition[810]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Apr 24 23:36:11.729268 ignition[810]: kargs: kargs passed
Apr 24 23:36:11.729308 ignition[810]: Ignition finished successfully
Apr 24 23:36:11.733224 systemd[1]: Finished ignition-kargs.service - Ignition (kargs).
Apr 24 23:36:11.740679 systemd[1]: Starting ignition-disks.service - Ignition (disks)...
Apr 24 23:36:11.753777 ignition[817]: Ignition 2.19.0
Apr 24 23:36:11.754375 ignition[817]: Stage: disks
Apr 24 23:36:11.754543 ignition[817]: no configs at "/usr/lib/ignition/base.d"
Apr 24 23:36:11.754553 ignition[817]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Apr 24 23:36:11.755177 ignition[817]: disks: disks passed
Apr 24 23:36:11.755214 ignition[817]: Ignition finished successfully
Apr 24 23:36:11.757686 systemd[1]: Finished ignition-disks.service - Ignition (disks).
Apr 24 23:36:11.758821 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device.
Apr 24 23:36:11.759561 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
Apr 24 23:36:11.759919 systemd[1]: Reached target local-fs.target - Local File Systems.
Apr 24 23:36:11.760986 systemd[1]: Reached target sysinit.target - System Initialization.
Apr 24 23:36:11.762047 systemd[1]: Reached target basic.target - Basic System.
Apr 24 23:36:11.769696 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
Apr 24 23:36:11.789984 systemd-fsck[825]: ROOT: clean, 14/1628000 files, 120691/1617920 blocks
Apr 24 23:36:11.794266 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT.
Apr 24 23:36:11.799573 systemd[1]: Mounting sysroot.mount - /sysroot...
Apr 24 23:36:11.877446 kernel: EXT4-fs (sda9): mounted filesystem ae73d4a7-3ef8-4c50-8348-4aeb952085ba r/w with ordered data mode. Quota mode: none.
Apr 24 23:36:11.877890 systemd[1]: Mounted sysroot.mount - /sysroot.
Apr 24 23:36:11.878717 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System.
Apr 24 23:36:11.891495 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Apr 24 23:36:11.893514 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr...
Apr 24 23:36:11.896548 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent...
Apr 24 23:36:11.897287 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot).
Apr 24 23:36:11.897315 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup.
Apr 24 23:36:11.903920 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr.
Apr 24 23:36:11.905963 kernel: BTRFS: device label OEM devid 1 transid 10 /dev/sda6 scanned by mount (833)
Apr 24 23:36:11.905980 kernel: BTRFS info (device sda6): first mount of filesystem 926930fb-88b5-4cf4-bdd1-3374ab036b7b
Apr 24 23:36:11.906540 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
Apr 24 23:36:11.917286 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm
Apr 24 23:36:11.917368 kernel: BTRFS info (device sda6): using free space tree
Apr 24 23:36:11.926989 kernel: BTRFS info (device sda6): enabling ssd optimizations
Apr 24 23:36:11.927016 kernel: BTRFS info (device sda6): auto enabling async discard
Apr 24 23:36:11.930822 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Apr 24 23:36:11.958863 coreos-metadata[835]: Apr 24 23:36:11.958 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/hostname: Attempt #1
Apr 24 23:36:11.960674 coreos-metadata[835]: Apr 24 23:36:11.959 INFO Fetch successful
Apr 24 23:36:11.960674 coreos-metadata[835]: Apr 24 23:36:11.960 INFO wrote hostname ci-4081-3-6-n-6f01bfed3c to /sysroot/etc/hostname
Apr 24 23:36:11.963166 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
Apr 24 23:36:11.963990 initrd-setup-root[861]: cut: /sysroot/etc/passwd: No such file or directory
Apr 24 23:36:11.969304 initrd-setup-root[869]: cut: /sysroot/etc/group: No such file or directory
Apr 24 23:36:11.974805 initrd-setup-root[876]: cut: /sysroot/etc/shadow: No such file or directory
Apr 24 23:36:11.978388 initrd-setup-root[883]: cut: /sysroot/etc/gshadow: No such file or directory
Apr 24 23:36:12.064516 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
Apr 24 23:36:12.069487 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
Apr 24 23:36:12.072829 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
Apr 24 23:36:12.082455 kernel: BTRFS info (device sda6): last unmount of filesystem 926930fb-88b5-4cf4-bdd1-3374ab036b7b
Apr 24 23:36:12.100845 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
Apr 24 23:36:12.102043 ignition[954]: INFO : Ignition 2.19.0
Apr 24 23:36:12.102043 ignition[954]: INFO : Stage: mount
Apr 24 23:36:12.102043 ignition[954]: INFO : no configs at "/usr/lib/ignition/base.d"
Apr 24 23:36:12.102043 ignition[954]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Apr 24 23:36:12.105061 ignition[954]: INFO : mount: mount passed
Apr 24 23:36:12.105061 ignition[954]: INFO : Ignition finished successfully
Apr 24 23:36:12.105824 systemd[1]: Finished ignition-mount.service - Ignition (mount).
Apr 24 23:36:12.112517 systemd[1]: Starting ignition-files.service - Ignition (files)...
Apr 24 23:36:12.294638 systemd[1]: sysroot-oem.mount: Deactivated successfully.
Apr 24 23:36:12.298656 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Apr 24 23:36:12.309453 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 scanned by mount (967)
Apr 24 23:36:12.317009 kernel: BTRFS info (device sda6): first mount of filesystem 926930fb-88b5-4cf4-bdd1-3374ab036b7b
Apr 24 23:36:12.317033 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm
Apr 24 23:36:12.322529 kernel: BTRFS info (device sda6): using free space tree
Apr 24 23:36:12.333816 kernel: BTRFS info (device sda6): enabling ssd optimizations
Apr 24 23:36:12.333897 kernel: BTRFS info (device sda6): auto enabling async discard
Apr 24 23:36:12.341717 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Apr 24 23:36:12.376363 ignition[984]: INFO : Ignition 2.19.0
Apr 24 23:36:12.376363 ignition[984]: INFO : Stage: files
Apr 24 23:36:12.378940 ignition[984]: INFO : no configs at "/usr/lib/ignition/base.d"
Apr 24 23:36:12.378940 ignition[984]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Apr 24 23:36:12.378940 ignition[984]: DEBUG : files: compiled without relabeling support, skipping
Apr 24 23:36:12.381975 ignition[984]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Apr 24 23:36:12.381975 ignition[984]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Apr 24 23:36:12.383967 ignition[984]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Apr 24 23:36:12.384880 ignition[984]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Apr 24 23:36:12.384880 ignition[984]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Apr 24 23:36:12.384583 unknown[984]: wrote ssh authorized keys file for user: core
Apr 24 23:36:12.388562 ignition[984]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.3-linux-amd64.tar.gz"
Apr 24 23:36:12.388562 ignition[984]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.3-linux-amd64.tar.gz: attempt #1
Apr 24 23:36:12.649923 systemd-networkd[799]: eth0: Gained IPv6LL
Apr 24 23:36:12.785549 ignition[984]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Apr 24 23:36:13.214818 ignition[984]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.3-linux-amd64.tar.gz"
Apr 24 23:36:13.215964 ignition[984]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
Apr 24 23:36:13.215964 ignition[984]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Apr 24 23:36:13.215964 ignition[984]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
Apr 24 23:36:13.215964 ignition[984]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Apr 24 23:36:13.215964 ignition[984]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Apr 24 23:36:13.215964 ignition[984]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Apr 24 23:36:13.215964 ignition[984]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Apr 24 23:36:13.219369 ignition[984]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Apr 24 23:36:13.219369 ignition[984]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
Apr 24 23:36:13.219369 ignition[984]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Apr 24 23:36:13.219369 ignition[984]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.35.1-x86-64.raw"
Apr 24 23:36:13.219369 ignition[984]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.35.1-x86-64.raw"
Apr 24 23:36:13.219369 ignition[984]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.35.1-x86-64.raw"
Apr 24 23:36:13.219369 ignition[984]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.35.1-x86-64.raw: attempt #1
Apr 24 23:36:13.353593 systemd-networkd[799]: eth1: Gained IPv6LL
Apr 24 23:36:13.567984 ignition[984]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Apr 24 23:36:13.839563 ignition[984]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.35.1-x86-64.raw"
Apr 24 23:36:13.839563 ignition[984]: INFO : files: op(b): [started] processing unit "prepare-helm.service"
Apr 24 23:36:13.841201 ignition[984]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Apr 24 23:36:13.841201 ignition[984]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Apr 24 23:36:13.841201 ignition[984]: INFO : files: op(b): [finished] processing unit "prepare-helm.service"
Apr 24 23:36:13.841201 ignition[984]: INFO : files: op(d): [started] processing unit "coreos-metadata.service"
Apr 24 23:36:13.841201 ignition[984]: INFO : files: op(d): op(e): [started] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf"
Apr 24 23:36:13.841201 ignition[984]: INFO : files: op(d): op(e): [finished] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf"
Apr 24 23:36:13.841201 ignition[984]: INFO : files: op(d): [finished] processing unit "coreos-metadata.service"
Apr 24 23:36:13.841201 ignition[984]: INFO : files: op(f): [started] setting preset to enabled for "prepare-helm.service"
Apr 24 23:36:13.841201 ignition[984]: INFO : files: op(f): [finished] setting preset to enabled for "prepare-helm.service"
Apr 24 23:36:13.841201 ignition[984]: INFO : files: createResultFile: createFiles: op(10): [started] writing file "/sysroot/etc/.ignition-result.json"
Apr 24 23:36:13.848200 ignition[984]: INFO : files: createResultFile: createFiles: op(10): [finished] writing file "/sysroot/etc/.ignition-result.json"
Apr 24 23:36:13.848200 ignition[984]: INFO : files: files passed
Apr 24 23:36:13.848200 ignition[984]: INFO : Ignition finished successfully
Apr 24 23:36:13.843064 systemd[1]: Finished ignition-files.service - Ignition (files).
Apr 24 23:36:13.852638 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Apr 24 23:36:13.854662 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Apr 24 23:36:13.858107 systemd[1]: ignition-quench.service: Deactivated successfully.
Apr 24 23:36:13.858798 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Apr 24 23:36:13.875198 initrd-setup-root-after-ignition[1012]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Apr 24 23:36:13.875198 initrd-setup-root-after-ignition[1012]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Apr 24 23:36:13.877936 initrd-setup-root-after-ignition[1016]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Apr 24 23:36:13.879090 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Apr 24 23:36:13.880300 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Apr 24 23:36:13.885567 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Apr 24 23:36:13.929952 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Apr 24 23:36:13.930061 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Apr 24 23:36:13.930698 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Apr 24 23:36:13.931601 systemd[1]: Reached target initrd.target - Initrd Default Target.
Apr 24 23:36:13.932602 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Apr 24 23:36:13.938567 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Apr 24 23:36:13.952684 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Apr 24 23:36:13.957546 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Apr 24 23:36:13.966844 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Apr 24 23:36:13.967301 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Apr 24 23:36:13.967760 systemd[1]: Stopped target timers.target - Timer Units.
Apr 24 23:36:13.968543 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Apr 24 23:36:13.968624 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Apr 24 23:36:13.969677 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Apr 24 23:36:13.970453 systemd[1]: Stopped target basic.target - Basic System.
Apr 24 23:36:13.971307 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Apr 24 23:36:13.972208 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Apr 24 23:36:13.973065 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Apr 24 23:36:13.973927 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Apr 24 23:36:13.974712 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Apr 24 23:36:13.975459 systemd[1]: Stopped target sysinit.target - System Initialization.
Apr 24 23:36:13.976198 systemd[1]: Stopped target local-fs.target - Local File Systems.
Apr 24 23:36:13.977152 systemd[1]: Stopped target swap.target - Swaps.
Apr 24 23:36:13.978001 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Apr 24 23:36:13.978096 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Apr 24 23:36:13.979246 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Apr 24 23:36:13.980134 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Apr 24 23:36:13.980846 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Apr 24 23:36:13.980928 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Apr 24 23:36:13.981633 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Apr 24 23:36:13.981734 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Apr 24 23:36:13.982748 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Apr 24 23:36:13.982839 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Apr 24 23:36:13.983565 systemd[1]: ignition-files.service: Deactivated successfully.
Apr 24 23:36:13.983641 systemd[1]: Stopped ignition-files.service - Ignition (files).
Apr 24 23:36:13.984284 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully.
Apr 24 23:36:13.984359 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
Apr 24 23:36:13.994897 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Apr 24 23:36:13.997565 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Apr 24 23:36:13.997968 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Apr 24 23:36:13.998083 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Apr 24 23:36:13.999612 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Apr 24 23:36:13.999715 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Apr 24 23:36:14.004654 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Apr 24 23:36:14.004749 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Apr 24 23:36:14.009444 ignition[1036]: INFO : Ignition 2.19.0
Apr 24 23:36:14.010623 ignition[1036]: INFO : Stage: umount
Apr 24 23:36:14.010623 ignition[1036]: INFO : no configs at "/usr/lib/ignition/base.d"
Apr 24 23:36:14.010623 ignition[1036]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Apr 24 23:36:14.013547 ignition[1036]: INFO : umount: umount passed
Apr 24 23:36:14.013547 ignition[1036]: INFO : Ignition finished successfully
Apr 24 23:36:14.016174 systemd[1]: ignition-mount.service: Deactivated successfully.
Apr 24 23:36:14.016287 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Apr 24 23:36:14.017674 systemd[1]: ignition-disks.service: Deactivated successfully.
Apr 24 23:36:14.017719 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Apr 24 23:36:14.018092 systemd[1]: ignition-kargs.service: Deactivated successfully.
Apr 24 23:36:14.018128 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Apr 24 23:36:14.018468 systemd[1]: ignition-fetch.service: Deactivated successfully.
Apr 24 23:36:14.018501 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch).
Apr 24 23:36:14.018827 systemd[1]: Stopped target network.target - Network.
Apr 24 23:36:14.019139 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Apr 24 23:36:14.019176 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Apr 24 23:36:14.021479 systemd[1]: Stopped target paths.target - Path Units.
Apr 24 23:36:14.021848 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Apr 24 23:36:14.026483 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Apr 24 23:36:14.026833 systemd[1]: Stopped target slices.target - Slice Units.
Apr 24 23:36:14.027453 systemd[1]: Stopped target sockets.target - Socket Units.
Apr 24 23:36:14.028088 systemd[1]: iscsid.socket: Deactivated successfully.
Apr 24 23:36:14.028127 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Apr 24 23:36:14.028753 systemd[1]: iscsiuio.socket: Deactivated successfully.
Apr 24 23:36:14.028788 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Apr 24 23:36:14.029344 systemd[1]: ignition-setup.service: Deactivated successfully.
Apr 24 23:36:14.029385 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Apr 24 23:36:14.029978 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Apr 24 23:36:14.030019 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Apr 24 23:36:14.030791 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Apr 24 23:36:14.031523 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Apr 24 23:36:14.033088 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Apr 24 23:36:14.033606 systemd[1]: sysroot-boot.service: Deactivated successfully.
Apr 24 23:36:14.034137 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Apr 24 23:36:14.036283 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Apr 24 23:36:14.036364 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Apr 24 23:36:14.036729 systemd-networkd[799]: eth1: DHCPv6 lease lost
Apr 24 23:36:14.037445 systemd[1]: systemd-resolved.service: Deactivated successfully.
Apr 24 23:36:14.037551 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Apr 24 23:36:14.040072 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Apr 24 23:36:14.040138 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Apr 24 23:36:14.040456 systemd-networkd[799]: eth0: DHCPv6 lease lost
Apr 24 23:36:14.043397 systemd[1]: systemd-networkd.service: Deactivated successfully.
Apr 24 23:36:14.043566 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Apr 24 23:36:14.044512 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Apr 24 23:36:14.044546 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Apr 24 23:36:14.049547 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Apr 24 23:36:14.049945 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Apr 24 23:36:14.050009 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Apr 24 23:36:14.050450 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Apr 24 23:36:14.050497 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Apr 24 23:36:14.050920 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Apr 24 23:36:14.050969 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Apr 24 23:36:14.051570 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Apr 24 23:36:14.069170 systemd[1]: network-cleanup.service: Deactivated successfully.
Apr 24 23:36:14.069713 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Apr 24 23:36:14.071824 systemd[1]: systemd-udevd.service: Deactivated successfully.
Apr 24 23:36:14.071988 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Apr 24 23:36:14.073169 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Apr 24 23:36:14.073230 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Apr 24 23:36:14.073700 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Apr 24 23:36:14.073738 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Apr 24 23:36:14.074312 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Apr 24 23:36:14.074352 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Apr 24 23:36:14.075432 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Apr 24 23:36:14.075472 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Apr 24 23:36:14.076504 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Apr 24 23:36:14.076548 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Apr 24 23:36:14.087609 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Apr 24 23:36:14.088358 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Apr 24 23:36:14.088436 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Apr 24 23:36:14.089291 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully.
Apr 24 23:36:14.089333 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Apr 24 23:36:14.090469 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Apr 24 23:36:14.090512 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Apr 24 23:36:14.090879 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Apr 24 23:36:14.090913 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Apr 24 23:36:14.095345 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Apr 24 23:36:14.095488 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Apr 24 23:36:14.096085 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Apr 24 23:36:14.101805 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Apr 24 23:36:14.108710 systemd[1]: Switching root.
Apr 24 23:36:14.152562 systemd-journald[188]: Journal stopped
Apr 24 23:36:15.161916 systemd-journald[188]: Received SIGTERM from PID 1 (systemd).
Apr 24 23:36:15.161986 kernel: SELinux: policy capability network_peer_controls=1
Apr 24 23:36:15.161997 kernel: SELinux: policy capability open_perms=1
Apr 24 23:36:15.162005 kernel: SELinux: policy capability extended_socket_class=1
Apr 24 23:36:15.162014 kernel: SELinux: policy capability always_check_network=0
Apr 24 23:36:15.162032 kernel: SELinux: policy capability cgroup_seclabel=1
Apr 24 23:36:15.162052 kernel: SELinux: policy capability nnp_nosuid_transition=1
Apr 24 23:36:15.162060 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Apr 24 23:36:15.162069 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Apr 24 23:36:15.162077 kernel: audit: type=1403 audit(1777073774.283:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
Apr 24 23:36:15.162087 systemd[1]: Successfully loaded SELinux policy in 45.786ms.
Apr 24 23:36:15.162105 systemd[1]: Relabeled /dev, /dev/shm, /run, /sys/fs/cgroup in 9.397ms.
Apr 24 23:36:15.162114 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified)
Apr 24 23:36:15.162128 systemd[1]: Detected virtualization kvm.
Apr 24 23:36:15.162138 systemd[1]: Detected architecture x86-64.
Apr 24 23:36:15.162147 systemd[1]: Detected first boot.
Apr 24 23:36:15.162156 systemd[1]: Hostname set to .
Apr 24 23:36:15.162165 systemd[1]: Initializing machine ID from VM UUID.
Apr 24 23:36:15.162173 zram_generator::config[1082]: No configuration found.
Apr 24 23:36:15.162217 systemd[1]: Populated /etc with preset unit settings.
Apr 24 23:36:15.162226 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Apr 24 23:36:15.163444 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
Apr 24 23:36:15.163457 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Apr 24 23:36:15.163467 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Apr 24 23:36:15.163477 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Apr 24 23:36:15.163486 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Apr 24 23:36:15.163494 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Apr 24 23:36:15.163504 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Apr 24 23:36:15.163519 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Apr 24 23:36:15.163527 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Apr 24 23:36:15.163536 systemd[1]: Created slice user.slice - User and Session Slice.
Apr 24 23:36:15.163545 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Apr 24 23:36:15.163555 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Apr 24 23:36:15.163564 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Apr 24 23:36:15.163573 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Apr 24 23:36:15.163587 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Apr 24 23:36:15.163598 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Apr 24 23:36:15.163607 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0...
Apr 24 23:36:15.163616 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Apr 24 23:36:15.163624 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Apr 24 23:36:15.163633 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Apr 24 23:36:15.163642 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Apr 24 23:36:15.163651 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Apr 24 23:36:15.163662 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Apr 24 23:36:15.163676 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Apr 24 23:36:15.163686 systemd[1]: Reached target slices.target - Slice Units.
Apr 24 23:36:15.163695 systemd[1]: Reached target swap.target - Swaps.
Apr 24 23:36:15.163704 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Apr 24 23:36:15.163713 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Apr 24 23:36:15.163722 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Apr 24 23:36:15.163731 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Apr 24 23:36:15.163739 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Apr 24 23:36:15.163750 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Apr 24 23:36:15.163759 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Apr 24 23:36:15.163768 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Apr 24 23:36:15.163776 systemd[1]: Mounting media.mount - External Media Directory...
Apr 24 23:36:15.163785 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Apr 24 23:36:15.163794 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Apr 24 23:36:15.163803 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Apr 24 23:36:15.163812 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Apr 24 23:36:15.163821 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Apr 24 23:36:15.163832 systemd[1]: Reached target machines.target - Containers.
Apr 24 23:36:15.163840 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Apr 24 23:36:15.163849 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Apr 24 23:36:15.163858 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Apr 24 23:36:15.163867 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Apr 24 23:36:15.163884 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Apr 24 23:36:15.163893 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Apr 24 23:36:15.163902 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Apr 24 23:36:15.163913 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Apr 24 23:36:15.163921 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Apr 24 23:36:15.163930 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Apr 24 23:36:15.163939 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Apr 24 23:36:15.163948 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Apr 24 23:36:15.163956 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Apr 24 23:36:15.163965 systemd[1]: Stopped systemd-fsck-usr.service.
Apr 24 23:36:15.163976 systemd[1]: Starting systemd-journald.service - Journal Service...
Apr 24 23:36:15.163985 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Apr 24 23:36:15.163994 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Apr 24 23:36:15.164003 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Apr 24 23:36:15.164012 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Apr 24 23:36:15.164021 systemd[1]: verity-setup.service: Deactivated successfully.
Apr 24 23:36:15.164029 kernel: loop: module loaded
Apr 24 23:36:15.164039 systemd[1]: Stopped verity-setup.service.
Apr 24 23:36:15.164048 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Apr 24 23:36:15.164059 kernel: fuse: init (API version 7.39)
Apr 24 23:36:15.164072 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Apr 24 23:36:15.164081 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Apr 24 23:36:15.164107 systemd-journald[1158]: Collecting audit messages is disabled.
Apr 24 23:36:15.164126 systemd[1]: Mounted media.mount - External Media Directory.
Apr 24 23:36:15.164135 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Apr 24 23:36:15.164144 systemd-journald[1158]: Journal started
Apr 24 23:36:15.164160 systemd-journald[1158]: Runtime Journal (/run/log/journal/77bb39b3f7464047bc97ad12e8ae1e04) is 8.0M, max 76.3M, 68.3M free.
Apr 24 23:36:14.823307 systemd[1]: Queued start job for default target multi-user.target.
Apr 24 23:36:14.841585 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6.
Apr 24 23:36:14.842659 systemd[1]: systemd-journald.service: Deactivated successfully.
Apr 24 23:36:15.169438 systemd[1]: Started systemd-journald.service - Journal Service.
Apr 24 23:36:15.181427 kernel: ACPI: bus type drm_connector registered
Apr 24 23:36:15.172453 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Apr 24 23:36:15.172988 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Apr 24 23:36:15.175450 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Apr 24 23:36:15.176037 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Apr 24 23:36:15.176669 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Apr 24 23:36:15.176808 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Apr 24 23:36:15.177457 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Apr 24 23:36:15.177575 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Apr 24 23:36:15.178434 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Apr 24 23:36:15.178556 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Apr 24 23:36:15.180545 systemd[1]: modprobe@drm.service: Deactivated successfully.
Apr 24 23:36:15.180676 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Apr 24 23:36:15.181287 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Apr 24 23:36:15.181429 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Apr 24 23:36:15.182290 systemd[1]: modprobe@loop.service: Deactivated successfully.
Apr 24 23:36:15.182443 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Apr 24 23:36:15.183042 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Apr 24 23:36:15.183691 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Apr 24 23:36:15.184399 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Apr 24 23:36:15.197002 systemd[1]: Reached target network-pre.target - Preparation for Network.
Apr 24 23:36:15.203542 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Apr 24 23:36:15.207481 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Apr 24 23:36:15.207855 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Apr 24 23:36:15.207888 systemd[1]: Reached target local-fs.target - Local File Systems.
Apr 24 23:36:15.210155 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management (Varlink).
Apr 24 23:36:15.215579 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
Apr 24 23:36:15.222595 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Apr 24 23:36:15.223374 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Apr 24 23:36:15.229585 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Apr 24 23:36:15.233529 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Apr 24 23:36:15.233932 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Apr 24 23:36:15.241525 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Apr 24 23:36:15.242065 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Apr 24 23:36:15.243803 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Apr 24 23:36:15.246042 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Apr 24 23:36:15.248534 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Apr 24 23:36:15.251302 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Apr 24 23:36:15.252796 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Apr 24 23:36:15.253343 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown.
Apr 24 23:36:15.269258 systemd-journald[1158]: Time spent on flushing to /var/log/journal/77bb39b3f7464047bc97ad12e8ae1e04 is 24.165ms for 1185 entries.
Apr 24 23:36:15.269258 systemd-journald[1158]: System Journal (/var/log/journal/77bb39b3f7464047bc97ad12e8ae1e04) is 8.0M, max 584.8M, 576.8M free.
Apr 24 23:36:15.309450 systemd-journald[1158]: Received client request to flush runtime journal.
Apr 24 23:36:15.309479 kernel: loop0: detected capacity change from 0 to 8
Apr 24 23:36:15.277601 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Apr 24 23:36:15.279684 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Apr 24 23:36:15.289538 systemd[1]: Starting systemd-machine-id-commit.service - Commit a transient machine-id on disk...
Apr 24 23:36:15.318446 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Apr 24 23:36:15.330510 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
Apr 24 23:36:15.347190 systemd-tmpfiles[1203]: ACLs are not supported, ignoring.
Apr 24 23:36:15.347206 systemd-tmpfiles[1203]: ACLs are not supported, ignoring.
Apr 24 23:36:15.348452 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Apr 24 23:36:15.352152 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Apr 24 23:36:15.355802 systemd[1]: Finished systemd-machine-id-commit.service - Commit a transient machine-id on disk.
Apr 24 23:36:15.356996 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Apr 24 23:36:15.363475 kernel: loop1: detected capacity change from 0 to 217752
Apr 24 23:36:15.364730 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Apr 24 23:36:15.373458 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Apr 24 23:36:15.376846 systemd[1]: Starting systemd-udev-settle.service - Wait for udev To Complete Device Initialization...
Apr 24 23:36:15.413432 udevadm[1221]: systemd-udev-settle.service is deprecated. Please fix lvm2-activation-early.service, lvm2-activation.service not to pull it in.
Apr 24 23:36:15.417457 kernel: loop2: detected capacity change from 0 to 140768
Apr 24 23:36:15.428214 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Apr 24 23:36:15.437431 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Apr 24 23:36:15.459696 systemd-tmpfiles[1225]: ACLs are not supported, ignoring.
Apr 24 23:36:15.459710 systemd-tmpfiles[1225]: ACLs are not supported, ignoring.
Apr 24 23:36:15.464833 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Apr 24 23:36:15.476016 kernel: loop3: detected capacity change from 0 to 142488
Apr 24 23:36:15.528478 kernel: loop4: detected capacity change from 0 to 8
Apr 24 23:36:15.534651 kernel: loop5: detected capacity change from 0 to 217752
Apr 24 23:36:15.559903 kernel: loop6: detected capacity change from 0 to 140768
Apr 24 23:36:15.583446 kernel: loop7: detected capacity change from 0 to 142488
Apr 24 23:36:15.603328 (sd-merge)[1230]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-hetzner'.
Apr 24 23:36:15.604225 (sd-merge)[1230]: Merged extensions into '/usr'.
Apr 24 23:36:15.610611 systemd[1]: Reloading requested from client PID 1202 ('systemd-sysext') (unit systemd-sysext.service)...
Apr 24 23:36:15.610747 systemd[1]: Reloading...
Apr 24 23:36:15.688444 zram_generator::config[1256]: No configuration found.
Apr 24 23:36:15.704858 ldconfig[1197]: /sbin/ldconfig: /lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
Apr 24 23:36:15.805296 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Apr 24 23:36:15.841492 systemd[1]: Reloading finished in 230 ms.
Apr 24 23:36:15.875528 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
Apr 24 23:36:15.876386 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Apr 24 23:36:15.879559 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Apr 24 23:36:15.890557 systemd[1]: Starting ensure-sysext.service...
Apr 24 23:36:15.892537 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Apr 24 23:36:15.894537 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Apr 24 23:36:15.919484 systemd[1]: Reloading requested from client PID 1300 ('systemctl') (unit ensure-sysext.service)...
Apr 24 23:36:15.919517 systemd[1]: Reloading...
Apr 24 23:36:15.922011 systemd-udevd[1302]: Using default interface naming scheme 'v255'.
Apr 24 23:36:15.943260 systemd-tmpfiles[1301]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Apr 24 23:36:15.943810 systemd-tmpfiles[1301]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Apr 24 23:36:15.944707 systemd-tmpfiles[1301]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Apr 24 23:36:15.945009 systemd-tmpfiles[1301]: ACLs are not supported, ignoring.
Apr 24 23:36:15.945116 systemd-tmpfiles[1301]: ACLs are not supported, ignoring.
Apr 24 23:36:15.948277 systemd-tmpfiles[1301]: Detected autofs mount point /boot during canonicalization of boot.
Apr 24 23:36:15.948348 systemd-tmpfiles[1301]: Skipping /boot
Apr 24 23:36:15.964333 systemd-tmpfiles[1301]: Detected autofs mount point /boot during canonicalization of boot.
Apr 24 23:36:15.964397 systemd-tmpfiles[1301]: Skipping /boot
Apr 24 23:36:16.026432 zram_generator::config[1346]: No configuration found.
Apr 24 23:36:16.146495 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input3
Apr 24 23:36:16.153461 kernel: ACPI: button: Power Button [PWRF]
Apr 24 23:36:16.167338 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Apr 24 23:36:16.198568 kernel: i801_smbus 0000:00:1f.3: Enabling SMBus device
Apr 24 23:36:16.198824 kernel: i801_smbus 0000:00:1f.3: SMBus using PCI interrupt
Apr 24 23:36:16.203429 kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Apr 24 23:36:16.203636 kernel: mousedev: PS/2 mouse device common for all mice
Apr 24 23:36:16.203651 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Apr 24 23:36:16.221434 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input4
Apr 24 23:36:16.228312 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped.
Apr 24 23:36:16.228957 systemd[1]: Reloading finished in 308 ms.
Apr 24 23:36:16.256231 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Apr 24 23:36:16.266262 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Apr 24 23:36:16.276037 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 31 scanned by (udev-worker) (1329)
Apr 24 23:36:16.320846 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Apr 24 23:36:16.323671 kernel: EDAC MC: Ver: 3.0.0
Apr 24 23:36:16.329595 systemd[1]: Starting audit-rules.service - Load Security Auditing Rules...
Apr 24 23:36:16.334255 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Apr 24 23:36:16.337513 kernel: [drm] pci: virtio-vga detected at 0000:00:01.0
Apr 24 23:36:16.337553 kernel: Console: switching to colour dummy device 80x25
Apr 24 23:36:16.337567 kernel: virtio-pci 0000:00:01.0: vgaarb: deactivate vga console
Apr 24 23:36:16.337754 kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Apr 24 23:36:16.337768 kernel: [drm] features: -context_init
Apr 24 23:36:16.338769 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Apr 24 23:36:16.339950 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Apr 24 23:36:16.341579 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Apr 24 23:36:16.342599 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Apr 24 23:36:16.343586 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Apr 24 23:36:16.344579 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Apr 24 23:36:16.346835 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Apr 24 23:36:16.350290 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Apr 24 23:36:16.354474 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Apr 24 23:36:16.355592 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Apr 24 23:36:16.357548 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Apr 24 23:36:16.357799 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Apr 24 23:36:16.360368 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Apr 24 23:36:16.360597 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Apr 24 23:36:16.361022 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Apr 24 23:36:16.361156 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Apr 24 23:36:16.365289 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM.
Apr 24 23:36:16.376736 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM...
Apr 24 23:36:16.376831 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Apr 24 23:36:16.381981 systemd[1]: Finished ensure-sysext.service.
Apr 24 23:36:16.391883 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization...
Apr 24 23:36:16.392218 systemd[1]: modprobe@drm.service: Deactivated successfully.
Apr 24 23:36:16.392358 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Apr 24 23:36:16.394901 kernel: [drm] number of scanouts: 1
Apr 24 23:36:16.398422 kernel: [drm] number of cap sets: 0
Apr 24 23:36:16.404189 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Apr 24 23:36:16.412511 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
Apr 24 23:36:16.417804 systemd[1]: modprobe@loop.service: Deactivated successfully.
Apr 24 23:36:16.417960 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Apr 24 23:36:16.420808 kernel: [drm] Initialized virtio_gpu 0.1.0 0 for 0000:00:01.0 on minor 0
Apr 24 23:36:16.420333 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Apr 24 23:36:16.429711 kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Apr 24 23:36:16.429740 kernel: Console: switching to colour frame buffer device 160x50
Apr 24 23:36:16.434438 kernel: virtio-pci 0000:00:01.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Apr 24 23:36:16.442657 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Apr 24 23:36:16.442981 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Apr 24 23:36:16.450635 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Apr 24 23:36:16.464695 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM.
Apr 24 23:36:16.468053 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
Apr 24 23:36:16.469151 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Apr 24 23:36:16.472436 systemd[1]: Finished systemd-udev-settle.service - Wait for udev To Complete Device Initialization.
Apr 24 23:36:16.481961 systemd[1]: Starting lvm2-activation-early.service - Activation of LVM2 logical volumes...
Apr 24 23:36:16.483765 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
Apr 24 23:36:16.490561 systemd[1]: Starting systemd-update-done.service - Update is Completed...
Apr 24 23:36:16.491879 systemd[1]: Started systemd-userdbd.service - User Database Manager.
Apr 24 23:36:16.497265 augenrules[1464]: No rules
Apr 24 23:36:16.505708 systemd[1]: Finished audit-rules.service - Load Security Auditing Rules.
Apr 24 23:36:16.506774 lvm[1457]: WARNING: Failed to connect to lvmetad. Falling back to device scanning.
Apr 24 23:36:16.524804 systemd[1]: Finished systemd-update-done.service - Update is Completed.
Apr 24 23:36:16.541496 systemd[1]: Finished lvm2-activation-early.service - Activation of LVM2 logical volumes.
Apr 24 23:36:16.543365 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Apr 24 23:36:16.553583 systemd[1]: Starting lvm2-activation.service - Activation of LVM2 logical volumes...
Apr 24 23:36:16.565216 lvm[1476]: WARNING: Failed to connect to lvmetad. Falling back to device scanning.
Apr 24 23:36:16.597649 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization.
Apr 24 23:36:16.599504 systemd[1]: Reached target time-set.target - System Time Set.
Apr 24 23:36:16.600692 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Apr 24 23:36:16.602480 systemd[1]: Finished lvm2-activation.service - Activation of LVM2 logical volumes.
Apr 24 23:36:16.609104 systemd-networkd[1422]: lo: Link UP
Apr 24 23:36:16.609329 systemd-networkd[1422]: lo: Gained carrier
Apr 24 23:36:16.611059 systemd-resolved[1423]: Positive Trust Anchors:
Apr 24 23:36:16.611075 systemd-resolved[1423]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Apr 24 23:36:16.611099 systemd-resolved[1423]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Apr 24 23:36:16.612196 systemd-networkd[1422]: Enumeration completed
Apr 24 23:36:16.612440 systemd[1]: Started systemd-networkd.service - Network Configuration.
Apr 24 23:36:16.612755 systemd-networkd[1422]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Apr 24 23:36:16.612802 systemd-networkd[1422]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Apr 24 23:36:16.613701 systemd-networkd[1422]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Apr 24 23:36:16.613743 systemd-networkd[1422]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network.
Apr 24 23:36:16.614296 systemd-networkd[1422]: eth0: Link UP
Apr 24 23:36:16.614339 systemd-networkd[1422]: eth0: Gained carrier
Apr 24 23:36:16.614380 systemd-networkd[1422]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Apr 24 23:36:16.616012 systemd-resolved[1423]: Using system hostname 'ci-4081-3-6-n-6f01bfed3c'.
Apr 24 23:36:16.620556 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
Apr 24 23:36:16.620711 systemd-networkd[1422]: eth1: Link UP
Apr 24 23:36:16.620716 systemd-networkd[1422]: eth1: Gained carrier
Apr 24 23:36:16.620726 systemd-networkd[1422]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Apr 24 23:36:16.621219 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Apr 24 23:36:16.621327 systemd[1]: Reached target network.target - Network.
Apr 24 23:36:16.621384 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Apr 24 23:36:16.622850 systemd[1]: Reached target sysinit.target - System Initialization.
Apr 24 23:36:16.623282 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
Apr 24 23:36:16.623971 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
Apr 24 23:36:16.625147 systemd[1]: Started logrotate.timer - Daily rotation of log files.
Apr 24 23:36:16.625590 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
Apr 24 23:36:16.625935 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
Apr 24 23:36:16.626259 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
Apr 24 23:36:16.626279 systemd[1]: Reached target paths.target - Path Units.
Apr 24 23:36:16.627569 systemd[1]: Reached target timers.target - Timer Units.
Apr 24 23:36:16.631110 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket.
Apr 24 23:36:16.633004 systemd[1]: Starting docker.socket - Docker Socket for the API...
Apr 24 23:36:16.645325 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket.
Apr 24 23:36:16.647187 systemd[1]: Listening on docker.socket - Docker Socket for the API.
Apr 24 23:36:16.648900 systemd[1]: Reached target sockets.target - Socket Units.
Apr 24 23:36:16.649517 systemd[1]: Reached target basic.target - Basic System.
Apr 24 23:36:16.649978 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met.
Apr 24 23:36:16.650051 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met.
Apr 24 23:36:16.651740 systemd[1]: Starting containerd.service - containerd container runtime...
Apr 24 23:36:16.656748 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent...
Apr 24 23:36:16.660304 systemd-networkd[1422]: eth1: DHCPv4 address 10.0.0.3/32 acquired from 10.0.0.1
Apr 24 23:36:16.662309 systemd-timesyncd[1435]: Network configuration changed, trying to establish connection.
Apr 24 23:36:16.662864 systemd[1]: Starting dbus.service - D-Bus System Message Bus...
Apr 24 23:36:16.668925 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit...
Apr 24 23:36:16.676731 jq[1489]: false
Apr 24 23:36:16.677173 systemd[1]: Starting extend-filesystems.service - Extend Filesystems...
Apr 24 23:36:16.679267 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment).
Apr 24 23:36:16.685179 systemd-networkd[1422]: eth0: DHCPv4 address 65.21.181.31/32, gateway 172.31.1.1 acquired from 172.31.1.1
Apr 24 23:36:16.688605 systemd-timesyncd[1435]: Network configuration changed, trying to establish connection.
Apr 24 23:36:16.689234 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd...
Apr 24 23:36:16.693940 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin...
Apr 24 23:36:16.698670 systemd[1]: Started qemu-guest-agent.service - QEMU Guest Agent.
Apr 24 23:36:16.704499 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline...
Apr 24 23:36:16.715970 coreos-metadata[1487]: Apr 24 23:36:16.715 INFO Fetching http://169.254.169.254/hetzner/v1/metadata: Attempt #1
Apr 24 23:36:16.711588 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys...
Apr 24 23:36:16.721210 coreos-metadata[1487]: Apr 24 23:36:16.717 INFO Fetch successful
Apr 24 23:36:16.721843 dbus-daemon[1488]: [system] SELinux support is enabled
Apr 24 23:36:16.724541 systemd[1]: Starting systemd-logind.service - User Login Management...
Apr 24 23:36:16.729746 coreos-metadata[1487]: Apr 24 23:36:16.722 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/private-networks: Attempt #1
Apr 24 23:36:16.725280 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0).
Apr 24 23:36:16.725690 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details.
Apr 24 23:36:16.730141 systemd[1]: Starting update-engine.service - Update Engine...
Apr 24 23:36:16.733119 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition...
Apr 24 23:36:16.740741 jq[1504]: true
Apr 24 23:36:16.735236 systemd[1]: Started dbus.service - D-Bus System Message Bus.
Apr 24 23:36:16.746482 coreos-metadata[1487]: Apr 24 23:36:16.743 INFO Fetch successful
Apr 24 23:36:16.746945 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'.
Apr 24 23:36:16.747653 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped.
Apr 24 23:36:16.749644 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully.
Apr 24 23:36:16.749798 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline.
Apr 24 23:36:16.762463 extend-filesystems[1492]: Found loop4
Apr 24 23:36:16.762463 extend-filesystems[1492]: Found loop5
Apr 24 23:36:16.762463 extend-filesystems[1492]: Found loop6
Apr 24 23:36:16.762463 extend-filesystems[1492]: Found loop7
Apr 24 23:36:16.762463 extend-filesystems[1492]: Found sda
Apr 24 23:36:16.762463 extend-filesystems[1492]: Found sda1
Apr 24 23:36:16.762463 extend-filesystems[1492]: Found sda2
Apr 24 23:36:16.762463 extend-filesystems[1492]: Found sda3
Apr 24 23:36:16.762463 extend-filesystems[1492]: Found usr
Apr 24 23:36:16.762463 extend-filesystems[1492]: Found sda4
Apr 24 23:36:16.807512 extend-filesystems[1492]: Found sda6
Apr 24 23:36:16.807512 extend-filesystems[1492]: Found sda7
Apr 24 23:36:16.807512 extend-filesystems[1492]: Found sda9
Apr 24 23:36:16.807512 extend-filesystems[1492]: Checking size of /dev/sda9
Apr 24 23:36:16.771929 systemd[1]: motdgen.service: Deactivated successfully.
Apr 24 23:36:16.773841 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd.
Apr 24 23:36:16.827013 update_engine[1503]: I20260424 23:36:16.815099 1503 main.cc:92] Flatcar Update Engine starting
Apr 24 23:36:16.797941 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml).
Apr 24 23:36:16.829686 jq[1514]: true
Apr 24 23:36:16.797984 systemd[1]: Reached target system-config.target - Load system-provided cloud configs.
Apr 24 23:36:16.807085 (ntainerd)[1519]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR
Apr 24 23:36:16.814739 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url).
Apr 24 23:36:16.814759 systemd[1]: Reached target user-config.target - Load user-provided cloud configs.
Apr 24 23:36:16.833215 systemd[1]: Started update-engine.service - Update Engine.
Apr 24 23:36:16.837627 extend-filesystems[1492]: Resized partition /dev/sda9
Apr 24 23:36:16.840108 update_engine[1503]: I20260424 23:36:16.833552 1503 update_check_scheduler.cc:74] Next update check in 2m52s
Apr 24 23:36:16.840130 tar[1512]: linux-amd64/LICENSE
Apr 24 23:36:16.847532 tar[1512]: linux-amd64/helm
Apr 24 23:36:16.843564 systemd[1]: Started locksmithd.service - Cluster reboot manager.
Apr 24 23:36:16.847597 extend-filesystems[1534]: resize2fs 1.47.1 (20-May-2024)
Apr 24 23:36:16.858452 kernel: EXT4-fs (sda9): resizing filesystem from 1617920 to 19393531 blocks
Apr 24 23:36:16.896846 systemd-logind[1501]: New seat seat0.
Apr 24 23:36:16.903113 systemd-logind[1501]: Watching system buttons on /dev/input/event2 (Power Button)
Apr 24 23:36:16.904936 systemd-logind[1501]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard)
Apr 24 23:36:16.905090 systemd[1]: Started systemd-logind.service - User Login Management.
Apr 24 23:36:16.950339 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent.
Apr 24 23:36:16.969185 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 31 scanned by (udev-worker) (1313)
Apr 24 23:36:16.951077 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met.
Apr 24 23:36:17.009155 bash[1553]: Updated "/home/core/.ssh/authorized_keys"
Apr 24 23:36:17.013683 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition.
Apr 24 23:36:17.040593 systemd[1]: Starting sshkeys.service...
Apr 24 23:36:17.066184 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys.
Apr 24 23:36:17.071681 containerd[1519]: time="2026-04-24T23:36:17.071153880Z" level=info msg="starting containerd" revision=174e0d1785eeda18dc2beba45e1d5a188771636b version=v1.7.21
Apr 24 23:36:17.075256 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)...
Apr 24 23:36:17.123367 coreos-metadata[1564]: Apr 24 23:36:17.122 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/public-keys: Attempt #1
Apr 24 23:36:17.124434 coreos-metadata[1564]: Apr 24 23:36:17.124 INFO Fetch successful
Apr 24 23:36:17.127586 containerd[1519]: time="2026-04-24T23:36:17.126228516Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.aufs\"..." type=io.containerd.snapshotter.v1
Apr 24 23:36:17.127980 containerd[1519]: time="2026-04-24T23:36:17.127959467Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.aufs\"..." error="aufs is not supported (modprobe aufs failed: exit status 1 \"modprobe: FATAL: Module aufs not found in directory /lib/modules/6.6.127-flatcar\\n\"): skip plugin" type=io.containerd.snapshotter.v1
Apr 24 23:36:17.129807 containerd[1519]: time="2026-04-24T23:36:17.128005617Z" level=info msg="loading plugin \"io.containerd.event.v1.exchange\"..." type=io.containerd.event.v1
Apr 24 23:36:17.129807 containerd[1519]: time="2026-04-24T23:36:17.128018567Z" level=info msg="loading plugin \"io.containerd.internal.v1.opt\"..." type=io.containerd.internal.v1
Apr 24 23:36:17.131747 containerd[1519]: time="2026-04-24T23:36:17.131709290Z" level=info msg="loading plugin \"io.containerd.warning.v1.deprecations\"..." type=io.containerd.warning.v1
Apr 24 23:36:17.131782 containerd[1519]: time="2026-04-24T23:36:17.131756880Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." type=io.containerd.snapshotter.v1
Apr 24 23:36:17.131838 containerd[1519]: time="2026-04-24T23:36:17.131818840Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." error="no scratch file generator: skip plugin" type=io.containerd.snapshotter.v1
Apr 24 23:36:17.131838 containerd[1519]: time="2026-04-24T23:36:17.131836290Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." type=io.containerd.snapshotter.v1
Apr 24 23:36:17.132037 containerd[1519]: time="2026-04-24T23:36:17.132020281Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1
Apr 24 23:36:17.132037 containerd[1519]: time="2026-04-24T23:36:17.132035671Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." type=io.containerd.snapshotter.v1
Apr 24 23:36:17.132071 containerd[1519]: time="2026-04-24T23:36:17.132045401Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." error="devmapper not configured: skip plugin" type=io.containerd.snapshotter.v1
Apr 24 23:36:17.132560 unknown[1564]: wrote ssh authorized keys file for user: core
Apr 24 23:36:17.133562 containerd[1519]: time="2026-04-24T23:36:17.133533522Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.native\"..." type=io.containerd.snapshotter.v1
Apr 24 23:36:17.135707 locksmithd[1533]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot"
Apr 24 23:36:17.138942 containerd[1519]: time="2026-04-24T23:36:17.138226696Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.overlayfs\"..." type=io.containerd.snapshotter.v1
Apr 24 23:36:17.138942 containerd[1519]: time="2026-04-24T23:36:17.138720646Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.zfs\"..." type=io.containerd.snapshotter.v1
Apr 24 23:36:17.139486 containerd[1519]: time="2026-04-24T23:36:17.139150167Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.zfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.zfs must be a zfs filesystem to be used with the zfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1
Apr 24 23:36:17.139486 containerd[1519]: time="2026-04-24T23:36:17.139166007Z" level=info msg="loading plugin \"io.containerd.content.v1.content\"..." type=io.containerd.content.v1
Apr 24 23:36:17.139655 containerd[1519]: time="2026-04-24T23:36:17.139642977Z" level=info msg="loading plugin \"io.containerd.metadata.v1.bolt\"..." type=io.containerd.metadata.v1
Apr 24 23:36:17.140126 containerd[1519]: time="2026-04-24T23:36:17.139915967Z" level=info msg="metadata content store policy set" policy=shared
Apr 24 23:36:17.160997 containerd[1519]: time="2026-04-24T23:36:17.158293403Z" level=info msg="loading plugin \"io.containerd.gc.v1.scheduler\"..." type=io.containerd.gc.v1
Apr 24 23:36:17.160997 containerd[1519]: time="2026-04-24T23:36:17.158486623Z" level=info msg="loading plugin \"io.containerd.differ.v1.walking\"..." type=io.containerd.differ.v1
Apr 24 23:36:17.160997 containerd[1519]: time="2026-04-24T23:36:17.158874983Z" level=info msg="loading plugin \"io.containerd.lease.v1.manager\"..." type=io.containerd.lease.v1
Apr 24 23:36:17.160997 containerd[1519]: time="2026-04-24T23:36:17.158889823Z" level=info msg="loading plugin \"io.containerd.streaming.v1.manager\"..." type=io.containerd.streaming.v1
Apr 24 23:36:17.160997 containerd[1519]: time="2026-04-24T23:36:17.158910613Z" level=info msg="loading plugin \"io.containerd.runtime.v1.linux\"..." type=io.containerd.runtime.v1
Apr 24 23:36:17.161103 containerd[1519]: time="2026-04-24T23:36:17.161000555Z" level=info msg="loading plugin \"io.containerd.monitor.v1.cgroups\"..." type=io.containerd.monitor.v1
Apr 24 23:36:17.161579 containerd[1519]: time="2026-04-24T23:36:17.161561205Z" level=info msg="loading plugin \"io.containerd.runtime.v2.task\"..." type=io.containerd.runtime.v2
Apr 24 23:36:17.161675 containerd[1519]: time="2026-04-24T23:36:17.161657065Z" level=info msg="loading plugin \"io.containerd.runtime.v2.shim\"..." type=io.containerd.runtime.v2
Apr 24 23:36:17.161716 containerd[1519]: time="2026-04-24T23:36:17.161675935Z" level=info msg="loading plugin \"io.containerd.sandbox.store.v1.local\"..." type=io.containerd.sandbox.store.v1
Apr 24 23:36:17.161716 containerd[1519]: time="2026-04-24T23:36:17.161686955Z" level=info msg="loading plugin \"io.containerd.sandbox.controller.v1.local\"..." type=io.containerd.sandbox.controller.v1
Apr 24 23:36:17.161716 containerd[1519]: time="2026-04-24T23:36:17.161697855Z" level=info msg="loading plugin \"io.containerd.service.v1.containers-service\"..." type=io.containerd.service.v1
Apr 24 23:36:17.161716 containerd[1519]: time="2026-04-24T23:36:17.161707155Z" level=info msg="loading plugin \"io.containerd.service.v1.content-service\"..." type=io.containerd.service.v1
Apr 24 23:36:17.161716 containerd[1519]: time="2026-04-24T23:36:17.161716215Z" level=info msg="loading plugin \"io.containerd.service.v1.diff-service\"..." type=io.containerd.service.v1
Apr 24 23:36:17.161785 containerd[1519]: time="2026-04-24T23:36:17.161731125Z" level=info msg="loading plugin \"io.containerd.service.v1.images-service\"..." type=io.containerd.service.v1
Apr 24 23:36:17.161785 containerd[1519]: time="2026-04-24T23:36:17.161742345Z" level=info msg="loading plugin \"io.containerd.service.v1.introspection-service\"..." type=io.containerd.service.v1
Apr 24 23:36:17.161785 containerd[1519]: time="2026-04-24T23:36:17.161752145Z" level=info msg="loading plugin \"io.containerd.service.v1.namespaces-service\"..." type=io.containerd.service.v1
Apr 24 23:36:17.161785 containerd[1519]: time="2026-04-24T23:36:17.161761055Z" level=info msg="loading plugin \"io.containerd.service.v1.snapshots-service\"..." type=io.containerd.service.v1
Apr 24 23:36:17.161785 containerd[1519]: time="2026-04-24T23:36:17.161769395Z" level=info msg="loading plugin \"io.containerd.service.v1.tasks-service\"..." type=io.containerd.service.v1
Apr 24 23:36:17.161785 containerd[1519]: time="2026-04-24T23:36:17.161784865Z" level=info msg="loading plugin \"io.containerd.grpc.v1.containers\"..." type=io.containerd.grpc.v1
Apr 24 23:36:17.161924 containerd[1519]: time="2026-04-24T23:36:17.161796145Z" level=info msg="loading plugin \"io.containerd.grpc.v1.content\"..." type=io.containerd.grpc.v1
Apr 24 23:36:17.161924 containerd[1519]: time="2026-04-24T23:36:17.161806665Z" level=info msg="loading plugin \"io.containerd.grpc.v1.diff\"..." type=io.containerd.grpc.v1
Apr 24 23:36:17.161924 containerd[1519]: time="2026-04-24T23:36:17.161816355Z" level=info msg="loading plugin \"io.containerd.grpc.v1.events\"..." type=io.containerd.grpc.v1
Apr 24 23:36:17.161924 containerd[1519]: time="2026-04-24T23:36:17.161825265Z" level=info msg="loading plugin \"io.containerd.grpc.v1.images\"..." type=io.containerd.grpc.v1
Apr 24 23:36:17.161924 containerd[1519]: time="2026-04-24T23:36:17.161835115Z" level=info msg="loading plugin \"io.containerd.grpc.v1.introspection\"..." type=io.containerd.grpc.v1
Apr 24 23:36:17.161924 containerd[1519]: time="2026-04-24T23:36:17.161843785Z" level=info msg="loading plugin \"io.containerd.grpc.v1.leases\"..." type=io.containerd.grpc.v1
Apr 24 23:36:17.161924 containerd[1519]: time="2026-04-24T23:36:17.161854125Z" level=info msg="loading plugin \"io.containerd.grpc.v1.namespaces\"..." type=io.containerd.grpc.v1
Apr 24 23:36:17.161924 containerd[1519]: time="2026-04-24T23:36:17.161864365Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandbox-controllers\"..." type=io.containerd.grpc.v1
Apr 24 23:36:17.161924 containerd[1519]: time="2026-04-24T23:36:17.161884016Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandboxes\"..." type=io.containerd.grpc.v1
Apr 24 23:36:17.161924 containerd[1519]: time="2026-04-24T23:36:17.161892656Z" level=info msg="loading plugin \"io.containerd.grpc.v1.snapshots\"..." type=io.containerd.grpc.v1
Apr 24 23:36:17.161924 containerd[1519]: time="2026-04-24T23:36:17.161914906Z" level=info msg="loading plugin \"io.containerd.grpc.v1.streaming\"..." type=io.containerd.grpc.v1
Apr 24 23:36:17.161924 containerd[1519]: time="2026-04-24T23:36:17.161924006Z" level=info msg="loading plugin \"io.containerd.grpc.v1.tasks\"..." type=io.containerd.grpc.v1
Apr 24 23:36:17.162146 containerd[1519]: time="2026-04-24T23:36:17.161936306Z" level=info msg="loading plugin \"io.containerd.transfer.v1.local\"..." type=io.containerd.transfer.v1
Apr 24 23:36:17.162146 containerd[1519]: time="2026-04-24T23:36:17.161951786Z" level=info msg="loading plugin \"io.containerd.grpc.v1.transfer\"..." type=io.containerd.grpc.v1
Apr 24 23:36:17.162146 containerd[1519]: time="2026-04-24T23:36:17.161960736Z" level=info msg="loading plugin \"io.containerd.grpc.v1.version\"..." type=io.containerd.grpc.v1
Apr 24 23:36:17.162146 containerd[1519]: time="2026-04-24T23:36:17.161968246Z" level=info msg="loading plugin \"io.containerd.internal.v1.restart\"..." type=io.containerd.internal.v1
Apr 24 23:36:17.162146 containerd[1519]: time="2026-04-24T23:36:17.162008336Z" level=info msg="loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." type=io.containerd.tracing.processor.v1
Apr 24 23:36:17.162146 containerd[1519]: time="2026-04-24T23:36:17.162021016Z" level=info msg="skip loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.tracing.processor.v1
Apr 24 23:36:17.162146 containerd[1519]: time="2026-04-24T23:36:17.162029176Z" level=info msg="loading plugin \"io.containerd.internal.v1.tracing\"..." type=io.containerd.internal.v1
Apr 24 23:36:17.162146 containerd[1519]: time="2026-04-24T23:36:17.162037396Z" level=info msg="skip loading plugin \"io.containerd.internal.v1.tracing\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.internal.v1
Apr 24 23:36:17.162146 containerd[1519]: time="2026-04-24T23:36:17.162044346Z" level=info msg="loading plugin \"io.containerd.grpc.v1.healthcheck\"..." type=io.containerd.grpc.v1
Apr 24 23:36:17.162146 containerd[1519]: time="2026-04-24T23:36:17.162052966Z" level=info msg="loading plugin \"io.containerd.nri.v1.nri\"..." type=io.containerd.nri.v1
Apr 24 23:36:17.162146 containerd[1519]: time="2026-04-24T23:36:17.162060036Z" level=info msg="NRI interface is disabled by configuration."
Apr 24 23:36:17.162146 containerd[1519]: time="2026-04-24T23:36:17.162067346Z" level=info msg="loading plugin \"io.containerd.grpc.v1.cri\"..." type=io.containerd.grpc.v1
Apr 24 23:36:17.162461 containerd[1519]: time="2026-04-24T23:36:17.162252086Z" level=info msg="Start cri plugin with config {PluginConfig:{ContainerdConfig:{Snapshotter:overlayfs DefaultRuntimeName:runc DefaultRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} UntrustedWorkloadRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} Runtimes:map[runc:{Type:io.containerd.runc.v2 Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[SystemdCgroup:true] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:podsandbox}] NoPivot:false DisableSnapshotAnnotations:true DiscardUnpackedLayers:false IgnoreBlockIONotEnabledErrors:false IgnoreRdtNotEnabledErrors:false} CniConfig:{NetworkPluginBinDir:/opt/cni/bin NetworkPluginConfDir:/etc/cni/net.d NetworkPluginMaxConfNum:1 NetworkPluginSetupSerially:false NetworkPluginConfTemplate: IPPreference:} Registry:{ConfigPath: Mirrors:map[] Configs:map[] Auths:map[] Headers:map[]} ImageDecryption:{KeyModel:node} DisableTCPService:true StreamServerAddress:127.0.0.1 StreamServerPort:0 StreamIdleTimeout:4h0m0s EnableSelinux:true SelinuxCategoryRange:1024 SandboxImage:registry.k8s.io/pause:3.8 StatsCollectPeriod:10 SystemdCgroup:false EnableTLSStreaming:false X509KeyPairStreaming:{TLSCertFile: TLSKeyFile:} MaxContainerLogLineSize:16384 DisableCgroup:false DisableApparmor:false RestrictOOMScoreAdj:false MaxConcurrentDownloads:3 DisableProcMount:false UnsetSeccompProfile: TolerateMissingHugetlbController:true DisableHugetlbController:true DeviceOwnershipFromSecurityContext:false IgnoreImageDefinedVolumes:false NetNSMountsUnderStateDir:false EnableUnprivilegedPorts:false EnableUnprivilegedICMP:false EnableCDI:false CDISpecDirs:[/etc/cdi /var/run/cdi] ImagePullProgressTimeout:5m0s DrainExecSyncIOTimeout:0s ImagePullWithSyncFs:false IgnoreDeprecationWarnings:[]} ContainerdRootDir:/var/lib/containerd ContainerdEndpoint:/run/containerd/containerd.sock RootDir:/var/lib/containerd/io.containerd.grpc.v1.cri StateDir:/run/containerd/io.containerd.grpc.v1.cri}"
Apr 24 23:36:17.162461 containerd[1519]: time="2026-04-24T23:36:17.162291186Z" level=info msg="Connect containerd service"
Apr 24 23:36:17.162461 containerd[1519]: time="2026-04-24T23:36:17.162321956Z" level=info msg="using legacy CRI server"
Apr 24 23:36:17.162461 containerd[1519]: time="2026-04-24T23:36:17.162327176Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this"
Apr 24 23:36:17.162461 containerd[1519]: time="2026-04-24T23:36:17.162391446Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\""
Apr 24 23:36:17.162887 containerd[1519]: time="2026-04-24T23:36:17.162847746Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
Apr 24 23:36:17.165256 containerd[1519]: time="2026-04-24T23:36:17.165029348Z" level=info msg="Start subscribing containerd event"
Apr 24 23:36:17.165256 containerd[1519]: time="2026-04-24T23:36:17.165073508Z" level=info msg="Start recovering state"
Apr 24 23:36:17.165256 containerd[1519]: time="2026-04-24T23:36:17.165118158Z" level=info msg="Start event monitor"
Apr 24 23:36:17.165256 containerd[1519]: time="2026-04-24T23:36:17.165125968Z" level=info msg="Start snapshots syncer"
Apr 24 23:36:17.165256 containerd[1519]: time="2026-04-24T23:36:17.165133798Z" level=info msg="Start cni network conf syncer for default"
Apr 24 23:36:17.165256 containerd[1519]: time="2026-04-24T23:36:17.165139968Z" level=info msg="Start streaming server"
Apr 24 23:36:17.166143 containerd[1519]: time="2026-04-24T23:36:17.165376688Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc
Apr 24 23:36:17.166143 containerd[1519]: time="2026-04-24T23:36:17.165490109Z" level=info msg=serving... address=/run/containerd/containerd.sock
Apr 24 23:36:17.165667 systemd[1]: Started containerd.service - containerd container runtime.
Apr 24 23:36:17.167609 containerd[1519]: time="2026-04-24T23:36:17.167589020Z" level=info msg="containerd successfully booted in 0.098249s"
Apr 24 23:36:17.181994 sshd_keygen[1513]: ssh-keygen: generating new host keys: RSA ECDSA ED25519
Apr 24 23:36:17.186687 update-ssh-keys[1575]: Updated "/home/core/.ssh/authorized_keys"
Apr 24 23:36:17.187470 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys).
Apr 24 23:36:17.193348 systemd[1]: Finished sshkeys.service.
Apr 24 23:36:17.212799 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys.
Apr 24 23:36:17.214427 kernel: EXT4-fs (sda9): resized filesystem to 19393531
Apr 24 23:36:17.220669 systemd[1]: Starting issuegen.service - Generate /run/issue...
Apr 24 23:36:17.227813 systemd[1]: issuegen.service: Deactivated successfully.
Apr 24 23:36:17.227994 systemd[1]: Finished issuegen.service - Generate /run/issue.
Apr 24 23:36:17.237609 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions...
Apr 24 23:36:17.246575 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions.
Apr 24 23:36:17.250885 extend-filesystems[1534]: Filesystem at /dev/sda9 is mounted on /; on-line resizing required
Apr 24 23:36:17.250885 extend-filesystems[1534]: old_desc_blocks = 1, new_desc_blocks = 10
Apr 24 23:36:17.250885 extend-filesystems[1534]: The filesystem on /dev/sda9 is now 19393531 (4k) blocks long.
Apr 24 23:36:17.255514 extend-filesystems[1492]: Resized filesystem in /dev/sda9
Apr 24 23:36:17.255514 extend-filesystems[1492]: Found sr0
Apr 24 23:36:17.252121 systemd[1]: extend-filesystems.service: Deactivated successfully.
Apr 24 23:36:17.254006 systemd[1]: Finished extend-filesystems.service - Extend Filesystems.
Apr 24 23:36:17.266851 systemd[1]: Started getty@tty1.service - Getty on tty1.
Apr 24 23:36:17.275671 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0.
Apr 24 23:36:17.276970 systemd[1]: Reached target getty.target - Login Prompts.
Apr 24 23:36:17.468271 tar[1512]: linux-amd64/README.md
Apr 24 23:36:17.478847 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin.
Apr 24 23:36:18.025783 systemd-networkd[1422]: eth1: Gained IPv6LL
Apr 24 23:36:18.027938 systemd-timesyncd[1435]: Network configuration changed, trying to establish connection.
Apr 24 23:36:18.031725 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured.
Apr 24 23:36:18.034935 systemd[1]: Reached target network-online.target - Network is Online.
Apr 24 23:36:18.043771 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Apr 24 23:36:18.063132 systemd[1]: Starting nvidia.service - NVIDIA Configure Service...
Apr 24 23:36:18.091888 systemd[1]: Finished nvidia.service - NVIDIA Configure Service.
Apr 24 23:36:18.537738 systemd-networkd[1422]: eth0: Gained IPv6LL
Apr 24 23:36:18.539508 systemd-timesyncd[1435]: Network configuration changed, trying to establish connection.
Apr 24 23:36:18.954774 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Apr 24 23:36:18.956802 systemd[1]: Reached target multi-user.target - Multi-User System.
Apr 24 23:36:18.958139 systemd[1]: Startup finished in 1.534s (kernel) + 5.557s (initrd) + 4.719s (userspace) = 11.811s.
Apr 24 23:36:18.962840 (kubelet)[1615]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Apr 24 23:36:19.507888 kubelet[1615]: E0424 23:36:19.507762 1615 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Apr 24 23:36:19.511268 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Apr 24 23:36:19.511671 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Apr 24 23:36:21.842932 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd.
Apr 24 23:36:21.843946 systemd[1]: Started sshd@0-65.21.181.31:22-4.175.71.9:32892.service - OpenSSH per-connection server daemon (4.175.71.9:32892).
Apr 24 23:36:22.064764 sshd[1628]: Accepted publickey for core from 4.175.71.9 port 32892 ssh2: RSA SHA256:/LB5UM8JE+Gm8PLCmanmk+IzzQFWk//dmRsy5hU4ZbM
Apr 24 23:36:22.067195 sshd[1628]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Apr 24 23:36:22.075055 systemd[1]: Created slice user-500.slice - User Slice of UID 500.
Apr 24 23:36:22.087036 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500...
Apr 24 23:36:22.088880 systemd-logind[1501]: New session 1 of user core.
Apr 24 23:36:22.098110 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500.
Apr 24 23:36:22.101702 systemd[1]: Starting user@500.service - User Manager for UID 500...
Apr 24 23:36:22.114139 (systemd)[1632]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0)
Apr 24 23:36:22.206908 systemd[1632]: Queued start job for default target default.target.
Apr 24 23:36:22.216573 systemd[1632]: Created slice app.slice - User Application Slice.
Apr 24 23:36:22.216600 systemd[1632]: Reached target paths.target - Paths.
Apr 24 23:36:22.216614 systemd[1632]: Reached target timers.target - Timers.
Apr 24 23:36:22.217924 systemd[1632]: Starting dbus.socket - D-Bus User Message Bus Socket...
Apr 24 23:36:22.230291 systemd[1632]: Listening on dbus.socket - D-Bus User Message Bus Socket.
Apr 24 23:36:22.230391 systemd[1632]: Reached target sockets.target - Sockets.
Apr 24 23:36:22.230402 systemd[1632]: Reached target basic.target - Basic System.
Apr 24 23:36:22.230457 systemd[1632]: Reached target default.target - Main User Target.
Apr 24 23:36:22.230493 systemd[1632]: Startup finished in 110ms.
Apr 24 23:36:22.230769 systemd[1]: Started user@500.service - User Manager for UID 500.
Apr 24 23:36:22.237550 systemd[1]: Started session-1.scope - Session 1 of User core.
Apr 24 23:36:22.421806 systemd[1]: Started sshd@1-65.21.181.31:22-4.175.71.9:32900.service - OpenSSH per-connection server daemon (4.175.71.9:32900).
Apr 24 23:36:22.623837 sshd[1643]: Accepted publickey for core from 4.175.71.9 port 32900 ssh2: RSA SHA256:/LB5UM8JE+Gm8PLCmanmk+IzzQFWk//dmRsy5hU4ZbM
Apr 24 23:36:22.625530 sshd[1643]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Apr 24 23:36:22.629774 systemd-logind[1501]: New session 2 of user core.
Apr 24 23:36:22.638555 systemd[1]: Started session-2.scope - Session 2 of User core.
Apr 24 23:36:22.794917 sshd[1643]: pam_unix(sshd:session): session closed for user core
Apr 24 23:36:22.799872 systemd[1]: sshd@1-65.21.181.31:22-4.175.71.9:32900.service: Deactivated successfully.
Apr 24 23:36:22.801533 systemd[1]: session-2.scope: Deactivated successfully.
Apr 24 23:36:22.802405 systemd-logind[1501]: Session 2 logged out. Waiting for processes to exit.
Apr 24 23:36:22.803344 systemd-logind[1501]: Removed session 2.
Apr 24 23:36:22.838872 systemd[1]: Started sshd@2-65.21.181.31:22-4.175.71.9:32906.service - OpenSSH per-connection server daemon (4.175.71.9:32906).
Apr 24 23:36:23.043484 sshd[1650]: Accepted publickey for core from 4.175.71.9 port 32906 ssh2: RSA SHA256:/LB5UM8JE+Gm8PLCmanmk+IzzQFWk//dmRsy5hU4ZbM
Apr 24 23:36:23.044909 sshd[1650]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Apr 24 23:36:23.050090 systemd-logind[1501]: New session 3 of user core.
Apr 24 23:36:23.061637 systemd[1]: Started session-3.scope - Session 3 of User core.
Apr 24 23:36:23.208387 sshd[1650]: pam_unix(sshd:session): session closed for user core
Apr 24 23:36:23.214918 systemd[1]: sshd@2-65.21.181.31:22-4.175.71.9:32906.service: Deactivated successfully.
Apr 24 23:36:23.217851 systemd[1]: session-3.scope: Deactivated successfully.
Apr 24 23:36:23.219064 systemd-logind[1501]: Session 3 logged out. Waiting for processes to exit.
Apr 24 23:36:23.220799 systemd-logind[1501]: Removed session 3.
Apr 24 23:36:23.257873 systemd[1]: Started sshd@3-65.21.181.31:22-4.175.71.9:32922.service - OpenSSH per-connection server daemon (4.175.71.9:32922).
Apr 24 23:36:23.479124 sshd[1657]: Accepted publickey for core from 4.175.71.9 port 32922 ssh2: RSA SHA256:/LB5UM8JE+Gm8PLCmanmk+IzzQFWk//dmRsy5hU4ZbM
Apr 24 23:36:23.480670 sshd[1657]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Apr 24 23:36:23.486311 systemd-logind[1501]: New session 4 of user core.
Apr 24 23:36:23.500590 systemd[1]: Started session-4.scope - Session 4 of User core.
Apr 24 23:36:23.654556 sshd[1657]: pam_unix(sshd:session): session closed for user core
Apr 24 23:36:23.661997 systemd-logind[1501]: Session 4 logged out. Waiting for processes to exit.
Apr 24 23:36:23.662925 systemd[1]: sshd@3-65.21.181.31:22-4.175.71.9:32922.service: Deactivated successfully.
Apr 24 23:36:23.667166 systemd[1]: session-4.scope: Deactivated successfully.
Apr 24 23:36:23.669011 systemd-logind[1501]: Removed session 4.
Apr 24 23:36:23.703814 systemd[1]: Started sshd@4-65.21.181.31:22-4.175.71.9:32936.service - OpenSSH per-connection server daemon (4.175.71.9:32936).
Apr 24 23:36:23.937341 sshd[1664]: Accepted publickey for core from 4.175.71.9 port 32936 ssh2: RSA SHA256:/LB5UM8JE+Gm8PLCmanmk+IzzQFWk//dmRsy5hU4ZbM
Apr 24 23:36:23.938745 sshd[1664]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Apr 24 23:36:23.947506 systemd-logind[1501]: New session 5 of user core.
Apr 24 23:36:23.954683 systemd[1]: Started session-5.scope - Session 5 of User core.
Apr 24 23:36:24.087402 sudo[1667]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1
Apr 24 23:36:24.087724 sudo[1667]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Apr 24 23:36:24.100870 sudo[1667]: pam_unix(sudo:session): session closed for user root
Apr 24 23:36:24.131932 sshd[1664]: pam_unix(sshd:session): session closed for user core
Apr 24 23:36:24.137838 systemd[1]: sshd@4-65.21.181.31:22-4.175.71.9:32936.service: Deactivated successfully.
Apr 24 23:36:24.141291 systemd[1]: session-5.scope: Deactivated successfully.
Apr 24 23:36:24.143737 systemd-logind[1501]: Session 5 logged out. Waiting for processes to exit.
Apr 24 23:36:24.145559 systemd-logind[1501]: Removed session 5.
Apr 24 23:36:24.176790 systemd[1]: Started sshd@5-65.21.181.31:22-4.175.71.9:32942.service - OpenSSH per-connection server daemon (4.175.71.9:32942).
Apr 24 23:36:24.410148 sshd[1672]: Accepted publickey for core from 4.175.71.9 port 32942 ssh2: RSA SHA256:/LB5UM8JE+Gm8PLCmanmk+IzzQFWk//dmRsy5hU4ZbM
Apr 24 23:36:24.412894 sshd[1672]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Apr 24 23:36:24.420519 systemd-logind[1501]: New session 6 of user core.
Apr 24 23:36:24.430705 systemd[1]: Started session-6.scope - Session 6 of User core.
Apr 24 23:36:24.556954 sudo[1676]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/bin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules
Apr 24 23:36:24.557912 sudo[1676]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Apr 24 23:36:24.566067 sudo[1676]: pam_unix(sudo:session): session closed for user root
Apr 24 23:36:24.579052 sudo[1675]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/bin/systemctl restart audit-rules
Apr 24 23:36:24.579811 sudo[1675]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Apr 24 23:36:24.606847 systemd[1]: Stopping audit-rules.service - Load Security Auditing Rules...
Apr 24 23:36:24.611474 auditctl[1679]: No rules
Apr 24 23:36:24.613576 systemd[1]: audit-rules.service: Deactivated successfully.
Apr 24 23:36:24.614047 systemd[1]: Stopped audit-rules.service - Load Security Auditing Rules.
Apr 24 23:36:24.623392 systemd[1]: Starting audit-rules.service - Load Security Auditing Rules...
Apr 24 23:36:24.671594 augenrules[1697]: No rules
Apr 24 23:36:24.674548 systemd[1]: Finished audit-rules.service - Load Security Auditing Rules.
Apr 24 23:36:24.676855 sudo[1675]: pam_unix(sudo:session): session closed for user root
Apr 24 23:36:24.709785 sshd[1672]: pam_unix(sshd:session): session closed for user core
Apr 24 23:36:24.715047 systemd-logind[1501]: Session 6 logged out. Waiting for processes to exit.
Apr 24 23:36:24.715168 systemd[1]: sshd@5-65.21.181.31:22-4.175.71.9:32942.service: Deactivated successfully.
Apr 24 23:36:24.717509 systemd[1]: session-6.scope: Deactivated successfully.
Apr 24 23:36:24.719139 systemd-logind[1501]: Removed session 6.
Apr 24 23:36:24.750147 systemd[1]: Started sshd@6-65.21.181.31:22-4.175.71.9:32948.service - OpenSSH per-connection server daemon (4.175.71.9:32948).
Apr 24 23:36:24.987532 sshd[1705]: Accepted publickey for core from 4.175.71.9 port 32948 ssh2: RSA SHA256:/LB5UM8JE+Gm8PLCmanmk+IzzQFWk//dmRsy5hU4ZbM
Apr 24 23:36:24.989937 sshd[1705]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Apr 24 23:36:24.998097 systemd-logind[1501]: New session 7 of user core.
Apr 24 23:36:25.009648 systemd[1]: Started session-7.scope - Session 7 of User core.
Apr 24 23:36:25.133912 sudo[1708]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh
Apr 24 23:36:25.134680 sudo[1708]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Apr 24 23:36:25.419587 systemd[1]: Starting docker.service - Docker Application Container Engine...
Apr 24 23:36:25.436053 (dockerd)[1723]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU
Apr 24 23:36:25.662004 dockerd[1723]: time="2026-04-24T23:36:25.661920847Z" level=info msg="Starting up"
Apr 24 23:36:25.770722 dockerd[1723]: time="2026-04-24T23:36:25.770386957Z" level=info msg="Loading containers: start."
Apr 24 23:36:25.868461 kernel: Initializing XFRM netlink socket
Apr 24 23:36:25.897672 systemd-timesyncd[1435]: Network configuration changed, trying to establish connection.
Apr 24 23:36:25.905060 systemd-timesyncd[1435]: Network configuration changed, trying to establish connection.
Apr 24 23:36:25.943317 systemd-networkd[1422]: docker0: Link UP
Apr 24 23:36:25.943581 systemd-timesyncd[1435]: Network configuration changed, trying to establish connection.
Apr 24 23:36:25.955677 dockerd[1723]: time="2026-04-24T23:36:25.955639151Z" level=info msg="Loading containers: done."
Apr 24 23:36:25.966592 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck1905809669-merged.mount: Deactivated successfully.
Apr 24 23:36:25.970058 dockerd[1723]: time="2026-04-24T23:36:25.970018223Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2
Apr 24 23:36:25.970124 dockerd[1723]: time="2026-04-24T23:36:25.970098593Z" level=info msg="Docker daemon" commit=061aa95809be396a6b5542618d8a34b02a21ff77 containerd-snapshotter=false storage-driver=overlay2 version=26.1.0
Apr 24 23:36:25.970197 dockerd[1723]: time="2026-04-24T23:36:25.970180253Z" level=info msg="Daemon has completed initialization"
Apr 24 23:36:26.002264 dockerd[1723]: time="2026-04-24T23:36:26.002194990Z" level=info msg="API listen on /run/docker.sock"
Apr 24 23:36:26.002537 systemd[1]: Started docker.service - Docker Application Container Engine.
Apr 24 23:36:26.470451 containerd[1519]: time="2026-04-24T23:36:26.470114180Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.35.4\""
Apr 24 23:36:27.044666 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2679937715.mount: Deactivated successfully.
Apr 24 23:36:28.222504 containerd[1519]: time="2026-04-24T23:36:28.222441830Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.35.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 24 23:36:28.223689 containerd[1519]: time="2026-04-24T23:36:28.223469291Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.35.4: active requests=0, bytes read=27579523"
Apr 24 23:36:28.225996 containerd[1519]: time="2026-04-24T23:36:28.224627832Z" level=info msg="ImageCreate event name:\"sha256:840f22aa169cc9a11114a874832f60c2d4a4f7767d107303cd1ca6d9c228ee8b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 24 23:36:28.227229 containerd[1519]: time="2026-04-24T23:36:28.226962053Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:06b4bb208634a107ab9e6c50cdb9df178d05166a700c0cc448d59522091074b5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 24 23:36:28.227765 containerd[1519]: time="2026-04-24T23:36:28.227736984Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.35.4\" with image id \"sha256:840f22aa169cc9a11114a874832f60c2d4a4f7767d107303cd1ca6d9c228ee8b\", repo tag \"registry.k8s.io/kube-apiserver:v1.35.4\", repo digest \"registry.k8s.io/kube-apiserver@sha256:06b4bb208634a107ab9e6c50cdb9df178d05166a700c0cc448d59522091074b5\", size \"27576022\" in 1.757591784s"
Apr 24 23:36:28.227805 containerd[1519]: time="2026-04-24T23:36:28.227768454Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.35.4\" returns image reference \"sha256:840f22aa169cc9a11114a874832f60c2d4a4f7767d107303cd1ca6d9c228ee8b\""
Apr 24 23:36:28.228551 containerd[1519]: time="2026-04-24T23:36:28.228512815Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.35.4\""
Apr 24 23:36:29.320143 containerd[1519]: time="2026-04-24T23:36:29.320101624Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.35.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 24 23:36:29.321069 containerd[1519]: time="2026-04-24T23:36:29.320941255Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.35.4: active requests=0, bytes read=21451681"
Apr 24 23:36:29.323431 containerd[1519]: time="2026-04-24T23:36:29.321852566Z" level=info msg="ImageCreate event name:\"sha256:96ce7469899d4d3ccad56b1a80b91609cb2203287112d73818296004948bb667\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 24 23:36:29.325783 containerd[1519]: time="2026-04-24T23:36:29.325765019Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:7b036c805d57f203e9efaf43672cff6019b9083a9c0eb107ea8500eace29d8fd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 24 23:36:29.326380 containerd[1519]: time="2026-04-24T23:36:29.326358499Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.35.4\" with image id \"sha256:96ce7469899d4d3ccad56b1a80b91609cb2203287112d73818296004948bb667\", repo tag \"registry.k8s.io/kube-controller-manager:v1.35.4\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:7b036c805d57f203e9efaf43672cff6019b9083a9c0eb107ea8500eace29d8fd\", size \"23018006\" in 1.097816514s"
Apr 24 23:36:29.326429 containerd[1519]: time="2026-04-24T23:36:29.326382669Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.35.4\" returns image reference \"sha256:96ce7469899d4d3ccad56b1a80b91609cb2203287112d73818296004948bb667\""
Apr 24 23:36:29.326985 containerd[1519]: time="2026-04-24T23:36:29.326922630Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.35.4\""
Apr 24 23:36:29.761995 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1.
Apr 24 23:36:29.771091 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Apr 24 23:36:29.968670 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Apr 24 23:36:29.970248 (kubelet)[1934]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Apr 24 23:36:30.015136 kubelet[1934]: E0424 23:36:30.014774 1934 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Apr 24 23:36:30.022046 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Apr 24 23:36:30.022857 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Apr 24 23:36:30.338459 containerd[1519]: time="2026-04-24T23:36:30.338221292Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.35.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 24 23:36:30.339367 containerd[1519]: time="2026-04-24T23:36:30.339178553Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.35.4: active requests=0, bytes read=15555312"
Apr 24 23:36:30.340336 containerd[1519]: time="2026-04-24T23:36:30.340068144Z" level=info msg="ImageCreate event name:\"sha256:a0eecd9b69a38f829c29b535f73c1a3de3c7cc9f1294a44dc42c808faf0a23ff\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 24 23:36:30.341927 containerd[1519]: time="2026-04-24T23:36:30.341900075Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:9054fecb4fa04cc63aec47b0913c8deb3487d414190cd15211f864cfe0d0b4d6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 24 23:36:30.343365 containerd[1519]: time="2026-04-24T23:36:30.342681056Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.35.4\" with image id \"sha256:a0eecd9b69a38f829c29b535f73c1a3de3c7cc9f1294a44dc42c808faf0a23ff\", repo tag \"registry.k8s.io/kube-scheduler:v1.35.4\", repo digest \"registry.k8s.io/kube-scheduler@sha256:9054fecb4fa04cc63aec47b0913c8deb3487d414190cd15211f864cfe0d0b4d6\", size \"17121655\" in 1.015690356s"
Apr 24 23:36:30.343365 containerd[1519]: time="2026-04-24T23:36:30.342703686Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.35.4\" returns image reference \"sha256:a0eecd9b69a38f829c29b535f73c1a3de3c7cc9f1294a44dc42c808faf0a23ff\""
Apr 24 23:36:30.343466 containerd[1519]: time="2026-04-24T23:36:30.343451137Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.35.4\""
Apr 24 23:36:31.375250 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3723239012.mount: Deactivated successfully.
Apr 24 23:36:31.586238 containerd[1519]: time="2026-04-24T23:36:31.586161652Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.35.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 24 23:36:31.587660 containerd[1519]: time="2026-04-24T23:36:31.587528473Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.35.4: active requests=0, bytes read=25699953"
Apr 24 23:36:31.588964 containerd[1519]: time="2026-04-24T23:36:31.588629094Z" level=info msg="ImageCreate event name:\"sha256:f21f27cddb23d0d7131dc7c59666b3b0e0b5ca4c3f003225f90307ab6211b6e1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 24 23:36:31.590251 containerd[1519]: time="2026-04-24T23:36:31.590222195Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:c5daa23c72474e5e4062c320177d3b485fd42e7010f052bc80d657c4c00a0672\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 24 23:36:31.590931 containerd[1519]: time="2026-04-24T23:36:31.590647316Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.35.4\" with image id \"sha256:f21f27cddb23d0d7131dc7c59666b3b0e0b5ca4c3f003225f90307ab6211b6e1\", repo tag \"registry.k8s.io/kube-proxy:v1.35.4\", repo digest \"registry.k8s.io/kube-proxy@sha256:c5daa23c72474e5e4062c320177d3b485fd42e7010f052bc80d657c4c00a0672\", size \"25698944\" in 1.247175969s"
Apr 24 23:36:31.590931 containerd[1519]: time="2026-04-24T23:36:31.590671796Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.35.4\" returns image reference \"sha256:f21f27cddb23d0d7131dc7c59666b3b0e0b5ca4c3f003225f90307ab6211b6e1\""
Apr 24 23:36:31.591293 containerd[1519]: time="2026-04-24T23:36:31.591271446Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.13.1\""
Apr 24 23:36:32.134701 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3626024.mount: Deactivated successfully.
Apr 24 23:36:33.150249 containerd[1519]: time="2026-04-24T23:36:33.149483854Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.13.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 24 23:36:33.150249 containerd[1519]: time="2026-04-24T23:36:33.150231545Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.13.1: active requests=0, bytes read=23556642"
Apr 24 23:36:33.150963 containerd[1519]: time="2026-04-24T23:36:33.150904385Z" level=info msg="ImageCreate event name:\"sha256:aa5e3ebc0dfed0566805186b9e47110d8f9122291d8bad1497e78873ad291139\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 24 23:36:33.152777 containerd[1519]: time="2026-04-24T23:36:33.152559577Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9b9128672209474da07c91439bf15ed704ae05ad918dd6454e5b6ae14e35fee6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 24 23:36:33.153787 containerd[1519]: time="2026-04-24T23:36:33.153337647Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.13.1\" with image id \"sha256:aa5e3ebc0dfed0566805186b9e47110d8f9122291d8bad1497e78873ad291139\", repo tag \"registry.k8s.io/coredns/coredns:v1.13.1\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9b9128672209474da07c91439bf15ed704ae05ad918dd6454e5b6ae14e35fee6\", size \"23553139\" in 1.562044891s"
Apr 24 23:36:33.153787 containerd[1519]: time="2026-04-24T23:36:33.153361317Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.13.1\" returns image reference \"sha256:aa5e3ebc0dfed0566805186b9e47110d8f9122291d8bad1497e78873ad291139\""
Apr 24 23:36:33.154058 containerd[1519]: time="2026-04-24T23:36:33.154027498Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10.1\""
Apr 24 23:36:33.613038 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2711004546.mount: Deactivated successfully.
Apr 24 23:36:33.620953 containerd[1519]: time="2026-04-24T23:36:33.620906507Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 24 23:36:33.622032 containerd[1519]: time="2026-04-24T23:36:33.621983818Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10.1: active requests=0, bytes read=321240"
Apr 24 23:36:33.623546 containerd[1519]: time="2026-04-24T23:36:33.623033729Z" level=info msg="ImageCreate event name:\"sha256:cd073f4c5f6a8e9dc6f3125ba00cf60819cae95c1ec84a1f146ee4a9cf9e803f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 24 23:36:33.626436 containerd[1519]: time="2026-04-24T23:36:33.625546221Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 24 23:36:33.627697 containerd[1519]: time="2026-04-24T23:36:33.627102742Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10.1\" with image id \"sha256:cd073f4c5f6a8e9dc6f3125ba00cf60819cae95c1ec84a1f146ee4a9cf9e803f\", repo tag \"registry.k8s.io/pause:3.10.1\", repo digest \"registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c\", size \"320448\" in 473.045554ms"
Apr 24 23:36:33.627697 containerd[1519]: time="2026-04-24T23:36:33.627153502Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10.1\" returns image reference \"sha256:cd073f4c5f6a8e9dc6f3125ba00cf60819cae95c1ec84a1f146ee4a9cf9e803f\""
Apr 24 23:36:33.627881 containerd[1519]: time="2026-04-24T23:36:33.627847823Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.6.6-0\""
Apr 24 23:36:34.157080 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3949707785.mount: Deactivated successfully.
Apr 24 23:36:34.800134 containerd[1519]: time="2026-04-24T23:36:34.800079699Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.6.6-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 24 23:36:34.801436 containerd[1519]: time="2026-04-24T23:36:34.801227260Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.6.6-0: active requests=0, bytes read=23644553"
Apr 24 23:36:34.802465 containerd[1519]: time="2026-04-24T23:36:34.802421971Z" level=info msg="ImageCreate event name:\"sha256:0a108f7189562e99793bdecab61fdf1a7c9d913af3385de9da17fb9d6ff430e2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 24 23:36:34.805025 containerd[1519]: time="2026-04-24T23:36:34.804881993Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:60a30b5d81b2217555e2cfb9537f655b7ba97220b99c39ee2e162a7127225890\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 24 23:36:34.806595 containerd[1519]: time="2026-04-24T23:36:34.805549394Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.6.6-0\" with image id \"sha256:0a108f7189562e99793bdecab61fdf1a7c9d913af3385de9da17fb9d6ff430e2\", repo tag \"registry.k8s.io/etcd:3.6.6-0\", repo digest \"registry.k8s.io/etcd@sha256:60a30b5d81b2217555e2cfb9537f655b7ba97220b99c39ee2e162a7127225890\", size \"23641797\" in 1.177664411s"
Apr 24 23:36:34.806595 containerd[1519]: time="2026-04-24T23:36:34.805576174Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.6.6-0\" returns image reference \"sha256:0a108f7189562e99793bdecab61fdf1a7c9d913af3385de9da17fb9d6ff430e2\""
Apr 24 23:36:35.950781 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Apr 24 23:36:35.962765 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Apr 24 23:36:36.002548 systemd[1]: Reloading requested from client PID 2100 ('systemctl') (unit session-7.scope)...
Apr 24 23:36:36.002560 systemd[1]: Reloading...
Apr 24 23:36:36.107156 zram_generator::config[2136]: No configuration found.
Apr 24 23:36:36.193257 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Apr 24 23:36:36.253534 systemd[1]: Reloading finished in 250 ms.
Apr 24 23:36:36.299156 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM
Apr 24 23:36:36.299329 systemd[1]: kubelet.service: Failed with result 'signal'.
Apr 24 23:36:36.299637 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Apr 24 23:36:36.301901 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Apr 24 23:36:36.439100 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Apr 24 23:36:36.447693 (kubelet)[2195]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Apr 24 23:36:36.497643 kubelet[2195]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 23:36:36.609897 kubelet[2195]: I0424 23:36:36.609721 2195 server.go:525] "Kubelet version" kubeletVersion="v1.35.1"
Apr 24 23:36:36.609897 kubelet[2195]: I0424 23:36:36.609756 2195 server.go:527] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 24 23:36:36.609897 kubelet[2195]: I0424 23:36:36.609772 2195 watchdog_linux.go:95] "Systemd watchdog is not enabled"
Apr 24 23:36:36.609897 kubelet[2195]: I0424 23:36:36.609777 2195 watchdog_linux.go:138] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 24 23:36:36.610152 kubelet[2195]: I0424 23:36:36.609947 2195 server.go:951] "Client rotation is on, will bootstrap in background"
Apr 24 23:36:36.616350 kubelet[2195]: I0424 23:36:36.615957 2195 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Apr 24 23:36:36.616547 kubelet[2195]: E0424 23:36:36.616517 2195 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://65.21.181.31:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 65.21.181.31:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError"
Apr 24 23:36:36.619677 kubelet[2195]: E0424 23:36:36.619623 2195 log.go:32] "RuntimeConfig from runtime service failed" err="rpc error: code = Unimplemented desc = unknown method RuntimeConfig for service runtime.v1.RuntimeService"
Apr 24 23:36:36.619677 kubelet[2195]: I0424 23:36:36.619662 2195 server.go:1395] "CRI implementation should be updated to support RuntimeConfig. Falling back to using cgroupDriver from kubelet config."
Apr 24 23:36:36.622693 kubelet[2195]: I0424 23:36:36.622676 2195 server.go:775] "--cgroups-per-qos enabled, but --cgroup-root was not specified. Defaulting to /"
Apr 24 23:36:36.623481 kubelet[2195]: I0424 23:36:36.623438 2195 container_manager_linux.go:272] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 24 23:36:36.623562 kubelet[2195]: I0424 23:36:36.623457 2195 container_manager_linux.go:277] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4081-3-6-n-6f01bfed3c","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 24 23:36:36.623562 kubelet[2195]: I0424 23:36:36.623562 2195 topology_manager.go:143] "Creating topology manager with none policy"
Apr 24 23:36:36.623717 kubelet[2195]: I0424 23:36:36.623568 2195 container_manager_linux.go:308] "Creating device plugin manager"
Apr 24 23:36:36.623717 kubelet[2195]: I0424 23:36:36.623633 2195 container_manager_linux.go:317] "Creating Dynamic Resource Allocation (DRA) manager"
Apr 24 23:36:36.625708 kubelet[2195]: I0424 23:36:36.625679 2195 state_mem.go:41] "Initialized" logger="CPUManager state memory"
Apr 24 23:36:36.626147 kubelet[2195]: I0424 23:36:36.625814 2195 kubelet.go:482] "Attempting to sync node with API server"
Apr 24 23:36:36.626147 kubelet[2195]: I0424 23:36:36.625827 2195 kubelet.go:383] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 24 23:36:36.626147 kubelet[2195]: I0424 23:36:36.625935 2195 kubelet.go:394] "Adding apiserver pod source"
Apr 24 23:36:36.626147 kubelet[2195]: I0424 23:36:36.625943 2195 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 24 23:36:36.627969 kubelet[2195]: I0424 23:36:36.627943 2195 kuberuntime_manager.go:294] "Container runtime initialized" containerRuntime="containerd" version="v1.7.21" apiVersion="v1"
Apr 24 23:36:36.629918 kubelet[2195]: I0424 23:36:36.629892 2195 kubelet.go:943] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 24 23:36:36.629918 kubelet[2195]: I0424 23:36:36.629916 2195 kubelet.go:970] "Not starting PodCertificateRequest manager because we are in static kubelet mode or the PodCertificateProjection feature gate is disabled"
Apr 24 23:36:36.630026 kubelet[2195]: W0424 23:36:36.629962 2195 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating.
Apr 24 23:36:36.635348 kubelet[2195]: I0424 23:36:36.635313 2195 server.go:1257] "Started kubelet"
Apr 24 23:36:36.642446 kubelet[2195]: I0424 23:36:36.640995 2195 fs_resource_analyzer.go:69] "Starting FS ResourceAnalyzer"
Apr 24 23:36:36.648156 kubelet[2195]: I0424 23:36:36.648101 2195 server.go:182] "Starting to listen" address="0.0.0.0" port=10250
Apr 24 23:36:36.649844 kubelet[2195]: I0424 23:36:36.649819 2195 server.go:317] "Adding debug handlers to kubelet server"
Apr 24 23:36:36.650021 kubelet[2195]: I0424 23:36:36.649991 2195 volume_manager.go:311] "Starting Kubelet Volume Manager"
Apr 24 23:36:36.650153 kubelet[2195]: E0424 23:36:36.650128 2195 kubelet_node_status.go:392] "Error getting the current node from lister" err="node \"ci-4081-3-6-n-6f01bfed3c\" not found"
Apr 24 23:36:36.650344 kubelet[2195]: I0424 23:36:36.650320 2195 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Apr 24 23:36:36.650392 kubelet[2195]: I0424 23:36:36.650363 2195 reconciler.go:29] "Reconciler: start to sync state"
Apr 24 23:36:36.655472 kubelet[2195]: I0424 23:36:36.654834 2195 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv6"
Apr 24 23:36:36.657725 kubelet[2195]: I0424 23:36:36.656839 2195 ratelimit.go:56] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 24 23:36:36.657725 kubelet[2195]: I0424 23:36:36.656935 2195 server_v1.go:49] "podresources" method="list" useActivePods=true
Apr 24 23:36:36.657725 kubelet[2195]: I0424 23:36:36.657265 2195 server.go:254] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 24 23:36:36.660244 kubelet[2195]: I0424 23:36:36.648223 2195 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Apr 24 23:36:36.662237 kubelet[2195]: E0424 23:36:36.660794 2195 controller.go:201] "Failed to ensure lease exists, will retry" err="Get \"https://65.21.181.31:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4081-3-6-n-6f01bfed3c?timeout=10s\": dial tcp 65.21.181.31:6443: connect: connection refused" interval="200ms"
Apr 24 23:36:36.662237 kubelet[2195]: E0424 23:36:36.661025 2195 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://65.21.181.31:6443/api/v1/namespaces/default/events\": dial tcp 65.21.181.31:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4081-3-6-n-6f01bfed3c.18a96f33e419ec68 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4081-3-6-n-6f01bfed3c,UID:,APIVersion:v1,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4081-3-6-n-6f01bfed3c,},FirstTimestamp:2026-04-24 23:36:36.635290728 +0000 UTC m=+0.178705830,LastTimestamp:2026-04-24 23:36:36.635290728 +0000 UTC m=+0.178705830,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4081-3-6-n-6f01bfed3c,}"
Apr 24 23:36:36.662492 kubelet[2195]: I0424 23:36:36.662347 2195 factory.go:223] Registration of the systemd container factory successfully
Apr 24 23:36:36.662492 kubelet[2195]: I0424 23:36:36.662404 2195 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Apr 24 23:36:36.665561 kubelet[2195]: I0424 23:36:36.665527 2195 factory.go:223] Registration of the containerd container factory successfully
Apr 24 23:36:36.679918 kubelet[2195]: I0424 23:36:36.679884 2195 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv4"
Apr 24 23:36:36.680090 kubelet[2195]: I0424 23:36:36.680069 2195 status_manager.go:249] "Starting to sync pod status with apiserver"
Apr 24 23:36:36.680180 kubelet[2195]: I0424 23:36:36.680168 2195 kubelet.go:2501] "Starting kubelet main sync loop"
Apr 24 23:36:36.680323 kubelet[2195]: E0424 23:36:36.680302 2195 kubelet.go:2525] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Apr 24 23:36:36.684280 kubelet[2195]: E0424 23:36:36.684202 2195 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://65.21.181.31:6443/api/v1/namespaces/default/events\": dial tcp 65.21.181.31:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4081-3-6-n-6f01bfed3c.18a96f33e419ec68 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4081-3-6-n-6f01bfed3c,UID:,APIVersion:v1,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4081-3-6-n-6f01bfed3c,},FirstTimestamp:2026-04-24 23:36:36.635290728 +0000 UTC m=+0.178705830,LastTimestamp:2026-04-24 23:36:36.635290728 +0000 UTC m=+0.178705830,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4081-3-6-n-6f01bfed3c,}"
Apr 24 23:36:36.693681 kubelet[2195]: E0424 23:36:36.693653 2195 kubelet.go:1656] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Apr 24 23:36:36.696601 kubelet[2195]: I0424 23:36:36.696587 2195 cpu_manager.go:225] "Starting" policy="none"
Apr 24 23:36:36.696663 kubelet[2195]: I0424 23:36:36.696656 2195 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s"
Apr 24 23:36:36.696875 kubelet[2195]: I0424 23:36:36.696702 2195 state_mem.go:41] "Initialized" logger="CPUManager state checkpoint.CPUManager state memory"
Apr 24 23:36:36.698575 kubelet[2195]: I0424 23:36:36.698563 2195 policy_none.go:50] "Start"
Apr 24 23:36:36.698639 kubelet[2195]: I0424 23:36:36.698633 2195 memory_manager.go:187] "Starting memorymanager" policy="None"
Apr 24 23:36:36.698676 kubelet[2195]: I0424 23:36:36.698669 2195 state_mem.go:36] "Initializing new in-memory state store" logger="Memory Manager state checkpoint"
Apr 24 23:36:36.699873 kubelet[2195]: I0424 23:36:36.699864 2195 policy_none.go:44] "Start"
Apr 24 23:36:36.703203 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice.
Apr 24 23:36:36.711789 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice.
Apr 24 23:36:36.714444 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice.
Apr 24 23:36:36.727426 kubelet[2195]: E0424 23:36:36.727391 2195 manager.go:525] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 24 23:36:36.727902 kubelet[2195]: I0424 23:36:36.727882 2195 eviction_manager.go:194] "Eviction manager: starting control loop" Apr 24 23:36:36.728211 kubelet[2195]: I0424 23:36:36.728162 2195 container_log_manager.go:146] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 24 23:36:36.729251 kubelet[2195]: I0424 23:36:36.728959 2195 plugin_manager.go:121] "Starting Kubelet Plugin Manager" Apr 24 23:36:36.729708 kubelet[2195]: E0424 23:36:36.729472 2195 eviction_manager.go:272] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Apr 24 23:36:36.729708 kubelet[2195]: E0424 23:36:36.729495 2195 eviction_manager.go:297] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4081-3-6-n-6f01bfed3c\" not found" Apr 24 23:36:36.791892 systemd[1]: Created slice kubepods-burstable-poda96190661d8a86a0d9d18fc3a2074e2c.slice - libcontainer container kubepods-burstable-poda96190661d8a86a0d9d18fc3a2074e2c.slice. Apr 24 23:36:36.799532 kubelet[2195]: E0424 23:36:36.799507 2195 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081-3-6-n-6f01bfed3c\" not found" node="ci-4081-3-6-n-6f01bfed3c" Apr 24 23:36:36.801893 systemd[1]: Created slice kubepods-burstable-pod3e00f1c7c8486290ecb87066cb7eab88.slice - libcontainer container kubepods-burstable-pod3e00f1c7c8486290ecb87066cb7eab88.slice. 
Apr 24 23:36:36.804073 kubelet[2195]: E0424 23:36:36.803922 2195 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081-3-6-n-6f01bfed3c\" not found" node="ci-4081-3-6-n-6f01bfed3c" Apr 24 23:36:36.805738 systemd[1]: Created slice kubepods-burstable-pod0a16de29227549d28bda0400ad93cb7a.slice - libcontainer container kubepods-burstable-pod0a16de29227549d28bda0400ad93cb7a.slice. Apr 24 23:36:36.807469 kubelet[2195]: E0424 23:36:36.807446 2195 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081-3-6-n-6f01bfed3c\" not found" node="ci-4081-3-6-n-6f01bfed3c" Apr 24 23:36:36.829802 kubelet[2195]: I0424 23:36:36.829773 2195 kubelet_node_status.go:74] "Attempting to register node" node="ci-4081-3-6-n-6f01bfed3c" Apr 24 23:36:36.830142 kubelet[2195]: E0424 23:36:36.830095 2195 kubelet_node_status.go:106] "Unable to register node with API server" err="Post \"https://65.21.181.31:6443/api/v1/nodes\": dial tcp 65.21.181.31:6443: connect: connection refused" node="ci-4081-3-6-n-6f01bfed3c" Apr 24 23:36:36.851908 kubelet[2195]: I0424 23:36:36.851876 2195 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/3e00f1c7c8486290ecb87066cb7eab88-k8s-certs\") pod \"kube-controller-manager-ci-4081-3-6-n-6f01bfed3c\" (UID: \"3e00f1c7c8486290ecb87066cb7eab88\") " pod="kube-system/kube-controller-manager-ci-4081-3-6-n-6f01bfed3c" Apr 24 23:36:36.851908 kubelet[2195]: I0424 23:36:36.851900 2195 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/3e00f1c7c8486290ecb87066cb7eab88-kubeconfig\") pod \"kube-controller-manager-ci-4081-3-6-n-6f01bfed3c\" (UID: \"3e00f1c7c8486290ecb87066cb7eab88\") " pod="kube-system/kube-controller-manager-ci-4081-3-6-n-6f01bfed3c" Apr 24 
23:36:36.852297 kubelet[2195]: I0424 23:36:36.852262 2195 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/3e00f1c7c8486290ecb87066cb7eab88-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4081-3-6-n-6f01bfed3c\" (UID: \"3e00f1c7c8486290ecb87066cb7eab88\") " pod="kube-system/kube-controller-manager-ci-4081-3-6-n-6f01bfed3c" Apr 24 23:36:36.854434 kubelet[2195]: I0424 23:36:36.852636 2195 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/0a16de29227549d28bda0400ad93cb7a-kubeconfig\") pod \"kube-scheduler-ci-4081-3-6-n-6f01bfed3c\" (UID: \"0a16de29227549d28bda0400ad93cb7a\") " pod="kube-system/kube-scheduler-ci-4081-3-6-n-6f01bfed3c" Apr 24 23:36:36.854434 kubelet[2195]: I0424 23:36:36.852665 2195 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/a96190661d8a86a0d9d18fc3a2074e2c-k8s-certs\") pod \"kube-apiserver-ci-4081-3-6-n-6f01bfed3c\" (UID: \"a96190661d8a86a0d9d18fc3a2074e2c\") " pod="kube-system/kube-apiserver-ci-4081-3-6-n-6f01bfed3c" Apr 24 23:36:36.854434 kubelet[2195]: I0424 23:36:36.852699 2195 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/3e00f1c7c8486290ecb87066cb7eab88-ca-certs\") pod \"kube-controller-manager-ci-4081-3-6-n-6f01bfed3c\" (UID: \"3e00f1c7c8486290ecb87066cb7eab88\") " pod="kube-system/kube-controller-manager-ci-4081-3-6-n-6f01bfed3c" Apr 24 23:36:36.854434 kubelet[2195]: I0424 23:36:36.852713 2195 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/a96190661d8a86a0d9d18fc3a2074e2c-ca-certs\") pod 
\"kube-apiserver-ci-4081-3-6-n-6f01bfed3c\" (UID: \"a96190661d8a86a0d9d18fc3a2074e2c\") " pod="kube-system/kube-apiserver-ci-4081-3-6-n-6f01bfed3c" Apr 24 23:36:36.854434 kubelet[2195]: I0424 23:36:36.852792 2195 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/a96190661d8a86a0d9d18fc3a2074e2c-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4081-3-6-n-6f01bfed3c\" (UID: \"a96190661d8a86a0d9d18fc3a2074e2c\") " pod="kube-system/kube-apiserver-ci-4081-3-6-n-6f01bfed3c" Apr 24 23:36:36.854639 kubelet[2195]: I0424 23:36:36.852887 2195 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/3e00f1c7c8486290ecb87066cb7eab88-flexvolume-dir\") pod \"kube-controller-manager-ci-4081-3-6-n-6f01bfed3c\" (UID: \"3e00f1c7c8486290ecb87066cb7eab88\") " pod="kube-system/kube-controller-manager-ci-4081-3-6-n-6f01bfed3c" Apr 24 23:36:36.863851 kubelet[2195]: E0424 23:36:36.863730 2195 controller.go:201] "Failed to ensure lease exists, will retry" err="Get \"https://65.21.181.31:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4081-3-6-n-6f01bfed3c?timeout=10s\": dial tcp 65.21.181.31:6443: connect: connection refused" interval="400ms" Apr 24 23:36:37.033167 kubelet[2195]: I0424 23:36:37.033119 2195 kubelet_node_status.go:74] "Attempting to register node" node="ci-4081-3-6-n-6f01bfed3c" Apr 24 23:36:37.033589 kubelet[2195]: E0424 23:36:37.033553 2195 kubelet_node_status.go:106] "Unable to register node with API server" err="Post \"https://65.21.181.31:6443/api/v1/nodes\": dial tcp 65.21.181.31:6443: connect: connection refused" node="ci-4081-3-6-n-6f01bfed3c" Apr 24 23:36:37.104876 containerd[1519]: time="2026-04-24T23:36:37.104809479Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:kube-apiserver-ci-4081-3-6-n-6f01bfed3c,Uid:a96190661d8a86a0d9d18fc3a2074e2c,Namespace:kube-system,Attempt:0,}" Apr 24 23:36:37.107707 containerd[1519]: time="2026-04-24T23:36:37.107308941Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4081-3-6-n-6f01bfed3c,Uid:3e00f1c7c8486290ecb87066cb7eab88,Namespace:kube-system,Attempt:0,}" Apr 24 23:36:37.110749 containerd[1519]: time="2026-04-24T23:36:37.110692384Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4081-3-6-n-6f01bfed3c,Uid:0a16de29227549d28bda0400ad93cb7a,Namespace:kube-system,Attempt:0,}" Apr 24 23:36:37.265346 kubelet[2195]: E0424 23:36:37.265122 2195 controller.go:201] "Failed to ensure lease exists, will retry" err="Get \"https://65.21.181.31:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4081-3-6-n-6f01bfed3c?timeout=10s\": dial tcp 65.21.181.31:6443: connect: connection refused" interval="800ms" Apr 24 23:36:37.436856 kubelet[2195]: I0424 23:36:37.436800 2195 kubelet_node_status.go:74] "Attempting to register node" node="ci-4081-3-6-n-6f01bfed3c" Apr 24 23:36:37.437474 kubelet[2195]: E0424 23:36:37.437400 2195 kubelet_node_status.go:106] "Unable to register node with API server" err="Post \"https://65.21.181.31:6443/api/v1/nodes\": dial tcp 65.21.181.31:6443: connect: connection refused" node="ci-4081-3-6-n-6f01bfed3c" Apr 24 23:36:37.575718 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3995799083.mount: Deactivated successfully. 
Apr 24 23:36:37.583394 containerd[1519]: time="2026-04-24T23:36:37.583285308Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Apr 24 23:36:37.586362 containerd[1519]: time="2026-04-24T23:36:37.586238330Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=312078" Apr 24 23:36:37.587037 containerd[1519]: time="2026-04-24T23:36:37.586971291Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Apr 24 23:36:37.588617 containerd[1519]: time="2026-04-24T23:36:37.588463502Z" level=info msg="ImageCreate event name:\"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Apr 24 23:36:37.589589 containerd[1519]: time="2026-04-24T23:36:37.589541993Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0" Apr 24 23:36:37.590980 containerd[1519]: time="2026-04-24T23:36:37.590798064Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0" Apr 24 23:36:37.590980 containerd[1519]: time="2026-04-24T23:36:37.590900654Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Apr 24 23:36:37.596192 containerd[1519]: time="2026-04-24T23:36:37.596150229Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Apr 24 23:36:37.598995 
containerd[1519]: time="2026-04-24T23:36:37.598717301Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 493.801782ms" Apr 24 23:36:37.601969 containerd[1519]: time="2026-04-24T23:36:37.601890403Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 491.097639ms" Apr 24 23:36:37.603718 containerd[1519]: time="2026-04-24T23:36:37.603640475Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 496.228234ms" Apr 24 23:36:37.715220 containerd[1519]: time="2026-04-24T23:36:37.714693747Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Apr 24 23:36:37.715220 containerd[1519]: time="2026-04-24T23:36:37.714746917Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Apr 24 23:36:37.715220 containerd[1519]: time="2026-04-24T23:36:37.714758467Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 24 23:36:37.715220 containerd[1519]: time="2026-04-24T23:36:37.714931838Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 24 23:36:37.717155 containerd[1519]: time="2026-04-24T23:36:37.717064339Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Apr 24 23:36:37.717155 containerd[1519]: time="2026-04-24T23:36:37.717132049Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Apr 24 23:36:37.717241 containerd[1519]: time="2026-04-24T23:36:37.717142299Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 24 23:36:37.717317 containerd[1519]: time="2026-04-24T23:36:37.717212479Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 24 23:36:37.723993 containerd[1519]: time="2026-04-24T23:36:37.723807035Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Apr 24 23:36:37.723993 containerd[1519]: time="2026-04-24T23:36:37.723864725Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Apr 24 23:36:37.723993 containerd[1519]: time="2026-04-24T23:36:37.723877045Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 24 23:36:37.723993 containerd[1519]: time="2026-04-24T23:36:37.723942025Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 24 23:36:37.744556 systemd[1]: Started cri-containerd-54b690c2d58d18b0ca98a3f43dca51188d0bcb4c01c5a87887497f286049352f.scope - libcontainer container 54b690c2d58d18b0ca98a3f43dca51188d0bcb4c01c5a87887497f286049352f. 
Apr 24 23:36:37.749227 systemd[1]: Started cri-containerd-12972b834c644ffc4dfeccdd68eb94f265be63636e5267742a80f25544890ee2.scope - libcontainer container 12972b834c644ffc4dfeccdd68eb94f265be63636e5267742a80f25544890ee2. Apr 24 23:36:37.750763 systemd[1]: Started cri-containerd-29ee7268bed005196cc033ed56dec09f63c55d8e1ce8a38599084e6f85750a7a.scope - libcontainer container 29ee7268bed005196cc033ed56dec09f63c55d8e1ce8a38599084e6f85750a7a. Apr 24 23:36:37.790763 containerd[1519]: time="2026-04-24T23:36:37.790724271Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4081-3-6-n-6f01bfed3c,Uid:a96190661d8a86a0d9d18fc3a2074e2c,Namespace:kube-system,Attempt:0,} returns sandbox id \"54b690c2d58d18b0ca98a3f43dca51188d0bcb4c01c5a87887497f286049352f\"" Apr 24 23:36:37.799517 containerd[1519]: time="2026-04-24T23:36:37.799443938Z" level=info msg="CreateContainer within sandbox \"54b690c2d58d18b0ca98a3f43dca51188d0bcb4c01c5a87887497f286049352f\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Apr 24 23:36:37.808586 containerd[1519]: time="2026-04-24T23:36:37.808567506Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4081-3-6-n-6f01bfed3c,Uid:0a16de29227549d28bda0400ad93cb7a,Namespace:kube-system,Attempt:0,} returns sandbox id \"29ee7268bed005196cc033ed56dec09f63c55d8e1ce8a38599084e6f85750a7a\"" Apr 24 23:36:37.812351 containerd[1519]: time="2026-04-24T23:36:37.812269989Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4081-3-6-n-6f01bfed3c,Uid:3e00f1c7c8486290ecb87066cb7eab88,Namespace:kube-system,Attempt:0,} returns sandbox id \"12972b834c644ffc4dfeccdd68eb94f265be63636e5267742a80f25544890ee2\"" Apr 24 23:36:37.815632 containerd[1519]: time="2026-04-24T23:36:37.815602441Z" level=info msg="CreateContainer within sandbox \"29ee7268bed005196cc033ed56dec09f63c55d8e1ce8a38599084e6f85750a7a\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Apr 24 
23:36:37.816979 containerd[1519]: time="2026-04-24T23:36:37.816941523Z" level=info msg="CreateContainer within sandbox \"12972b834c644ffc4dfeccdd68eb94f265be63636e5267742a80f25544890ee2\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Apr 24 23:36:37.827684 containerd[1519]: time="2026-04-24T23:36:37.827453981Z" level=info msg="CreateContainer within sandbox \"54b690c2d58d18b0ca98a3f43dca51188d0bcb4c01c5a87887497f286049352f\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"5763a7c9c8a57beffb9c3bcc0ebe81308db087efc4a471c38975e29a4ac06161\"" Apr 24 23:36:37.828788 containerd[1519]: time="2026-04-24T23:36:37.827916552Z" level=info msg="StartContainer for \"5763a7c9c8a57beffb9c3bcc0ebe81308db087efc4a471c38975e29a4ac06161\"" Apr 24 23:36:37.832066 containerd[1519]: time="2026-04-24T23:36:37.832030605Z" level=info msg="CreateContainer within sandbox \"29ee7268bed005196cc033ed56dec09f63c55d8e1ce8a38599084e6f85750a7a\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"3e9b418cd350a1dcaa3972d87165dceb602b839139f4f5a650fd1a76d54fce4e\"" Apr 24 23:36:37.832522 containerd[1519]: time="2026-04-24T23:36:37.832462035Z" level=info msg="StartContainer for \"3e9b418cd350a1dcaa3972d87165dceb602b839139f4f5a650fd1a76d54fce4e\"" Apr 24 23:36:37.836631 containerd[1519]: time="2026-04-24T23:36:37.836601749Z" level=info msg="CreateContainer within sandbox \"12972b834c644ffc4dfeccdd68eb94f265be63636e5267742a80f25544890ee2\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"64436aef0982207cfe22331c2503cd39b4888494dee70e51a4d86e51e767dbc6\"" Apr 24 23:36:37.836974 containerd[1519]: time="2026-04-24T23:36:37.836958749Z" level=info msg="StartContainer for \"64436aef0982207cfe22331c2503cd39b4888494dee70e51a4d86e51e767dbc6\"" Apr 24 23:36:37.858536 systemd[1]: Started cri-containerd-5763a7c9c8a57beffb9c3bcc0ebe81308db087efc4a471c38975e29a4ac06161.scope - libcontainer 
container 5763a7c9c8a57beffb9c3bcc0ebe81308db087efc4a471c38975e29a4ac06161. Apr 24 23:36:37.870520 systemd[1]: Started cri-containerd-3e9b418cd350a1dcaa3972d87165dceb602b839139f4f5a650fd1a76d54fce4e.scope - libcontainer container 3e9b418cd350a1dcaa3972d87165dceb602b839139f4f5a650fd1a76d54fce4e. Apr 24 23:36:37.874447 systemd[1]: Started cri-containerd-64436aef0982207cfe22331c2503cd39b4888494dee70e51a4d86e51e767dbc6.scope - libcontainer container 64436aef0982207cfe22331c2503cd39b4888494dee70e51a4d86e51e767dbc6. Apr 24 23:36:37.910588 containerd[1519]: time="2026-04-24T23:36:37.910246500Z" level=info msg="StartContainer for \"5763a7c9c8a57beffb9c3bcc0ebe81308db087efc4a471c38975e29a4ac06161\" returns successfully" Apr 24 23:36:37.938248 containerd[1519]: time="2026-04-24T23:36:37.938130224Z" level=info msg="StartContainer for \"64436aef0982207cfe22331c2503cd39b4888494dee70e51a4d86e51e767dbc6\" returns successfully" Apr 24 23:36:37.944720 containerd[1519]: time="2026-04-24T23:36:37.944591119Z" level=info msg="StartContainer for \"3e9b418cd350a1dcaa3972d87165dceb602b839139f4f5a650fd1a76d54fce4e\" returns successfully" Apr 24 23:36:38.242165 kubelet[2195]: I0424 23:36:38.241455 2195 kubelet_node_status.go:74] "Attempting to register node" node="ci-4081-3-6-n-6f01bfed3c" Apr 24 23:36:38.650496 kubelet[2195]: E0424 23:36:38.650002 2195 nodelease.go:50] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4081-3-6-n-6f01bfed3c\" not found" node="ci-4081-3-6-n-6f01bfed3c" Apr 24 23:36:38.704055 kubelet[2195]: E0424 23:36:38.704022 2195 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081-3-6-n-6f01bfed3c\" not found" node="ci-4081-3-6-n-6f01bfed3c" Apr 24 23:36:38.704512 kubelet[2195]: E0424 23:36:38.704451 2195 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081-3-6-n-6f01bfed3c\" not found" 
node="ci-4081-3-6-n-6f01bfed3c" Apr 24 23:36:38.704933 kubelet[2195]: E0424 23:36:38.704905 2195 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081-3-6-n-6f01bfed3c\" not found" node="ci-4081-3-6-n-6f01bfed3c" Apr 24 23:36:38.844451 kubelet[2195]: I0424 23:36:38.843525 2195 kubelet_node_status.go:77] "Successfully registered node" node="ci-4081-3-6-n-6f01bfed3c" Apr 24 23:36:38.844451 kubelet[2195]: E0424 23:36:38.843556 2195 kubelet_node_status.go:474] "Error updating node status, will retry" err="error getting node \"ci-4081-3-6-n-6f01bfed3c\": node \"ci-4081-3-6-n-6f01bfed3c\" not found" Apr 24 23:36:38.853989 kubelet[2195]: E0424 23:36:38.853955 2195 kubelet_node_status.go:392] "Error getting the current node from lister" err="node \"ci-4081-3-6-n-6f01bfed3c\" not found" Apr 24 23:36:38.954769 kubelet[2195]: E0424 23:36:38.954620 2195 kubelet_node_status.go:392] "Error getting the current node from lister" err="node \"ci-4081-3-6-n-6f01bfed3c\" not found" Apr 24 23:36:39.055593 kubelet[2195]: E0424 23:36:39.055477 2195 kubelet_node_status.go:392] "Error getting the current node from lister" err="node \"ci-4081-3-6-n-6f01bfed3c\" not found" Apr 24 23:36:39.156445 kubelet[2195]: E0424 23:36:39.156311 2195 kubelet_node_status.go:392] "Error getting the current node from lister" err="node \"ci-4081-3-6-n-6f01bfed3c\" not found" Apr 24 23:36:39.257469 kubelet[2195]: E0424 23:36:39.257274 2195 kubelet_node_status.go:392] "Error getting the current node from lister" err="node \"ci-4081-3-6-n-6f01bfed3c\" not found" Apr 24 23:36:39.358174 kubelet[2195]: E0424 23:36:39.358085 2195 kubelet_node_status.go:392] "Error getting the current node from lister" err="node \"ci-4081-3-6-n-6f01bfed3c\" not found" Apr 24 23:36:39.459322 kubelet[2195]: E0424 23:36:39.459238 2195 kubelet_node_status.go:392] "Error getting the current node from lister" err="node \"ci-4081-3-6-n-6f01bfed3c\" not found" Apr 24 
23:36:39.560670 kubelet[2195]: E0424 23:36:39.560479 2195 kubelet_node_status.go:392] "Error getting the current node from lister" err="node \"ci-4081-3-6-n-6f01bfed3c\" not found" Apr 24 23:36:39.661366 kubelet[2195]: E0424 23:36:39.661304 2195 kubelet_node_status.go:392] "Error getting the current node from lister" err="node \"ci-4081-3-6-n-6f01bfed3c\" not found" Apr 24 23:36:39.709989 kubelet[2195]: E0424 23:36:39.709872 2195 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081-3-6-n-6f01bfed3c\" not found" node="ci-4081-3-6-n-6f01bfed3c" Apr 24 23:36:39.710797 kubelet[2195]: E0424 23:36:39.710461 2195 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081-3-6-n-6f01bfed3c\" not found" node="ci-4081-3-6-n-6f01bfed3c" Apr 24 23:36:39.762023 kubelet[2195]: E0424 23:36:39.761966 2195 kubelet_node_status.go:392] "Error getting the current node from lister" err="node \"ci-4081-3-6-n-6f01bfed3c\" not found" Apr 24 23:36:39.862817 kubelet[2195]: E0424 23:36:39.862643 2195 kubelet_node_status.go:392] "Error getting the current node from lister" err="node \"ci-4081-3-6-n-6f01bfed3c\" not found" Apr 24 23:36:39.950493 kubelet[2195]: I0424 23:36:39.950450 2195 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4081-3-6-n-6f01bfed3c" Apr 24 23:36:39.956707 kubelet[2195]: I0424 23:36:39.956677 2195 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4081-3-6-n-6f01bfed3c" Apr 24 23:36:39.960391 kubelet[2195]: I0424 23:36:39.960344 2195 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4081-3-6-n-6f01bfed3c" Apr 24 23:36:40.629275 kubelet[2195]: I0424 23:36:40.629221 2195 apiserver.go:52] "Watching apiserver" Apr 24 23:36:40.651085 kubelet[2195]: I0424 23:36:40.651033 2195 desired_state_of_world_populator.go:154] "Finished 
populating initial desired state of world" Apr 24 23:36:40.921462 systemd[1]: Reloading requested from client PID 2475 ('systemctl') (unit session-7.scope)... Apr 24 23:36:40.921489 systemd[1]: Reloading... Apr 24 23:36:41.010474 zram_generator::config[2513]: No configuration found. Apr 24 23:36:41.098249 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Apr 24 23:36:41.172660 systemd[1]: Reloading finished in 250 ms. Apr 24 23:36:41.219891 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Apr 24 23:36:41.220308 kubelet[2195]: I0424 23:36:41.220032 2195 dynamic_cafile_content.go:175] "Shutting down controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Apr 24 23:36:41.230334 systemd[1]: kubelet.service: Deactivated successfully. Apr 24 23:36:41.230539 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Apr 24 23:36:41.235827 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Apr 24 23:36:41.379783 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Apr 24 23:36:41.384392 (kubelet)[2565]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Apr 24 23:36:41.427366 kubelet[2565]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Apr 24 23:36:41.434141 kubelet[2565]: I0424 23:36:41.434087 2565 server.go:525] "Kubelet version" kubeletVersion="v1.35.1" Apr 24 23:36:41.434141 kubelet[2565]: I0424 23:36:41.434136 2565 server.go:527] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 24 23:36:41.434237 kubelet[2565]: I0424 23:36:41.434155 2565 watchdog_linux.go:95] "Systemd watchdog is not enabled" Apr 24 23:36:41.434237 kubelet[2565]: I0424 23:36:41.434161 2565 watchdog_linux.go:138] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 24 23:36:41.434383 kubelet[2565]: I0424 23:36:41.434365 2565 server.go:951] "Client rotation is on, will bootstrap in background" Apr 24 23:36:41.435512 kubelet[2565]: I0424 23:36:41.435494 2565 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-client-current.pem" Apr 24 23:36:41.437147 kubelet[2565]: I0424 23:36:41.437040 2565 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Apr 24 23:36:41.442799 kubelet[2565]: E0424 23:36:41.442780 2565 log.go:32] "RuntimeConfig from runtime service failed" err="rpc error: code = Unimplemented desc = unknown method RuntimeConfig for service runtime.v1.RuntimeService" Apr 24 23:36:41.442942 kubelet[2565]: I0424 23:36:41.442934 2565 server.go:1395] "CRI implementation should be updated to support RuntimeConfig. Falling back to using cgroupDriver from kubelet config." Apr 24 23:36:41.446288 kubelet[2565]: I0424 23:36:41.445774 2565 server.go:775] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
Defaulting to /" Apr 24 23:36:41.446288 kubelet[2565]: I0424 23:36:41.445948 2565 container_manager_linux.go:272] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 24 23:36:41.446288 kubelet[2565]: I0424 23:36:41.445964 2565 container_manager_linux.go:277] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4081-3-6-n-6f01bfed3c","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 24 23:36:41.446288 kubelet[2565]: I0424 23:36:41.446056 2565 topology_manager.go:143] "Creating topology manager with none policy" Apr 24 
23:36:41.449176 kubelet[2565]: I0424 23:36:41.446063 2565 container_manager_linux.go:308] "Creating device plugin manager" Apr 24 23:36:41.449176 kubelet[2565]: I0424 23:36:41.446079 2565 container_manager_linux.go:317] "Creating Dynamic Resource Allocation (DRA) manager" Apr 24 23:36:41.449176 kubelet[2565]: I0424 23:36:41.446209 2565 state_mem.go:41] "Initialized" logger="CPUManager state memory" Apr 24 23:36:41.449176 kubelet[2565]: I0424 23:36:41.446334 2565 kubelet.go:482] "Attempting to sync node with API server" Apr 24 23:36:41.449176 kubelet[2565]: I0424 23:36:41.446347 2565 kubelet.go:383] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 24 23:36:41.449176 kubelet[2565]: I0424 23:36:41.446357 2565 kubelet.go:394] "Adding apiserver pod source" Apr 24 23:36:41.449176 kubelet[2565]: I0424 23:36:41.446364 2565 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 24 23:36:41.449176 kubelet[2565]: I0424 23:36:41.447515 2565 kuberuntime_manager.go:294] "Container runtime initialized" containerRuntime="containerd" version="v1.7.21" apiVersion="v1" Apr 24 23:36:41.449176 kubelet[2565]: I0424 23:36:41.448611 2565 kubelet.go:943] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 24 23:36:41.449176 kubelet[2565]: I0424 23:36:41.448630 2565 kubelet.go:970] "Not starting PodCertificateRequest manager because we are in static kubelet mode or the PodCertificateProjection feature gate is disabled" Apr 24 23:36:41.454108 kubelet[2565]: I0424 23:36:41.454094 2565 server.go:1257] "Started kubelet" Apr 24 23:36:41.455313 kubelet[2565]: I0424 23:36:41.455299 2565 fs_resource_analyzer.go:69] "Starting FS ResourceAnalyzer" Apr 24 23:36:41.465528 kubelet[2565]: I0424 23:36:41.465369 2565 server.go:182] "Starting to listen" address="0.0.0.0" port=10250 Apr 24 23:36:41.466381 kubelet[2565]: I0424 23:36:41.466363 2565 server.go:317] "Adding debug handlers 
to kubelet server" Apr 24 23:36:41.467192 kubelet[2565]: I0424 23:36:41.467154 2565 volume_manager.go:311] "Starting Kubelet Volume Manager" Apr 24 23:36:41.467680 kubelet[2565]: I0424 23:36:41.467599 2565 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Apr 24 23:36:41.467728 kubelet[2565]: I0424 23:36:41.467719 2565 reconciler.go:29] "Reconciler: start to sync state" Apr 24 23:36:41.473158 kubelet[2565]: I0424 23:36:41.473107 2565 ratelimit.go:56] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 24 23:36:41.473459 kubelet[2565]: I0424 23:36:41.473231 2565 server_v1.go:49] "podresources" method="list" useActivePods=true Apr 24 23:36:41.473459 kubelet[2565]: I0424 23:36:41.473378 2565 server.go:254] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 24 23:36:41.473680 kubelet[2565]: I0424 23:36:41.473668 2565 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Apr 24 23:36:41.473741 kubelet[2565]: I0424 23:36:41.473721 2565 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv4" Apr 24 23:36:41.475439 kubelet[2565]: I0424 23:36:41.475405 2565 kubelet_network_linux.go:54] "Initialized iptables rules." 
protocol="IPv6" Apr 24 23:36:41.475485 kubelet[2565]: I0424 23:36:41.475443 2565 status_manager.go:249] "Starting to sync pod status with apiserver" Apr 24 23:36:41.475485 kubelet[2565]: I0424 23:36:41.475462 2565 kubelet.go:2501] "Starting kubelet main sync loop" Apr 24 23:36:41.475526 kubelet[2565]: E0424 23:36:41.475497 2565 kubelet.go:2525] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Apr 24 23:36:41.475783 kubelet[2565]: I0424 23:36:41.475771 2565 factory.go:223] Registration of the systemd container factory successfully Apr 24 23:36:41.475939 kubelet[2565]: I0424 23:36:41.475923 2565 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Apr 24 23:36:41.482283 kubelet[2565]: I0424 23:36:41.481975 2565 factory.go:223] Registration of the containerd container factory successfully Apr 24 23:36:41.522433 kubelet[2565]: I0424 23:36:41.522261 2565 cpu_manager.go:225] "Starting" policy="none" Apr 24 23:36:41.522433 kubelet[2565]: I0424 23:36:41.522273 2565 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Apr 24 23:36:41.522433 kubelet[2565]: I0424 23:36:41.522286 2565 state_mem.go:41] "Initialized" logger="CPUManager state checkpoint.CPUManager state memory" Apr 24 23:36:41.522433 kubelet[2565]: I0424 23:36:41.522378 2565 state_mem.go:94] "Updated default CPUSet" logger="CPUManager state checkpoint.CPUManager state memory" cpuSet="" Apr 24 23:36:41.522433 kubelet[2565]: I0424 23:36:41.522385 2565 state_mem.go:102] "Updated CPUSet assignments" logger="CPUManager state checkpoint.CPUManager state memory" assignments={} Apr 24 23:36:41.522433 kubelet[2565]: I0424 23:36:41.522398 2565 policy_none.go:50] "Start" Apr 24 23:36:41.523302 kubelet[2565]: I0424 23:36:41.522405 2565 memory_manager.go:187] "Starting memorymanager" 
policy="None" Apr 24 23:36:41.523324 kubelet[2565]: I0424 23:36:41.523302 2565 state_mem.go:36] "Initializing new in-memory state store" logger="Memory Manager state checkpoint" Apr 24 23:36:41.523655 kubelet[2565]: I0424 23:36:41.523644 2565 state_mem.go:77] "Updated machine memory state" logger="Memory Manager state checkpoint" Apr 24 23:36:41.523655 kubelet[2565]: I0424 23:36:41.523655 2565 policy_none.go:44] "Start" Apr 24 23:36:41.530633 kubelet[2565]: E0424 23:36:41.530002 2565 manager.go:525] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 24 23:36:41.530633 kubelet[2565]: I0424 23:36:41.530137 2565 eviction_manager.go:194] "Eviction manager: starting control loop" Apr 24 23:36:41.530633 kubelet[2565]: I0424 23:36:41.530149 2565 container_log_manager.go:146] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 24 23:36:41.530633 kubelet[2565]: I0424 23:36:41.530291 2565 plugin_manager.go:121] "Starting Kubelet Plugin Manager" Apr 24 23:36:41.532556 kubelet[2565]: E0424 23:36:41.532537 2565 eviction_manager.go:272] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" Apr 24 23:36:41.576882 kubelet[2565]: I0424 23:36:41.576857 2565 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4081-3-6-n-6f01bfed3c" Apr 24 23:36:41.577086 kubelet[2565]: I0424 23:36:41.576857 2565 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4081-3-6-n-6f01bfed3c" Apr 24 23:36:41.577351 kubelet[2565]: I0424 23:36:41.576938 2565 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4081-3-6-n-6f01bfed3c" Apr 24 23:36:41.583245 kubelet[2565]: E0424 23:36:41.583209 2565 kubelet.go:3342] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4081-3-6-n-6f01bfed3c\" already exists" pod="kube-system/kube-scheduler-ci-4081-3-6-n-6f01bfed3c" Apr 24 23:36:41.583862 kubelet[2565]: E0424 23:36:41.583838 2565 kubelet.go:3342] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4081-3-6-n-6f01bfed3c\" already exists" pod="kube-system/kube-controller-manager-ci-4081-3-6-n-6f01bfed3c" Apr 24 23:36:41.584080 kubelet[2565]: E0424 23:36:41.584038 2565 kubelet.go:3342] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4081-3-6-n-6f01bfed3c\" already exists" pod="kube-system/kube-apiserver-ci-4081-3-6-n-6f01bfed3c" Apr 24 23:36:41.637791 kubelet[2565]: I0424 23:36:41.637716 2565 kubelet_node_status.go:74] "Attempting to register node" node="ci-4081-3-6-n-6f01bfed3c" Apr 24 23:36:41.648973 kubelet[2565]: I0424 23:36:41.648336 2565 kubelet_node_status.go:123] "Node was previously registered" node="ci-4081-3-6-n-6f01bfed3c" Apr 24 23:36:41.648973 kubelet[2565]: I0424 23:36:41.648468 2565 kubelet_node_status.go:77] "Successfully registered node" node="ci-4081-3-6-n-6f01bfed3c" Apr 24 23:36:41.770085 kubelet[2565]: I0424 23:36:41.768538 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: 
\"kubernetes.io/host-path/0a16de29227549d28bda0400ad93cb7a-kubeconfig\") pod \"kube-scheduler-ci-4081-3-6-n-6f01bfed3c\" (UID: \"0a16de29227549d28bda0400ad93cb7a\") " pod="kube-system/kube-scheduler-ci-4081-3-6-n-6f01bfed3c" Apr 24 23:36:41.770085 kubelet[2565]: I0424 23:36:41.768574 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/a96190661d8a86a0d9d18fc3a2074e2c-ca-certs\") pod \"kube-apiserver-ci-4081-3-6-n-6f01bfed3c\" (UID: \"a96190661d8a86a0d9d18fc3a2074e2c\") " pod="kube-system/kube-apiserver-ci-4081-3-6-n-6f01bfed3c" Apr 24 23:36:41.770085 kubelet[2565]: I0424 23:36:41.768586 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/3e00f1c7c8486290ecb87066cb7eab88-flexvolume-dir\") pod \"kube-controller-manager-ci-4081-3-6-n-6f01bfed3c\" (UID: \"3e00f1c7c8486290ecb87066cb7eab88\") " pod="kube-system/kube-controller-manager-ci-4081-3-6-n-6f01bfed3c" Apr 24 23:36:41.770085 kubelet[2565]: I0424 23:36:41.768600 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/3e00f1c7c8486290ecb87066cb7eab88-k8s-certs\") pod \"kube-controller-manager-ci-4081-3-6-n-6f01bfed3c\" (UID: \"3e00f1c7c8486290ecb87066cb7eab88\") " pod="kube-system/kube-controller-manager-ci-4081-3-6-n-6f01bfed3c" Apr 24 23:36:41.770085 kubelet[2565]: I0424 23:36:41.768614 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/3e00f1c7c8486290ecb87066cb7eab88-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4081-3-6-n-6f01bfed3c\" (UID: \"3e00f1c7c8486290ecb87066cb7eab88\") " pod="kube-system/kube-controller-manager-ci-4081-3-6-n-6f01bfed3c" Apr 24 23:36:41.770303 
kubelet[2565]: I0424 23:36:41.768626 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/a96190661d8a86a0d9d18fc3a2074e2c-k8s-certs\") pod \"kube-apiserver-ci-4081-3-6-n-6f01bfed3c\" (UID: \"a96190661d8a86a0d9d18fc3a2074e2c\") " pod="kube-system/kube-apiserver-ci-4081-3-6-n-6f01bfed3c" Apr 24 23:36:41.770303 kubelet[2565]: I0424 23:36:41.768644 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/a96190661d8a86a0d9d18fc3a2074e2c-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4081-3-6-n-6f01bfed3c\" (UID: \"a96190661d8a86a0d9d18fc3a2074e2c\") " pod="kube-system/kube-apiserver-ci-4081-3-6-n-6f01bfed3c" Apr 24 23:36:41.770303 kubelet[2565]: I0424 23:36:41.768656 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/3e00f1c7c8486290ecb87066cb7eab88-ca-certs\") pod \"kube-controller-manager-ci-4081-3-6-n-6f01bfed3c\" (UID: \"3e00f1c7c8486290ecb87066cb7eab88\") " pod="kube-system/kube-controller-manager-ci-4081-3-6-n-6f01bfed3c" Apr 24 23:36:41.770303 kubelet[2565]: I0424 23:36:41.768668 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/3e00f1c7c8486290ecb87066cb7eab88-kubeconfig\") pod \"kube-controller-manager-ci-4081-3-6-n-6f01bfed3c\" (UID: \"3e00f1c7c8486290ecb87066cb7eab88\") " pod="kube-system/kube-controller-manager-ci-4081-3-6-n-6f01bfed3c" Apr 24 23:36:42.447725 kubelet[2565]: I0424 23:36:42.447473 2565 apiserver.go:52] "Watching apiserver" Apr 24 23:36:42.468461 kubelet[2565]: I0424 23:36:42.468391 2565 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Apr 24 23:36:42.515711 kubelet[2565]: I0424 
23:36:42.513447 2565 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4081-3-6-n-6f01bfed3c" Apr 24 23:36:42.515711 kubelet[2565]: I0424 23:36:42.513699 2565 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4081-3-6-n-6f01bfed3c" Apr 24 23:36:42.523218 kubelet[2565]: E0424 23:36:42.523185 2565 kubelet.go:3342] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4081-3-6-n-6f01bfed3c\" already exists" pod="kube-system/kube-apiserver-ci-4081-3-6-n-6f01bfed3c" Apr 24 23:36:42.523798 kubelet[2565]: E0424 23:36:42.523787 2565 kubelet.go:3342] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4081-3-6-n-6f01bfed3c\" already exists" pod="kube-system/kube-scheduler-ci-4081-3-6-n-6f01bfed3c" Apr 24 23:36:43.550950 kubelet[2565]: I0424 23:36:43.550814 2565 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4081-3-6-n-6f01bfed3c" podStartSLOduration=4.550801019 podStartE2EDuration="4.550801019s" podCreationTimestamp="2026-04-24 23:36:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 23:36:43.538556289 +0000 UTC m=+2.150615093" watchObservedRunningTime="2026-04-24 23:36:43.550801019 +0000 UTC m=+2.162859813" Apr 24 23:36:43.559633 kubelet[2565]: I0424 23:36:43.559539 2565 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4081-3-6-n-6f01bfed3c" podStartSLOduration=4.559529787 podStartE2EDuration="4.559529787s" podCreationTimestamp="2026-04-24 23:36:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 23:36:43.55195304 +0000 UTC m=+2.164011844" watchObservedRunningTime="2026-04-24 23:36:43.559529787 +0000 UTC m=+2.171588591" Apr 24 23:36:43.559633 kubelet[2565]: I0424 
23:36:43.559602 2565 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4081-3-6-n-6f01bfed3c" podStartSLOduration=4.559599747 podStartE2EDuration="4.559599747s" podCreationTimestamp="2026-04-24 23:36:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 23:36:43.559389346 +0000 UTC m=+2.171448140" watchObservedRunningTime="2026-04-24 23:36:43.559599747 +0000 UTC m=+2.171658541" Apr 24 23:36:47.484400 kubelet[2565]: I0424 23:36:47.484361 2565 kuberuntime_manager.go:2062] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Apr 24 23:36:47.484823 containerd[1519]: time="2026-04-24T23:36:47.484784057Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Apr 24 23:36:47.485029 kubelet[2565]: I0424 23:36:47.484974 2565 kubelet_network.go:47] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Apr 24 23:36:48.679064 systemd[1]: Created slice kubepods-besteffort-pod4ab1b1cc_cb20_4a9f_a1d0_1e62c6045d98.slice - libcontainer container kubepods-besteffort-pod4ab1b1cc_cb20_4a9f_a1d0_1e62c6045d98.slice. 
Apr 24 23:36:48.716696 kubelet[2565]: I0424 23:36:48.716542 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/4ab1b1cc-cb20-4a9f-a1d0-1e62c6045d98-xtables-lock\") pod \"kube-proxy-nfszg\" (UID: \"4ab1b1cc-cb20-4a9f-a1d0-1e62c6045d98\") " pod="kube-system/kube-proxy-nfszg" Apr 24 23:36:48.716696 kubelet[2565]: I0424 23:36:48.716577 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4ab1b1cc-cb20-4a9f-a1d0-1e62c6045d98-lib-modules\") pod \"kube-proxy-nfszg\" (UID: \"4ab1b1cc-cb20-4a9f-a1d0-1e62c6045d98\") " pod="kube-system/kube-proxy-nfszg" Apr 24 23:36:48.716696 kubelet[2565]: I0424 23:36:48.716589 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snjl7\" (UniqueName: \"kubernetes.io/projected/4ab1b1cc-cb20-4a9f-a1d0-1e62c6045d98-kube-api-access-snjl7\") pod \"kube-proxy-nfszg\" (UID: \"4ab1b1cc-cb20-4a9f-a1d0-1e62c6045d98\") " pod="kube-system/kube-proxy-nfszg" Apr 24 23:36:48.716696 kubelet[2565]: I0424 23:36:48.716610 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/4ab1b1cc-cb20-4a9f-a1d0-1e62c6045d98-kube-proxy\") pod \"kube-proxy-nfszg\" (UID: \"4ab1b1cc-cb20-4a9f-a1d0-1e62c6045d98\") " pod="kube-system/kube-proxy-nfszg" Apr 24 23:36:48.769189 systemd[1]: Created slice kubepods-besteffort-pod614611a9_22cd_424e_999a_1cb3e3f2aa5f.slice - libcontainer container kubepods-besteffort-pod614611a9_22cd_424e_999a_1cb3e3f2aa5f.slice. 
Apr 24 23:36:48.817769 kubelet[2565]: I0424 23:36:48.817678 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgn24\" (UniqueName: \"kubernetes.io/projected/614611a9-22cd-424e-999a-1cb3e3f2aa5f-kube-api-access-hgn24\") pod \"tigera-operator-6cf4cccc57-2gkdq\" (UID: \"614611a9-22cd-424e-999a-1cb3e3f2aa5f\") " pod="tigera-operator/tigera-operator-6cf4cccc57-2gkdq" Apr 24 23:36:48.817769 kubelet[2565]: I0424 23:36:48.817761 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/614611a9-22cd-424e-999a-1cb3e3f2aa5f-var-lib-calico\") pod \"tigera-operator-6cf4cccc57-2gkdq\" (UID: \"614611a9-22cd-424e-999a-1cb3e3f2aa5f\") " pod="tigera-operator/tigera-operator-6cf4cccc57-2gkdq" Apr 24 23:36:48.992084 containerd[1519]: time="2026-04-24T23:36:48.991518592Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-nfszg,Uid:4ab1b1cc-cb20-4a9f-a1d0-1e62c6045d98,Namespace:kube-system,Attempt:0,}" Apr 24 23:36:49.035497 containerd[1519]: time="2026-04-24T23:36:49.034891628Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Apr 24 23:36:49.035497 containerd[1519]: time="2026-04-24T23:36:49.035060308Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Apr 24 23:36:49.035497 containerd[1519]: time="2026-04-24T23:36:49.035116478Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 24 23:36:49.036310 containerd[1519]: time="2026-04-24T23:36:49.036129589Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 24 23:36:49.060566 systemd[1]: Started cri-containerd-0038b659827f9790fb76eadb029d70f868139fd38a0e2b8ef27ba1228fffb1cb.scope - libcontainer container 0038b659827f9790fb76eadb029d70f868139fd38a0e2b8ef27ba1228fffb1cb. Apr 24 23:36:49.074698 containerd[1519]: time="2026-04-24T23:36:49.074426841Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-6cf4cccc57-2gkdq,Uid:614611a9-22cd-424e-999a-1cb3e3f2aa5f,Namespace:tigera-operator,Attempt:0,}" Apr 24 23:36:49.083700 containerd[1519]: time="2026-04-24T23:36:49.083672599Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-nfszg,Uid:4ab1b1cc-cb20-4a9f-a1d0-1e62c6045d98,Namespace:kube-system,Attempt:0,} returns sandbox id \"0038b659827f9790fb76eadb029d70f868139fd38a0e2b8ef27ba1228fffb1cb\"" Apr 24 23:36:49.090766 containerd[1519]: time="2026-04-24T23:36:49.090697254Z" level=info msg="CreateContainer within sandbox \"0038b659827f9790fb76eadb029d70f868139fd38a0e2b8ef27ba1228fffb1cb\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Apr 24 23:36:49.099494 containerd[1519]: time="2026-04-24T23:36:49.099339802Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Apr 24 23:36:49.099494 containerd[1519]: time="2026-04-24T23:36:49.099467192Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Apr 24 23:36:49.100182 containerd[1519]: time="2026-04-24T23:36:49.099487752Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 24 23:36:49.100182 containerd[1519]: time="2026-04-24T23:36:49.099695532Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 24 23:36:49.111588 containerd[1519]: time="2026-04-24T23:36:49.111552512Z" level=info msg="CreateContainer within sandbox \"0038b659827f9790fb76eadb029d70f868139fd38a0e2b8ef27ba1228fffb1cb\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"994dfbce26cbb5e1a8855ddcc635e7736974e87832d64ebb864f5fd628000ad4\"" Apr 24 23:36:49.113864 containerd[1519]: time="2026-04-24T23:36:49.113768534Z" level=info msg="StartContainer for \"994dfbce26cbb5e1a8855ddcc635e7736974e87832d64ebb864f5fd628000ad4\"" Apr 24 23:36:49.119253 systemd[1]: Started cri-containerd-0c87031f46f54f9786038c0b1e9492bf7e2ef5adf2e0d0fc0b80cea16e916248.scope - libcontainer container 0c87031f46f54f9786038c0b1e9492bf7e2ef5adf2e0d0fc0b80cea16e916248. Apr 24 23:36:49.141535 systemd[1]: Started cri-containerd-994dfbce26cbb5e1a8855ddcc635e7736974e87832d64ebb864f5fd628000ad4.scope - libcontainer container 994dfbce26cbb5e1a8855ddcc635e7736974e87832d64ebb864f5fd628000ad4. 
Apr 24 23:36:49.158011 containerd[1519]: time="2026-04-24T23:36:49.157871160Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-6cf4cccc57-2gkdq,Uid:614611a9-22cd-424e-999a-1cb3e3f2aa5f,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"0c87031f46f54f9786038c0b1e9492bf7e2ef5adf2e0d0fc0b80cea16e916248\"" Apr 24 23:36:49.159649 containerd[1519]: time="2026-04-24T23:36:49.159536572Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.40.7\"" Apr 24 23:36:49.173142 containerd[1519]: time="2026-04-24T23:36:49.173109513Z" level=info msg="StartContainer for \"994dfbce26cbb5e1a8855ddcc635e7736974e87832d64ebb864f5fd628000ad4\" returns successfully" Apr 24 23:36:49.549378 kubelet[2565]: I0424 23:36:49.549330 2565 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="kube-system/kube-proxy-nfszg" podStartSLOduration=1.549319016 podStartE2EDuration="1.549319016s" podCreationTimestamp="2026-04-24 23:36:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 23:36:49.549061126 +0000 UTC m=+8.161119930" watchObservedRunningTime="2026-04-24 23:36:49.549319016 +0000 UTC m=+8.161377820" Apr 24 23:36:51.055678 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount307579227.mount: Deactivated successfully. 
Apr 24 23:36:51.811845 containerd[1519]: time="2026-04-24T23:36:51.811784901Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.40.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 24 23:36:51.812987 containerd[1519]: time="2026-04-24T23:36:51.812863352Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.40.7: active requests=0, bytes read=40846156" Apr 24 23:36:51.813948 containerd[1519]: time="2026-04-24T23:36:51.813835163Z" level=info msg="ImageCreate event name:\"sha256:de04da31b5feb10fd313c39b7ac72d47ce9b5b8eb06161142e2e2283059a52c2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 24 23:36:51.815651 containerd[1519]: time="2026-04-24T23:36:51.815619164Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:53260704fc6e638633b243729411222e01e1898647352a6e1a09cc046887973a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 24 23:36:51.816561 containerd[1519]: time="2026-04-24T23:36:51.816071195Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.40.7\" with image id \"sha256:de04da31b5feb10fd313c39b7ac72d47ce9b5b8eb06161142e2e2283059a52c2\", repo tag \"quay.io/tigera/operator:v1.40.7\", repo digest \"quay.io/tigera/operator@sha256:53260704fc6e638633b243729411222e01e1898647352a6e1a09cc046887973a\", size \"40842151\" in 2.656513973s" Apr 24 23:36:51.816561 containerd[1519]: time="2026-04-24T23:36:51.816104495Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.40.7\" returns image reference \"sha256:de04da31b5feb10fd313c39b7ac72d47ce9b5b8eb06161142e2e2283059a52c2\"" Apr 24 23:36:51.821485 containerd[1519]: time="2026-04-24T23:36:51.821454019Z" level=info msg="CreateContainer within sandbox \"0c87031f46f54f9786038c0b1e9492bf7e2ef5adf2e0d0fc0b80cea16e916248\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Apr 24 23:36:51.832169 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2191779222.mount: Deactivated successfully. 
Apr 24 23:36:51.833368 containerd[1519]: time="2026-04-24T23:36:51.833325209Z" level=info msg="CreateContainer within sandbox \"0c87031f46f54f9786038c0b1e9492bf7e2ef5adf2e0d0fc0b80cea16e916248\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"182efefe1d594187d82e88a3b2a8af6b9d13f97480ae0696c2569614f57738e7\"" Apr 24 23:36:51.833987 containerd[1519]: time="2026-04-24T23:36:51.833757850Z" level=info msg="StartContainer for \"182efefe1d594187d82e88a3b2a8af6b9d13f97480ae0696c2569614f57738e7\"" Apr 24 23:36:51.862534 systemd[1]: Started cri-containerd-182efefe1d594187d82e88a3b2a8af6b9d13f97480ae0696c2569614f57738e7.scope - libcontainer container 182efefe1d594187d82e88a3b2a8af6b9d13f97480ae0696c2569614f57738e7. Apr 24 23:36:51.885104 containerd[1519]: time="2026-04-24T23:36:51.885009562Z" level=info msg="StartContainer for \"182efefe1d594187d82e88a3b2a8af6b9d13f97480ae0696c2569614f57738e7\" returns successfully" Apr 24 23:36:56.202396 systemd-timesyncd[1435]: Contacted time server 46.21.2.169:123 (2.flatcar.pool.ntp.org). Apr 24 23:36:56.202614 systemd-timesyncd[1435]: Initial clock synchronization to Fri 2026-04-24 23:36:56.522673 UTC. Apr 24 23:36:56.795160 sudo[1708]: pam_unix(sudo:session): session closed for user root Apr 24 23:36:56.829606 sshd[1705]: pam_unix(sshd:session): session closed for user core Apr 24 23:36:56.834246 systemd[1]: sshd@6-65.21.181.31:22-4.175.71.9:32948.service: Deactivated successfully. Apr 24 23:36:56.837079 systemd[1]: session-7.scope: Deactivated successfully. Apr 24 23:36:56.838498 systemd[1]: session-7.scope: Consumed 2.925s CPU time, 161.1M memory peak, 0B memory swap peak. Apr 24 23:36:56.841092 systemd-logind[1501]: Session 7 logged out. Waiting for processes to exit. Apr 24 23:36:56.843647 systemd-logind[1501]: Removed session 7. 
Apr 24 23:36:58.531457 kubelet[2565]: I0424 23:36:58.531353 2565 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="tigera-operator/tigera-operator-6cf4cccc57-2gkdq" podStartSLOduration=7.873458294 podStartE2EDuration="10.531339889s" podCreationTimestamp="2026-04-24 23:36:48 +0000 UTC" firstStartedPulling="2026-04-24 23:36:49.159113911 +0000 UTC m=+7.771172705" lastFinishedPulling="2026-04-24 23:36:51.816995506 +0000 UTC m=+10.429054300" observedRunningTime="2026-04-24 23:36:52.562692497 +0000 UTC m=+11.174751341" watchObservedRunningTime="2026-04-24 23:36:58.531339889 +0000 UTC m=+17.143398694" Apr 24 23:36:58.542875 systemd[1]: Created slice kubepods-besteffort-pod0aa6fa68_a301_4546_b8a8_91ae82ea93d2.slice - libcontainer container kubepods-besteffort-pod0aa6fa68_a301_4546_b8a8_91ae82ea93d2.slice. Apr 24 23:36:58.587971 kubelet[2565]: I0424 23:36:58.587860 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/0aa6fa68-a301-4546-b8a8-91ae82ea93d2-typha-certs\") pod \"calico-typha-6c85bdfbb6-hz5tb\" (UID: \"0aa6fa68-a301-4546-b8a8-91ae82ea93d2\") " pod="calico-system/calico-typha-6c85bdfbb6-hz5tb" Apr 24 23:36:58.587971 kubelet[2565]: I0424 23:36:58.587888 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0aa6fa68-a301-4546-b8a8-91ae82ea93d2-tigera-ca-bundle\") pod \"calico-typha-6c85bdfbb6-hz5tb\" (UID: \"0aa6fa68-a301-4546-b8a8-91ae82ea93d2\") " pod="calico-system/calico-typha-6c85bdfbb6-hz5tb" Apr 24 23:36:58.587971 kubelet[2565]: I0424 23:36:58.587901 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqb6x\" (UniqueName: \"kubernetes.io/projected/0aa6fa68-a301-4546-b8a8-91ae82ea93d2-kube-api-access-hqb6x\") pod \"calico-typha-6c85bdfbb6-hz5tb\" (UID: 
\"0aa6fa68-a301-4546-b8a8-91ae82ea93d2\") " pod="calico-system/calico-typha-6c85bdfbb6-hz5tb" Apr 24 23:36:58.611311 systemd[1]: Created slice kubepods-besteffort-pod122ece37_52f9_4bd5_94de_b46e4cbcc814.slice - libcontainer container kubepods-besteffort-pod122ece37_52f9_4bd5_94de_b46e4cbcc814.slice. Apr 24 23:36:58.688436 kubelet[2565]: I0424 23:36:58.688376 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/122ece37-52f9-4bd5-94de-b46e4cbcc814-policysync\") pod \"calico-node-h74r6\" (UID: \"122ece37-52f9-4bd5-94de-b46e4cbcc814\") " pod="calico-system/calico-node-h74r6" Apr 24 23:36:58.689227 kubelet[2565]: I0424 23:36:58.688582 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/122ece37-52f9-4bd5-94de-b46e4cbcc814-var-lib-calico\") pod \"calico-node-h74r6\" (UID: \"122ece37-52f9-4bd5-94de-b46e4cbcc814\") " pod="calico-system/calico-node-h74r6" Apr 24 23:36:58.689227 kubelet[2565]: I0424 23:36:58.688605 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/122ece37-52f9-4bd5-94de-b46e4cbcc814-node-certs\") pod \"calico-node-h74r6\" (UID: \"122ece37-52f9-4bd5-94de-b46e4cbcc814\") " pod="calico-system/calico-node-h74r6" Apr 24 23:36:58.689227 kubelet[2565]: I0424 23:36:58.688625 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/122ece37-52f9-4bd5-94de-b46e4cbcc814-cni-log-dir\") pod \"calico-node-h74r6\" (UID: \"122ece37-52f9-4bd5-94de-b46e4cbcc814\") " pod="calico-system/calico-node-h74r6" Apr 24 23:36:58.689227 kubelet[2565]: I0424 23:36:58.688640 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nodeproc\" 
(UniqueName: \"kubernetes.io/host-path/122ece37-52f9-4bd5-94de-b46e4cbcc814-nodeproc\") pod \"calico-node-h74r6\" (UID: \"122ece37-52f9-4bd5-94de-b46e4cbcc814\") " pod="calico-system/calico-node-h74r6" Apr 24 23:36:58.689227 kubelet[2565]: I0424 23:36:58.688657 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/122ece37-52f9-4bd5-94de-b46e4cbcc814-tigera-ca-bundle\") pod \"calico-node-h74r6\" (UID: \"122ece37-52f9-4bd5-94de-b46e4cbcc814\") " pod="calico-system/calico-node-h74r6" Apr 24 23:36:58.689364 kubelet[2565]: I0424 23:36:58.688667 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/122ece37-52f9-4bd5-94de-b46e4cbcc814-cni-bin-dir\") pod \"calico-node-h74r6\" (UID: \"122ece37-52f9-4bd5-94de-b46e4cbcc814\") " pod="calico-system/calico-node-h74r6" Apr 24 23:36:58.689364 kubelet[2565]: I0424 23:36:58.688677 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/122ece37-52f9-4bd5-94de-b46e4cbcc814-lib-modules\") pod \"calico-node-h74r6\" (UID: \"122ece37-52f9-4bd5-94de-b46e4cbcc814\") " pod="calico-system/calico-node-h74r6" Apr 24 23:36:58.689364 kubelet[2565]: I0424 23:36:58.688687 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/122ece37-52f9-4bd5-94de-b46e4cbcc814-cni-net-dir\") pod \"calico-node-h74r6\" (UID: \"122ece37-52f9-4bd5-94de-b46e4cbcc814\") " pod="calico-system/calico-node-h74r6" Apr 24 23:36:58.689364 kubelet[2565]: I0424 23:36:58.688724 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/122ece37-52f9-4bd5-94de-b46e4cbcc814-sys-fs\") pod 
\"calico-node-h74r6\" (UID: \"122ece37-52f9-4bd5-94de-b46e4cbcc814\") " pod="calico-system/calico-node-h74r6" Apr 24 23:36:58.689364 kubelet[2565]: I0424 23:36:58.688736 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bpffs\" (UniqueName: \"kubernetes.io/host-path/122ece37-52f9-4bd5-94de-b46e4cbcc814-bpffs\") pod \"calico-node-h74r6\" (UID: \"122ece37-52f9-4bd5-94de-b46e4cbcc814\") " pod="calico-system/calico-node-h74r6" Apr 24 23:36:58.689471 kubelet[2565]: I0424 23:36:58.688748 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/122ece37-52f9-4bd5-94de-b46e4cbcc814-xtables-lock\") pod \"calico-node-h74r6\" (UID: \"122ece37-52f9-4bd5-94de-b46e4cbcc814\") " pod="calico-system/calico-node-h74r6" Apr 24 23:36:58.689471 kubelet[2565]: I0424 23:36:58.688759 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcdm2\" (UniqueName: \"kubernetes.io/projected/122ece37-52f9-4bd5-94de-b46e4cbcc814-kube-api-access-hcdm2\") pod \"calico-node-h74r6\" (UID: \"122ece37-52f9-4bd5-94de-b46e4cbcc814\") " pod="calico-system/calico-node-h74r6" Apr 24 23:36:58.689471 kubelet[2565]: I0424 23:36:58.688770 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/122ece37-52f9-4bd5-94de-b46e4cbcc814-flexvol-driver-host\") pod \"calico-node-h74r6\" (UID: \"122ece37-52f9-4bd5-94de-b46e4cbcc814\") " pod="calico-system/calico-node-h74r6" Apr 24 23:36:58.689471 kubelet[2565]: I0424 23:36:58.688783 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/122ece37-52f9-4bd5-94de-b46e4cbcc814-var-run-calico\") pod \"calico-node-h74r6\" (UID: 
\"122ece37-52f9-4bd5-94de-b46e4cbcc814\") " pod="calico-system/calico-node-h74r6" Apr 24 23:36:58.706498 kubelet[2565]: E0424 23:36:58.706239 2565 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-kfkvn" podUID="b8636bb0-afe6-4f7c-8c11-499ada35dc8f" Apr 24 23:36:58.790087 kubelet[2565]: I0424 23:36:58.789874 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b8636bb0-afe6-4f7c-8c11-499ada35dc8f-kubelet-dir\") pod \"csi-node-driver-kfkvn\" (UID: \"b8636bb0-afe6-4f7c-8c11-499ada35dc8f\") " pod="calico-system/csi-node-driver-kfkvn" Apr 24 23:36:58.790087 kubelet[2565]: I0424 23:36:58.789921 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b8636bb0-afe6-4f7c-8c11-499ada35dc8f-socket-dir\") pod \"csi-node-driver-kfkvn\" (UID: \"b8636bb0-afe6-4f7c-8c11-499ada35dc8f\") " pod="calico-system/csi-node-driver-kfkvn" Apr 24 23:36:58.790087 kubelet[2565]: I0424 23:36:58.789943 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/b8636bb0-afe6-4f7c-8c11-499ada35dc8f-varrun\") pod \"csi-node-driver-kfkvn\" (UID: \"b8636bb0-afe6-4f7c-8c11-499ada35dc8f\") " pod="calico-system/csi-node-driver-kfkvn" Apr 24 23:36:58.790087 kubelet[2565]: I0424 23:36:58.789958 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b8636bb0-afe6-4f7c-8c11-499ada35dc8f-registration-dir\") pod \"csi-node-driver-kfkvn\" (UID: \"b8636bb0-afe6-4f7c-8c11-499ada35dc8f\") " 
pod="calico-system/csi-node-driver-kfkvn" Apr 24 23:36:58.790087 kubelet[2565]: I0424 23:36:58.790076 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9q7lt\" (UniqueName: \"kubernetes.io/projected/b8636bb0-afe6-4f7c-8c11-499ada35dc8f-kube-api-access-9q7lt\") pod \"csi-node-driver-kfkvn\" (UID: \"b8636bb0-afe6-4f7c-8c11-499ada35dc8f\") " pod="calico-system/csi-node-driver-kfkvn" Apr 24 23:36:58.791788 kubelet[2565]: E0424 23:36:58.791487 2565 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:36:58.791788 kubelet[2565]: W0424 23:36:58.791499 2565 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:36:58.791788 kubelet[2565]: E0424 23:36:58.791513 2565 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:36:58.791902 kubelet[2565]: E0424 23:36:58.791838 2565 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:36:58.791902 kubelet[2565]: W0424 23:36:58.791848 2565 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:36:58.791902 kubelet[2565]: E0424 23:36:58.791859 2565 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:36:58.811301 kubelet[2565]: E0424 23:36:58.811292 2565 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:36:58.811350 kubelet[2565]: W0424 23:36:58.811343 2565 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:36:58.811457 kubelet[2565]: E0424 23:36:58.811378 2565 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:36:58.811679 kubelet[2565]: E0424 23:36:58.811661 2565 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:36:58.811679 kubelet[2565]: W0424 23:36:58.811674 2565 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:36:58.811736 kubelet[2565]: E0424 23:36:58.811683 2565 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:36:58.850227 containerd[1519]: time="2026-04-24T23:36:58.850177202Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-6c85bdfbb6-hz5tb,Uid:0aa6fa68-a301-4546-b8a8-91ae82ea93d2,Namespace:calico-system,Attempt:0,}" Apr 24 23:36:58.871303 containerd[1519]: time="2026-04-24T23:36:58.871222285Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Apr 24 23:36:58.871303 containerd[1519]: time="2026-04-24T23:36:58.871262923Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Apr 24 23:36:58.871303 containerd[1519]: time="2026-04-24T23:36:58.871274185Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 24 23:36:58.872140 containerd[1519]: time="2026-04-24T23:36:58.872107659Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 24 23:36:58.890563 systemd[1]: Started cri-containerd-e10e1c4eb04761713e3f1840838a980dd3cc67df8e1beca43f49414a47fe8152.scope - libcontainer container e10e1c4eb04761713e3f1840838a980dd3cc67df8e1beca43f49414a47fe8152. Apr 24 23:36:58.892040 kubelet[2565]: E0424 23:36:58.892008 2565 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:36:58.892040 kubelet[2565]: W0424 23:36:58.892028 2565 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:36:58.892127 kubelet[2565]: E0424 23:36:58.892045 2565 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:36:58.892655 kubelet[2565]: E0424 23:36:58.892438 2565 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:36:58.892655 kubelet[2565]: W0424 23:36:58.892470 2565 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:36:58.892655 kubelet[2565]: E0424 23:36:58.892480 2565 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:36:58.892655 kubelet[2565]: E0424 23:36:58.892663 2565 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:36:58.893258 kubelet[2565]: W0424 23:36:58.892670 2565 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:36:58.893258 kubelet[2565]: E0424 23:36:58.892681 2565 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:36:58.893258 kubelet[2565]: E0424 23:36:58.892867 2565 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:36:58.893258 kubelet[2565]: W0424 23:36:58.892875 2565 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:36:58.893258 kubelet[2565]: E0424 23:36:58.892885 2565 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:36:58.893258 kubelet[2565]: E0424 23:36:58.893069 2565 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:36:58.893258 kubelet[2565]: W0424 23:36:58.893075 2565 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:36:58.893258 kubelet[2565]: E0424 23:36:58.893082 2565 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:36:58.893939 kubelet[2565]: E0424 23:36:58.893721 2565 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:36:58.893939 kubelet[2565]: W0424 23:36:58.893731 2565 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:36:58.893939 kubelet[2565]: E0424 23:36:58.893740 2565 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:36:58.894022 kubelet[2565]: E0424 23:36:58.893994 2565 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:36:58.894022 kubelet[2565]: W0424 23:36:58.894001 2565 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:36:58.894022 kubelet[2565]: E0424 23:36:58.894008 2565 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:36:58.894518 kubelet[2565]: E0424 23:36:58.894492 2565 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:36:58.894518 kubelet[2565]: W0424 23:36:58.894512 2565 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:36:58.894574 kubelet[2565]: E0424 23:36:58.894520 2565 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:36:58.895582 kubelet[2565]: E0424 23:36:58.895357 2565 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:36:58.895582 kubelet[2565]: W0424 23:36:58.895367 2565 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:36:58.895582 kubelet[2565]: E0424 23:36:58.895375 2565 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:36:58.918342 containerd[1519]: time="2026-04-24T23:36:58.918309221Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-h74r6,Uid:122ece37-52f9-4bd5-94de-b46e4cbcc814,Namespace:calico-system,Attempt:0,}" Apr 24 23:36:58.942808 containerd[1519]: time="2026-04-24T23:36:58.942746613Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-6c85bdfbb6-hz5tb,Uid:0aa6fa68-a301-4546-b8a8-91ae82ea93d2,Namespace:calico-system,Attempt:0,} returns sandbox id \"e10e1c4eb04761713e3f1840838a980dd3cc67df8e1beca43f49414a47fe8152\"" Apr 24 23:36:58.945103 containerd[1519]: time="2026-04-24T23:36:58.945054677Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.31.4\"" Apr 24 23:36:58.946253 containerd[1519]: time="2026-04-24T23:36:58.946145368Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Apr 24 23:36:58.946253 containerd[1519]: time="2026-04-24T23:36:58.946188055Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Apr 24 23:36:58.946253 containerd[1519]: time="2026-04-24T23:36:58.946206696Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 24 23:36:58.947869 containerd[1519]: time="2026-04-24T23:36:58.947100330Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 24 23:36:58.963555 systemd[1]: Started cri-containerd-6851cbeda857f9a0cb3a1669fb5671016e8e02f41f66d2981a1dd2b1b7914241.scope - libcontainer container 6851cbeda857f9a0cb3a1669fb5671016e8e02f41f66d2981a1dd2b1b7914241. Apr 24 23:36:58.982665 containerd[1519]: time="2026-04-24T23:36:58.982620207Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-h74r6,Uid:122ece37-52f9-4bd5-94de-b46e4cbcc814,Namespace:calico-system,Attempt:0,} returns sandbox id \"6851cbeda857f9a0cb3a1669fb5671016e8e02f41f66d2981a1dd2b1b7914241\"" Apr 24 23:36:59.570584 kubelet[2565]: E0424 23:36:59.570541 2565 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:36:59.570584 kubelet[2565]: W0424 23:36:59.570572 2565 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:36:59.571291 kubelet[2565]: E0424 23:36:59.570601 2565 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:37:00.476067 kubelet[2565]: E0424 23:37:00.475994 2565 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-kfkvn" podUID="b8636bb0-afe6-4f7c-8c11-499ada35dc8f" Apr 24 23:37:00.897136 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1002782059.mount: Deactivated successfully. Apr 24 23:37:00.901962 kubelet[2565]: E0424 23:37:00.901919 2565 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:37:00.901962 kubelet[2565]: W0424 23:37:00.901948 2565 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:37:00.903224 kubelet[2565]: E0424 23:37:00.901970 2565 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:37:00.903224 kubelet[2565]: E0424 23:37:00.902438 2565 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:37:00.903224 kubelet[2565]: W0424 23:37:00.902446 2565 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:37:00.903224 kubelet[2565]: E0424 23:37:00.902454 2565 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:37:00.903224 kubelet[2565]: E0424 23:37:00.902783 2565 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:37:00.903224 kubelet[2565]: W0424 23:37:00.902790 2565 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:37:00.903224 kubelet[2565]: E0424 23:37:00.902797 2565 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:37:00.903224 kubelet[2565]: E0424 23:37:00.903000 2565 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:37:00.903224 kubelet[2565]: W0424 23:37:00.903006 2565 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:37:00.903224 kubelet[2565]: E0424 23:37:00.903012 2565 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:37:00.903390 kubelet[2565]: E0424 23:37:00.903215 2565 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:37:00.903390 kubelet[2565]: W0424 23:37:00.903230 2565 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:37:00.903390 kubelet[2565]: E0424 23:37:00.903237 2565 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:37:00.903459 kubelet[2565]: E0424 23:37:00.903428 2565 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:37:00.903459 kubelet[2565]: W0424 23:37:00.903435 2565 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:37:00.903459 kubelet[2565]: E0424 23:37:00.903442 2565 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:37:00.903651 kubelet[2565]: E0424 23:37:00.903634 2565 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:37:00.903651 kubelet[2565]: W0424 23:37:00.903646 2565 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:37:00.903695 kubelet[2565]: E0424 23:37:00.903653 2565 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:37:00.903868 kubelet[2565]: E0424 23:37:00.903850 2565 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:37:00.903868 kubelet[2565]: W0424 23:37:00.903862 2565 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:37:00.903868 kubelet[2565]: E0424 23:37:00.903868 2565 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:37:00.904124 kubelet[2565]: E0424 23:37:00.904107 2565 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:37:00.904124 kubelet[2565]: W0424 23:37:00.904118 2565 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:37:00.904124 kubelet[2565]: E0424 23:37:00.904124 2565 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:37:00.904477 kubelet[2565]: E0424 23:37:00.904383 2565 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:37:00.904477 kubelet[2565]: W0424 23:37:00.904394 2565 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:37:00.904477 kubelet[2565]: E0424 23:37:00.904400 2565 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:37:00.904662 kubelet[2565]: E0424 23:37:00.904646 2565 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:37:00.904662 kubelet[2565]: W0424 23:37:00.904657 2565 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:37:00.904707 kubelet[2565]: E0424 23:37:00.904665 2565 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:37:00.904923 kubelet[2565]: E0424 23:37:00.904911 2565 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:37:00.904923 kubelet[2565]: W0424 23:37:00.904917 2565 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:37:00.904923 kubelet[2565]: E0424 23:37:00.904923 2565 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:37:00.905237 kubelet[2565]: E0424 23:37:00.905219 2565 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:37:00.905237 kubelet[2565]: W0424 23:37:00.905231 2565 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:37:00.905237 kubelet[2565]: E0424 23:37:00.905238 2565 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:37:00.905574 kubelet[2565]: E0424 23:37:00.905518 2565 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:37:00.905574 kubelet[2565]: W0424 23:37:00.905554 2565 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:37:00.905574 kubelet[2565]: E0424 23:37:00.905560 2565 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:37:00.905890 kubelet[2565]: E0424 23:37:00.905875 2565 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:37:00.905890 kubelet[2565]: W0424 23:37:00.905886 2565 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:37:00.905928 kubelet[2565]: E0424 23:37:00.905893 2565 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:37:01.297813 containerd[1519]: time="2026-04-24T23:37:01.297762943Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 24 23:37:01.298989 containerd[1519]: time="2026-04-24T23:37:01.298888004Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.31.4: active requests=0, bytes read=36107596" Apr 24 23:37:01.300429 containerd[1519]: time="2026-04-24T23:37:01.299975471Z" level=info msg="ImageCreate event name:\"sha256:46766605472b59b9c16342b2cc74da11f598baa9ba6d1e8b07b3f8ab4f29c55b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 24 23:37:01.302225 containerd[1519]: time="2026-04-24T23:37:01.302198745Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:d9396cfcd63dfcf72a65903042e473bb0bafc0cceb56bd71cd84078498a87130\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 24 23:37:01.303311 containerd[1519]: time="2026-04-24T23:37:01.303057562Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.31.4\" with image id \"sha256:46766605472b59b9c16342b2cc74da11f598baa9ba6d1e8b07b3f8ab4f29c55b\", repo tag \"ghcr.io/flatcar/calico/typha:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:d9396cfcd63dfcf72a65903042e473bb0bafc0cceb56bd71cd84078498a87130\", size \"36107450\" in 2.357979745s" Apr 24 23:37:01.303311 containerd[1519]: time="2026-04-24T23:37:01.303087172Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.31.4\" returns image reference \"sha256:46766605472b59b9c16342b2cc74da11f598baa9ba6d1e8b07b3f8ab4f29c55b\"" Apr 24 23:37:01.305224 containerd[1519]: time="2026-04-24T23:37:01.305202928Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\"" Apr 24 23:37:01.314705 containerd[1519]: time="2026-04-24T23:37:01.314670821Z" level=info msg="CreateContainer within sandbox 
\"e10e1c4eb04761713e3f1840838a980dd3cc67df8e1beca43f49414a47fe8152\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Apr 24 23:37:01.327348 containerd[1519]: time="2026-04-24T23:37:01.327308858Z" level=info msg="CreateContainer within sandbox \"e10e1c4eb04761713e3f1840838a980dd3cc67df8e1beca43f49414a47fe8152\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"f8ddc02bd1ba382c6de99dadd16e4a44984bdcbeeb7aac5212b01a08eab51aa9\"" Apr 24 23:37:01.328449 containerd[1519]: time="2026-04-24T23:37:01.327712116Z" level=info msg="StartContainer for \"f8ddc02bd1ba382c6de99dadd16e4a44984bdcbeeb7aac5212b01a08eab51aa9\"" Apr 24 23:37:01.355581 systemd[1]: Started cri-containerd-f8ddc02bd1ba382c6de99dadd16e4a44984bdcbeeb7aac5212b01a08eab51aa9.scope - libcontainer container f8ddc02bd1ba382c6de99dadd16e4a44984bdcbeeb7aac5212b01a08eab51aa9. Apr 24 23:37:01.392043 containerd[1519]: time="2026-04-24T23:37:01.392003616Z" level=info msg="StartContainer for \"f8ddc02bd1ba382c6de99dadd16e4a44984bdcbeeb7aac5212b01a08eab51aa9\" returns successfully" Apr 24 23:37:01.582185 kubelet[2565]: I0424 23:37:01.582062 2565 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="calico-system/calico-typha-6c85bdfbb6-hz5tb" podStartSLOduration=1.223202583 podStartE2EDuration="3.582053206s" podCreationTimestamp="2026-04-24 23:36:58 +0000 UTC" firstStartedPulling="2026-04-24 23:36:58.944856553 +0000 UTC m=+17.556915359" lastFinishedPulling="2026-04-24 23:37:01.303707177 +0000 UTC m=+19.915765982" observedRunningTime="2026-04-24 23:37:01.581790556 +0000 UTC m=+20.193849350" watchObservedRunningTime="2026-04-24 23:37:01.582053206 +0000 UTC m=+20.194112000" Apr 24 23:37:01.609834 kubelet[2565]: E0424 23:37:01.609697 2565 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:37:01.609834 kubelet[2565]: W0424 23:37:01.609721 2565 driver-call.go:149] FlexVolume: driver call failed: 
executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:37:01.609834 kubelet[2565]: E0424 23:37:01.609743 2565 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:37:01.880559 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount847250129.mount: Deactivated successfully. Apr 24 23:37:02.148077 update_engine[1503]: I20260424 23:37:02.147872 1503 update_attempter.cc:509] Updating boot flags... 
Apr 24 23:37:02.232489 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 31 scanned by (udev-worker) (3267) Apr 24 23:37:02.295451 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 31 scanned by (udev-worker) (3266) Apr 24 23:37:02.355506 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 31 scanned by (udev-worker) (3266) Apr 24 23:37:02.476756 kubelet[2565]: E0424 23:37:02.476678 2565 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-kfkvn" podUID="b8636bb0-afe6-4f7c-8c11-499ada35dc8f" Apr 24 23:37:02.623543 kubelet[2565]: E0424 23:37:02.623470 2565 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:37:02.623852 kubelet[2565]: W0424 23:37:02.623504 2565 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:37:02.623852 kubelet[2565]: E0424 23:37:02.623603 2565 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:37:02.624121 kubelet[2565]: E0424 23:37:02.624095 2565 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:37:02.624121 kubelet[2565]: W0424 23:37:02.624111 2565 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:37:02.624121 kubelet[2565]: E0424 23:37:02.624128 2565 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:37:02.624705 kubelet[2565]: E0424 23:37:02.624644 2565 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:37:02.624705 kubelet[2565]: W0424 23:37:02.624673 2565 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:37:02.624705 kubelet[2565]: E0424 23:37:02.624697 2565 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:37:02.625307 kubelet[2565]: E0424 23:37:02.625222 2565 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:37:02.625307 kubelet[2565]: W0424 23:37:02.625271 2565 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:37:02.625307 kubelet[2565]: E0424 23:37:02.625289 2565 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:37:02.625816 kubelet[2565]: E0424 23:37:02.625791 2565 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:37:02.625816 kubelet[2565]: W0424 23:37:02.625810 2565 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:37:02.625949 kubelet[2565]: E0424 23:37:02.625827 2565 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:37:02.626339 kubelet[2565]: E0424 23:37:02.626288 2565 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:37:02.626339 kubelet[2565]: W0424 23:37:02.626312 2565 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:37:02.626339 kubelet[2565]: E0424 23:37:02.626328 2565 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:37:02.627361 kubelet[2565]: E0424 23:37:02.626961 2565 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:37:02.627361 kubelet[2565]: W0424 23:37:02.626983 2565 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:37:02.627361 kubelet[2565]: E0424 23:37:02.627000 2565 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:37:02.627713 kubelet[2565]: E0424 23:37:02.627684 2565 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:37:02.627713 kubelet[2565]: W0424 23:37:02.627708 2565 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:37:02.627883 kubelet[2565]: E0424 23:37:02.627726 2565 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:37:02.628372 kubelet[2565]: E0424 23:37:02.628317 2565 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:37:02.628372 kubelet[2565]: W0424 23:37:02.628347 2565 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:37:02.628372 kubelet[2565]: E0424 23:37:02.628365 2565 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:37:02.628977 kubelet[2565]: E0424 23:37:02.628941 2565 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:37:02.628977 kubelet[2565]: W0424 23:37:02.628969 2565 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:37:02.629139 kubelet[2565]: E0424 23:37:02.628985 2565 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:37:02.629532 kubelet[2565]: E0424 23:37:02.629492 2565 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:37:02.629532 kubelet[2565]: W0424 23:37:02.629518 2565 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:37:02.629673 kubelet[2565]: E0424 23:37:02.629538 2565 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:37:02.630026 kubelet[2565]: E0424 23:37:02.629983 2565 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:37:02.630026 kubelet[2565]: W0424 23:37:02.630008 2565 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:37:02.630026 kubelet[2565]: E0424 23:37:02.630026 2565 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:37:02.630578 kubelet[2565]: E0424 23:37:02.630548 2565 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:37:02.630578 kubelet[2565]: W0424 23:37:02.630573 2565 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:37:02.630708 kubelet[2565]: E0424 23:37:02.630592 2565 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:37:02.631046 kubelet[2565]: E0424 23:37:02.631017 2565 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:37:02.631046 kubelet[2565]: W0424 23:37:02.631041 2565 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:37:02.631167 kubelet[2565]: E0424 23:37:02.631058 2565 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:37:02.631574 kubelet[2565]: E0424 23:37:02.631551 2565 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:37:02.631735 kubelet[2565]: W0424 23:37:02.631689 2565 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:37:02.631735 kubelet[2565]: E0424 23:37:02.631732 2565 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:37:02.632645 kubelet[2565]: E0424 23:37:02.632602 2565 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:37:02.632645 kubelet[2565]: W0424 23:37:02.632634 2565 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:37:02.632773 kubelet[2565]: E0424 23:37:02.632658 2565 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:37:02.633558 kubelet[2565]: E0424 23:37:02.633216 2565 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:37:02.633558 kubelet[2565]: W0424 23:37:02.633239 2565 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:37:02.633558 kubelet[2565]: E0424 23:37:02.633258 2565 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:37:02.633930 kubelet[2565]: E0424 23:37:02.633891 2565 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:37:02.633930 kubelet[2565]: W0424 23:37:02.633916 2565 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:37:02.633930 kubelet[2565]: E0424 23:37:02.633933 2565 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:37:02.634815 kubelet[2565]: E0424 23:37:02.634762 2565 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:37:02.634815 kubelet[2565]: W0424 23:37:02.634800 2565 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:37:02.634978 kubelet[2565]: E0424 23:37:02.634824 2565 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:37:02.635659 kubelet[2565]: E0424 23:37:02.635385 2565 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:37:02.635659 kubelet[2565]: W0424 23:37:02.635410 2565 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:37:02.635659 kubelet[2565]: E0424 23:37:02.635489 2565 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:37:02.636060 kubelet[2565]: E0424 23:37:02.636030 2565 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:37:02.636184 kubelet[2565]: W0424 23:37:02.636137 2565 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:37:02.636184 kubelet[2565]: E0424 23:37:02.636167 2565 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:37:02.637011 kubelet[2565]: E0424 23:37:02.636739 2565 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:37:02.637011 kubelet[2565]: W0424 23:37:02.636765 2565 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:37:02.637011 kubelet[2565]: E0424 23:37:02.636786 2565 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:37:02.638039 kubelet[2565]: E0424 23:37:02.637780 2565 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:37:02.638039 kubelet[2565]: W0424 23:37:02.637806 2565 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:37:02.638039 kubelet[2565]: E0424 23:37:02.637829 2565 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:37:02.638944 kubelet[2565]: E0424 23:37:02.638697 2565 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:37:02.638944 kubelet[2565]: W0424 23:37:02.638723 2565 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:37:02.638944 kubelet[2565]: E0424 23:37:02.638745 2565 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:37:02.639963 kubelet[2565]: E0424 23:37:02.639688 2565 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:37:02.639963 kubelet[2565]: W0424 23:37:02.639716 2565 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:37:02.639963 kubelet[2565]: E0424 23:37:02.639742 2565 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:37:02.642795 kubelet[2565]: E0424 23:37:02.642768 2565 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:37:02.642954 kubelet[2565]: W0424 23:37:02.642927 2565 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:37:02.643319 kubelet[2565]: E0424 23:37:02.643082 2565 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:37:02.643899 kubelet[2565]: E0424 23:37:02.643873 2565 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:37:02.644338 kubelet[2565]: W0424 23:37:02.644044 2565 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:37:02.644338 kubelet[2565]: E0424 23:37:02.644079 2565 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:37:02.645665 kubelet[2565]: E0424 23:37:02.645618 2565 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:37:02.646346 kubelet[2565]: W0424 23:37:02.645787 2565 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:37:02.646346 kubelet[2565]: E0424 23:37:02.645816 2565 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:37:02.647330 kubelet[2565]: E0424 23:37:02.647302 2565 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:37:02.647722 kubelet[2565]: W0424 23:37:02.647597 2565 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:37:02.647722 kubelet[2565]: E0424 23:37:02.647622 2565 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:37:02.649127 kubelet[2565]: E0424 23:37:02.649102 2565 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:37:02.650478 kubelet[2565]: W0424 23:37:02.649317 2565 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:37:02.650478 kubelet[2565]: E0424 23:37:02.649361 2565 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:37:02.651261 kubelet[2565]: E0424 23:37:02.651223 2565 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:37:02.651261 kubelet[2565]: W0424 23:37:02.651254 2565 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:37:02.651387 kubelet[2565]: E0424 23:37:02.651315 2565 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:37:02.653178 kubelet[2565]: E0424 23:37:02.653130 2565 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:37:02.653253 kubelet[2565]: W0424 23:37:02.653160 2565 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:37:02.653253 kubelet[2565]: E0424 23:37:02.653220 2565 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:37:02.654814 kubelet[2565]: E0424 23:37:02.654511 2565 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:37:02.655189 kubelet[2565]: W0424 23:37:02.655040 2565 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:37:02.655189 kubelet[2565]: E0424 23:37:02.655077 2565 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:37:02.994491 containerd[1519]: time="2026-04-24T23:37:02.994442194Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 24 23:37:02.995790 containerd[1519]: time="2026-04-24T23:37:02.995747563Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4: active requests=0, bytes read=4630250" Apr 24 23:37:02.997027 containerd[1519]: time="2026-04-24T23:37:02.996994699Z" level=info msg="ImageCreate event name:\"sha256:a6ea0cf732d820506ae9f1d7e7433a14009026b894fbbb8f346b9a5f5335c47e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 24 23:37:02.999085 containerd[1519]: time="2026-04-24T23:37:02.999035373Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:5fa3492ac4dfef9cc34fe70a51289118e1f715a89133ea730eef81ad789dadbc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 24 23:37:02.999576 containerd[1519]: time="2026-04-24T23:37:02.999539286Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\" with image id \"sha256:a6ea0cf732d820506ae9f1d7e7433a14009026b894fbbb8f346b9a5f5335c47e\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:5fa3492ac4dfef9cc34fe70a51289118e1f715a89133ea730eef81ad789dadbc\", size \"6186255\" in 1.694257895s" Apr 24 23:37:02.999604 containerd[1519]: time="2026-04-24T23:37:02.999577578Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\" returns image reference \"sha256:a6ea0cf732d820506ae9f1d7e7433a14009026b894fbbb8f346b9a5f5335c47e\"" Apr 24 23:37:03.004089 containerd[1519]: time="2026-04-24T23:37:03.003850154Z" level=info msg="CreateContainer within sandbox \"6851cbeda857f9a0cb3a1669fb5671016e8e02f41f66d2981a1dd2b1b7914241\" for container 
&ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Apr 24 23:37:03.020319 containerd[1519]: time="2026-04-24T23:37:03.020275238Z" level=info msg="CreateContainer within sandbox \"6851cbeda857f9a0cb3a1669fb5671016e8e02f41f66d2981a1dd2b1b7914241\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"73086c7745e0681299468086b2b1f1b12185718d0141e1c37b0a6bb15f35cfc2\"" Apr 24 23:37:03.021484 containerd[1519]: time="2026-04-24T23:37:03.020742416Z" level=info msg="StartContainer for \"73086c7745e0681299468086b2b1f1b12185718d0141e1c37b0a6bb15f35cfc2\"" Apr 24 23:37:03.051551 systemd[1]: Started cri-containerd-73086c7745e0681299468086b2b1f1b12185718d0141e1c37b0a6bb15f35cfc2.scope - libcontainer container 73086c7745e0681299468086b2b1f1b12185718d0141e1c37b0a6bb15f35cfc2. Apr 24 23:37:03.078521 containerd[1519]: time="2026-04-24T23:37:03.078404718Z" level=info msg="StartContainer for \"73086c7745e0681299468086b2b1f1b12185718d0141e1c37b0a6bb15f35cfc2\" returns successfully" Apr 24 23:37:03.088419 systemd[1]: cri-containerd-73086c7745e0681299468086b2b1f1b12185718d0141e1c37b0a6bb15f35cfc2.scope: Deactivated successfully. Apr 24 23:37:03.110771 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-73086c7745e0681299468086b2b1f1b12185718d0141e1c37b0a6bb15f35cfc2-rootfs.mount: Deactivated successfully. 
Apr 24 23:37:03.188767 containerd[1519]: time="2026-04-24T23:37:03.188687312Z" level=info msg="shim disconnected" id=73086c7745e0681299468086b2b1f1b12185718d0141e1c37b0a6bb15f35cfc2 namespace=k8s.io Apr 24 23:37:03.188767 containerd[1519]: time="2026-04-24T23:37:03.188754388Z" level=warning msg="cleaning up after shim disconnected" id=73086c7745e0681299468086b2b1f1b12185718d0141e1c37b0a6bb15f35cfc2 namespace=k8s.io Apr 24 23:37:03.188767 containerd[1519]: time="2026-04-24T23:37:03.188761504Z" level=info msg="cleaning up dead shim" namespace=k8s.io Apr 24 23:37:03.583881 containerd[1519]: time="2026-04-24T23:37:03.583803736Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.31.4\"" Apr 24 23:37:04.476441 kubelet[2565]: E0424 23:37:04.476319 2565 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-kfkvn" podUID="b8636bb0-afe6-4f7c-8c11-499ada35dc8f" Apr 24 23:37:06.475838 kubelet[2565]: E0424 23:37:06.475798 2565 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-kfkvn" podUID="b8636bb0-afe6-4f7c-8c11-499ada35dc8f" Apr 24 23:37:07.848329 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount153874848.mount: Deactivated successfully. 
Apr 24 23:37:07.876775 containerd[1519]: time="2026-04-24T23:37:07.876716257Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 24 23:37:07.877534 containerd[1519]: time="2026-04-24T23:37:07.877419739Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.31.4: active requests=0, bytes read=159838564" Apr 24 23:37:07.878925 containerd[1519]: time="2026-04-24T23:37:07.878045307Z" level=info msg="ImageCreate event name:\"sha256:e6536b93706eda782f82ebadcac3559cb61801d09f982cc0533a134e6a8e1acf\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 24 23:37:07.880010 containerd[1519]: time="2026-04-24T23:37:07.879394134Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:22b9d32dc7480c96272121d5682d53424c6e58653c60fa869b61a1758a11d77f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 24 23:37:07.880010 containerd[1519]: time="2026-04-24T23:37:07.879909857Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.31.4\" with image id \"sha256:e6536b93706eda782f82ebadcac3559cb61801d09f982cc0533a134e6a8e1acf\", repo tag \"ghcr.io/flatcar/calico/node:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/node@sha256:22b9d32dc7480c96272121d5682d53424c6e58653c60fa869b61a1758a11d77f\", size \"159838426\" in 4.296054536s" Apr 24 23:37:07.880010 containerd[1519]: time="2026-04-24T23:37:07.879939193Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.31.4\" returns image reference \"sha256:e6536b93706eda782f82ebadcac3559cb61801d09f982cc0533a134e6a8e1acf\"" Apr 24 23:37:07.883448 containerd[1519]: time="2026-04-24T23:37:07.883401032Z" level=info msg="CreateContainer within sandbox \"6851cbeda857f9a0cb3a1669fb5671016e8e02f41f66d2981a1dd2b1b7914241\" for container &ContainerMetadata{Name:ebpf-bootstrap,Attempt:0,}" Apr 24 23:37:07.900353 containerd[1519]: time="2026-04-24T23:37:07.900311092Z" level=info 
msg="CreateContainer within sandbox \"6851cbeda857f9a0cb3a1669fb5671016e8e02f41f66d2981a1dd2b1b7914241\" for &ContainerMetadata{Name:ebpf-bootstrap,Attempt:0,} returns container id \"a507f89087b94ba393d4cd4e6bcf0d41b48b580c2e7634901707813c7f88e8f4\"" Apr 24 23:37:07.900981 containerd[1519]: time="2026-04-24T23:37:07.900964087Z" level=info msg="StartContainer for \"a507f89087b94ba393d4cd4e6bcf0d41b48b580c2e7634901707813c7f88e8f4\"" Apr 24 23:37:07.927766 systemd[1]: run-containerd-runc-k8s.io-a507f89087b94ba393d4cd4e6bcf0d41b48b580c2e7634901707813c7f88e8f4-runc.loxF8f.mount: Deactivated successfully. Apr 24 23:37:07.936581 systemd[1]: Started cri-containerd-a507f89087b94ba393d4cd4e6bcf0d41b48b580c2e7634901707813c7f88e8f4.scope - libcontainer container a507f89087b94ba393d4cd4e6bcf0d41b48b580c2e7634901707813c7f88e8f4. Apr 24 23:37:07.964872 containerd[1519]: time="2026-04-24T23:37:07.964824047Z" level=info msg="StartContainer for \"a507f89087b94ba393d4cd4e6bcf0d41b48b580c2e7634901707813c7f88e8f4\" returns successfully" Apr 24 23:37:08.002689 systemd[1]: cri-containerd-a507f89087b94ba393d4cd4e6bcf0d41b48b580c2e7634901707813c7f88e8f4.scope: Deactivated successfully. 
Apr 24 23:37:08.065762 containerd[1519]: time="2026-04-24T23:37:08.065698936Z" level=info msg="shim disconnected" id=a507f89087b94ba393d4cd4e6bcf0d41b48b580c2e7634901707813c7f88e8f4 namespace=k8s.io Apr 24 23:37:08.065762 containerd[1519]: time="2026-04-24T23:37:08.065752728Z" level=warning msg="cleaning up after shim disconnected" id=a507f89087b94ba393d4cd4e6bcf0d41b48b580c2e7634901707813c7f88e8f4 namespace=k8s.io Apr 24 23:37:08.065762 containerd[1519]: time="2026-04-24T23:37:08.065759994Z" level=info msg="cleaning up dead shim" namespace=k8s.io Apr 24 23:37:08.476199 kubelet[2565]: E0424 23:37:08.476110 2565 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-kfkvn" podUID="b8636bb0-afe6-4f7c-8c11-499ada35dc8f" Apr 24 23:37:08.598479 containerd[1519]: time="2026-04-24T23:37:08.597665118Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.31.4\"" Apr 24 23:37:08.851642 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-a507f89087b94ba393d4cd4e6bcf0d41b48b580c2e7634901707813c7f88e8f4-rootfs.mount: Deactivated successfully. 
Apr 24 23:37:10.476129 kubelet[2565]: E0424 23:37:10.476093 2565 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-kfkvn" podUID="b8636bb0-afe6-4f7c-8c11-499ada35dc8f" Apr 24 23:37:11.167855 containerd[1519]: time="2026-04-24T23:37:11.167805640Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 24 23:37:11.168822 containerd[1519]: time="2026-04-24T23:37:11.168777983Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.31.4: active requests=0, bytes read=70611671" Apr 24 23:37:11.169638 containerd[1519]: time="2026-04-24T23:37:11.169604442Z" level=info msg="ImageCreate event name:\"sha256:c433a27dd94ce9242338eece49f11629412dd42552fed314746fcf16ea958b2b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 24 23:37:11.171455 containerd[1519]: time="2026-04-24T23:37:11.171418982Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:f1c5d9a6df01061c5faec4c4b59fb9ba69f8f5164b51e01ea8daa8e373111a04\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 24 23:37:11.172133 containerd[1519]: time="2026-04-24T23:37:11.171937185Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.31.4\" with image id \"sha256:c433a27dd94ce9242338eece49f11629412dd42552fed314746fcf16ea958b2b\", repo tag \"ghcr.io/flatcar/calico/cni:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:f1c5d9a6df01061c5faec4c4b59fb9ba69f8f5164b51e01ea8daa8e373111a04\", size \"72167716\" in 2.574221171s" Apr 24 23:37:11.172133 containerd[1519]: time="2026-04-24T23:37:11.171974102Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.31.4\" returns image reference 
\"sha256:c433a27dd94ce9242338eece49f11629412dd42552fed314746fcf16ea958b2b\"" Apr 24 23:37:11.177572 containerd[1519]: time="2026-04-24T23:37:11.177540244Z" level=info msg="CreateContainer within sandbox \"6851cbeda857f9a0cb3a1669fb5671016e8e02f41f66d2981a1dd2b1b7914241\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Apr 24 23:37:11.192213 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount848599785.mount: Deactivated successfully. Apr 24 23:37:11.192756 containerd[1519]: time="2026-04-24T23:37:11.192695032Z" level=info msg="CreateContainer within sandbox \"6851cbeda857f9a0cb3a1669fb5671016e8e02f41f66d2981a1dd2b1b7914241\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"122354f85186b4272846b30f2577d74eff5747b7932011aca04c0900adb51444\"" Apr 24 23:37:11.194578 containerd[1519]: time="2026-04-24T23:37:11.193573913Z" level=info msg="StartContainer for \"122354f85186b4272846b30f2577d74eff5747b7932011aca04c0900adb51444\"" Apr 24 23:37:11.223033 systemd[1]: run-containerd-runc-k8s.io-122354f85186b4272846b30f2577d74eff5747b7932011aca04c0900adb51444-runc.nxZ2KN.mount: Deactivated successfully. Apr 24 23:37:11.227581 systemd[1]: Started cri-containerd-122354f85186b4272846b30f2577d74eff5747b7932011aca04c0900adb51444.scope - libcontainer container 122354f85186b4272846b30f2577d74eff5747b7932011aca04c0900adb51444. 
Apr 24 23:37:11.261680 containerd[1519]: time="2026-04-24T23:37:11.261566503Z" level=info msg="StartContainer for \"122354f85186b4272846b30f2577d74eff5747b7932011aca04c0900adb51444\" returns successfully" Apr 24 23:37:11.753386 containerd[1519]: time="2026-04-24T23:37:11.753349398Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Apr 24 23:37:11.755988 systemd[1]: cri-containerd-122354f85186b4272846b30f2577d74eff5747b7932011aca04c0900adb51444.scope: Deactivated successfully. Apr 24 23:37:11.776130 containerd[1519]: time="2026-04-24T23:37:11.776062429Z" level=info msg="shim disconnected" id=122354f85186b4272846b30f2577d74eff5747b7932011aca04c0900adb51444 namespace=k8s.io Apr 24 23:37:11.776130 containerd[1519]: time="2026-04-24T23:37:11.776118664Z" level=warning msg="cleaning up after shim disconnected" id=122354f85186b4272846b30f2577d74eff5747b7932011aca04c0900adb51444 namespace=k8s.io Apr 24 23:37:11.776130 containerd[1519]: time="2026-04-24T23:37:11.776126100Z" level=info msg="cleaning up dead shim" namespace=k8s.io Apr 24 23:37:11.837267 kubelet[2565]: I0424 23:37:11.837211 2565 kubelet_node_status.go:427] "Fast updating node status as it just became ready" Apr 24 23:37:11.879991 systemd[1]: Created slice kubepods-burstable-pod9e14ced5_8d3f_4116_8156_faf06229fa0d.slice - libcontainer container kubepods-burstable-pod9e14ced5_8d3f_4116_8156_faf06229fa0d.slice. Apr 24 23:37:11.887713 systemd[1]: Created slice kubepods-besteffort-pod7bc41b28_ad74_4dc3_b94f_b24620d3a9ff.slice - libcontainer container kubepods-besteffort-pod7bc41b28_ad74_4dc3_b94f_b24620d3a9ff.slice. 
Apr 24 23:37:11.896803 systemd[1]: Created slice kubepods-burstable-poddf938cae_ab41_409b_8afe_e6b3147b7b45.slice - libcontainer container kubepods-burstable-poddf938cae_ab41_409b_8afe_e6b3147b7b45.slice. Apr 24 23:37:11.900544 kubelet[2565]: I0424 23:37:11.900284 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4s4q\" (UniqueName: \"kubernetes.io/projected/3e8ae2af-1fd8-48e0-bd54-3c4f242bc500-kube-api-access-t4s4q\") pod \"whisker-747cf497b9-tkkg6\" (UID: \"3e8ae2af-1fd8-48e0-bd54-3c4f242bc500\") " pod="calico-system/whisker-747cf497b9-tkkg6" Apr 24 23:37:11.900709 kubelet[2565]: I0424 23:37:11.900480 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/df938cae-ab41-409b-8afe-e6b3147b7b45-config-volume\") pod \"coredns-7d764666f9-rvpdv\" (UID: \"df938cae-ab41-409b-8afe-e6b3147b7b45\") " pod="kube-system/coredns-7d764666f9-rvpdv" Apr 24 23:37:11.900709 kubelet[2565]: I0424 23:37:11.900671 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/133141ef-2c84-4659-9f62-e85180cc6f03-goldmane-key-pair\") pod \"goldmane-9f7667bb8-j6dz5\" (UID: \"133141ef-2c84-4659-9f62-e85180cc6f03\") " pod="calico-system/goldmane-9f7667bb8-j6dz5" Apr 24 23:37:11.900946 kubelet[2565]: I0424 23:37:11.900873 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9e14ced5-8d3f-4116-8156-faf06229fa0d-config-volume\") pod \"coredns-7d764666f9-dr55n\" (UID: \"9e14ced5-8d3f-4116-8156-faf06229fa0d\") " pod="kube-system/coredns-7d764666f9-dr55n" Apr 24 23:37:11.900946 kubelet[2565]: I0424 23:37:11.900928 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/133141ef-2c84-4659-9f62-e85180cc6f03-config\") pod \"goldmane-9f7667bb8-j6dz5\" (UID: \"133141ef-2c84-4659-9f62-e85180cc6f03\") " pod="calico-system/goldmane-9f7667bb8-j6dz5" Apr 24 23:37:11.901079 kubelet[2565]: I0424 23:37:11.901041 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/3e8ae2af-1fd8-48e0-bd54-3c4f242bc500-whisker-backend-key-pair\") pod \"whisker-747cf497b9-tkkg6\" (UID: \"3e8ae2af-1fd8-48e0-bd54-3c4f242bc500\") " pod="calico-system/whisker-747cf497b9-tkkg6" Apr 24 23:37:11.901195 kubelet[2565]: I0424 23:37:11.901118 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3e8ae2af-1fd8-48e0-bd54-3c4f242bc500-whisker-ca-bundle\") pod \"whisker-747cf497b9-tkkg6\" (UID: \"3e8ae2af-1fd8-48e0-bd54-3c4f242bc500\") " pod="calico-system/whisker-747cf497b9-tkkg6" Apr 24 23:37:11.901195 kubelet[2565]: I0424 23:37:11.901160 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/31b59c84-3007-40dc-82d8-2444bb185840-calico-apiserver-certs\") pod \"calico-apiserver-b8b79f554-9kfjb\" (UID: \"31b59c84-3007-40dc-82d8-2444bb185840\") " pod="calico-system/calico-apiserver-b8b79f554-9kfjb" Apr 24 23:37:11.901259 kubelet[2565]: I0424 23:37:11.901175 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4cqt\" (UniqueName: \"kubernetes.io/projected/31b59c84-3007-40dc-82d8-2444bb185840-kube-api-access-s4cqt\") pod \"calico-apiserver-b8b79f554-9kfjb\" (UID: \"31b59c84-3007-40dc-82d8-2444bb185840\") " pod="calico-system/calico-apiserver-b8b79f554-9kfjb" Apr 24 23:37:11.901329 kubelet[2565]: I0424 23:37:11.901310 2565 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/133141ef-2c84-4659-9f62-e85180cc6f03-goldmane-ca-bundle\") pod \"goldmane-9f7667bb8-j6dz5\" (UID: \"133141ef-2c84-4659-9f62-e85180cc6f03\") " pod="calico-system/goldmane-9f7667bb8-j6dz5" Apr 24 23:37:11.901439 kubelet[2565]: I0424 23:37:11.901372 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/3e8ae2af-1fd8-48e0-bd54-3c4f242bc500-nginx-config\") pod \"whisker-747cf497b9-tkkg6\" (UID: \"3e8ae2af-1fd8-48e0-bd54-3c4f242bc500\") " pod="calico-system/whisker-747cf497b9-tkkg6" Apr 24 23:37:11.901439 kubelet[2565]: I0424 23:37:11.901385 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7a6dcb56-0ab0-4a8e-b283-aa8220fb3da6-tigera-ca-bundle\") pod \"calico-kube-controllers-5c6df7864b-wqqbk\" (UID: \"7a6dcb56-0ab0-4a8e-b283-aa8220fb3da6\") " pod="calico-system/calico-kube-controllers-5c6df7864b-wqqbk" Apr 24 23:37:11.901586 kubelet[2565]: I0424 23:37:11.901404 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/7bc41b28-ad74-4dc3-b94f-b24620d3a9ff-calico-apiserver-certs\") pod \"calico-apiserver-b8b79f554-vhwl6\" (UID: \"7bc41b28-ad74-4dc3-b94f-b24620d3a9ff\") " pod="calico-system/calico-apiserver-b8b79f554-vhwl6" Apr 24 23:37:11.901586 kubelet[2565]: I0424 23:37:11.901546 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pcnkr\" (UniqueName: \"kubernetes.io/projected/7bc41b28-ad74-4dc3-b94f-b24620d3a9ff-kube-api-access-pcnkr\") pod \"calico-apiserver-b8b79f554-vhwl6\" (UID: \"7bc41b28-ad74-4dc3-b94f-b24620d3a9ff\") " 
pod="calico-system/calico-apiserver-b8b79f554-vhwl6" Apr 24 23:37:11.901799 kubelet[2565]: I0424 23:37:11.901762 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6wxb\" (UniqueName: \"kubernetes.io/projected/9e14ced5-8d3f-4116-8156-faf06229fa0d-kube-api-access-t6wxb\") pod \"coredns-7d764666f9-dr55n\" (UID: \"9e14ced5-8d3f-4116-8156-faf06229fa0d\") " pod="kube-system/coredns-7d764666f9-dr55n" Apr 24 23:37:11.901993 kubelet[2565]: I0424 23:37:11.901942 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l57zt\" (UniqueName: \"kubernetes.io/projected/df938cae-ab41-409b-8afe-e6b3147b7b45-kube-api-access-l57zt\") pod \"coredns-7d764666f9-rvpdv\" (UID: \"df938cae-ab41-409b-8afe-e6b3147b7b45\") " pod="kube-system/coredns-7d764666f9-rvpdv" Apr 24 23:37:11.902125 kubelet[2565]: I0424 23:37:11.902114 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6wq4\" (UniqueName: \"kubernetes.io/projected/7a6dcb56-0ab0-4a8e-b283-aa8220fb3da6-kube-api-access-d6wq4\") pod \"calico-kube-controllers-5c6df7864b-wqqbk\" (UID: \"7a6dcb56-0ab0-4a8e-b283-aa8220fb3da6\") " pod="calico-system/calico-kube-controllers-5c6df7864b-wqqbk" Apr 24 23:37:11.902331 kubelet[2565]: I0424 23:37:11.902294 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhlg2\" (UniqueName: \"kubernetes.io/projected/133141ef-2c84-4659-9f62-e85180cc6f03-kube-api-access-jhlg2\") pod \"goldmane-9f7667bb8-j6dz5\" (UID: \"133141ef-2c84-4659-9f62-e85180cc6f03\") " pod="calico-system/goldmane-9f7667bb8-j6dz5" Apr 24 23:37:11.910165 systemd[1]: Created slice kubepods-besteffort-pod7a6dcb56_0ab0_4a8e_b283_aa8220fb3da6.slice - libcontainer container kubepods-besteffort-pod7a6dcb56_0ab0_4a8e_b283_aa8220fb3da6.slice. 
Apr 24 23:37:11.917884 systemd[1]: Created slice kubepods-besteffort-pod31b59c84_3007_40dc_82d8_2444bb185840.slice - libcontainer container kubepods-besteffort-pod31b59c84_3007_40dc_82d8_2444bb185840.slice. Apr 24 23:37:11.926522 systemd[1]: Created slice kubepods-besteffort-pod133141ef_2c84_4659_9f62_e85180cc6f03.slice - libcontainer container kubepods-besteffort-pod133141ef_2c84_4659_9f62_e85180cc6f03.slice. Apr 24 23:37:11.934368 systemd[1]: Created slice kubepods-besteffort-pod3e8ae2af_1fd8_48e0_bd54_3c4f242bc500.slice - libcontainer container kubepods-besteffort-pod3e8ae2af_1fd8_48e0_bd54_3c4f242bc500.slice. Apr 24 23:37:12.191107 containerd[1519]: time="2026-04-24T23:37:12.189736996Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7d764666f9-dr55n,Uid:9e14ced5-8d3f-4116-8156-faf06229fa0d,Namespace:kube-system,Attempt:0,}" Apr 24 23:37:12.200562 containerd[1519]: time="2026-04-24T23:37:12.197792932Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-b8b79f554-vhwl6,Uid:7bc41b28-ad74-4dc3-b94f-b24620d3a9ff,Namespace:calico-system,Attempt:0,}" Apr 24 23:37:12.211874 containerd[1519]: time="2026-04-24T23:37:12.211826663Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7d764666f9-rvpdv,Uid:df938cae-ab41-409b-8afe-e6b3147b7b45,Namespace:kube-system,Attempt:0,}" Apr 24 23:37:12.214123 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-122354f85186b4272846b30f2577d74eff5747b7932011aca04c0900adb51444-rootfs.mount: Deactivated successfully. 
Apr 24 23:37:12.218638 containerd[1519]: time="2026-04-24T23:37:12.217469214Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5c6df7864b-wqqbk,Uid:7a6dcb56-0ab0-4a8e-b283-aa8220fb3da6,Namespace:calico-system,Attempt:0,}" Apr 24 23:37:12.225153 containerd[1519]: time="2026-04-24T23:37:12.225096725Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-b8b79f554-9kfjb,Uid:31b59c84-3007-40dc-82d8-2444bb185840,Namespace:calico-system,Attempt:0,}" Apr 24 23:37:12.233339 containerd[1519]: time="2026-04-24T23:37:12.233305393Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-9f7667bb8-j6dz5,Uid:133141ef-2c84-4659-9f62-e85180cc6f03,Namespace:calico-system,Attempt:0,}" Apr 24 23:37:12.243770 containerd[1519]: time="2026-04-24T23:37:12.243593079Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-747cf497b9-tkkg6,Uid:3e8ae2af-1fd8-48e0-bd54-3c4f242bc500,Namespace:calico-system,Attempt:0,}" Apr 24 23:37:12.383103 containerd[1519]: time="2026-04-24T23:37:12.383052133Z" level=error msg="Failed to destroy network for sandbox \"5f8e54c4b6682d32363b4f32066817a94de3ced48e2849c1757984fba84e1469\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 24 23:37:12.383567 containerd[1519]: time="2026-04-24T23:37:12.383547598Z" level=error msg="encountered an error cleaning up failed sandbox \"5f8e54c4b6682d32363b4f32066817a94de3ced48e2849c1757984fba84e1469\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 24 23:37:12.383695 containerd[1519]: time="2026-04-24T23:37:12.383679338Z" level=error msg="RunPodSandbox for 
&PodSandboxMetadata{Name:coredns-7d764666f9-dr55n,Uid:9e14ced5-8d3f-4116-8156-faf06229fa0d,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"5f8e54c4b6682d32363b4f32066817a94de3ced48e2849c1757984fba84e1469\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 24 23:37:12.384001 kubelet[2565]: E0424 23:37:12.383941 2565 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5f8e54c4b6682d32363b4f32066817a94de3ced48e2849c1757984fba84e1469\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 24 23:37:12.384049 kubelet[2565]: E0424 23:37:12.384022 2565 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5f8e54c4b6682d32363b4f32066817a94de3ced48e2849c1757984fba84e1469\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7d764666f9-dr55n" Apr 24 23:37:12.384049 kubelet[2565]: E0424 23:37:12.384038 2565 kuberuntime_manager.go:1558] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5f8e54c4b6682d32363b4f32066817a94de3ced48e2849c1757984fba84e1469\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7d764666f9-dr55n" Apr 24 23:37:12.386801 kubelet[2565]: E0424 23:37:12.386763 2565 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to 
\"CreatePodSandbox\" for \"coredns-7d764666f9-dr55n_kube-system(9e14ced5-8d3f-4116-8156-faf06229fa0d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7d764666f9-dr55n_kube-system(9e14ced5-8d3f-4116-8156-faf06229fa0d)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"5f8e54c4b6682d32363b4f32066817a94de3ced48e2849c1757984fba84e1469\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7d764666f9-dr55n" podUID="9e14ced5-8d3f-4116-8156-faf06229fa0d" Apr 24 23:37:12.402546 containerd[1519]: time="2026-04-24T23:37:12.402438726Z" level=error msg="Failed to destroy network for sandbox \"436f63d90010a4b07bfa47b4249043414bf5ab13c65f0dbb63f7a35e4c91a1a7\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 24 23:37:12.402827 containerd[1519]: time="2026-04-24T23:37:12.402808123Z" level=error msg="encountered an error cleaning up failed sandbox \"436f63d90010a4b07bfa47b4249043414bf5ab13c65f0dbb63f7a35e4c91a1a7\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 24 23:37:12.402895 containerd[1519]: time="2026-04-24T23:37:12.402881177Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-b8b79f554-vhwl6,Uid:7bc41b28-ad74-4dc3-b94f-b24620d3a9ff,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"436f63d90010a4b07bfa47b4249043414bf5ab13c65f0dbb63f7a35e4c91a1a7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node 
container is running and has mounted /var/lib/calico/" Apr 24 23:37:12.403226 kubelet[2565]: E0424 23:37:12.403120 2565 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"436f63d90010a4b07bfa47b4249043414bf5ab13c65f0dbb63f7a35e4c91a1a7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 24 23:37:12.403226 kubelet[2565]: E0424 23:37:12.403179 2565 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"436f63d90010a4b07bfa47b4249043414bf5ab13c65f0dbb63f7a35e4c91a1a7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-b8b79f554-vhwl6" Apr 24 23:37:12.403226 kubelet[2565]: E0424 23:37:12.403193 2565 kuberuntime_manager.go:1558] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"436f63d90010a4b07bfa47b4249043414bf5ab13c65f0dbb63f7a35e4c91a1a7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-b8b79f554-vhwl6" Apr 24 23:37:12.403322 kubelet[2565]: E0424 23:37:12.403242 2565 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-b8b79f554-vhwl6_calico-system(7bc41b28-ad74-4dc3-b94f-b24620d3a9ff)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-b8b79f554-vhwl6_calico-system(7bc41b28-ad74-4dc3-b94f-b24620d3a9ff)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox 
\\\"436f63d90010a4b07bfa47b4249043414bf5ab13c65f0dbb63f7a35e4c91a1a7\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-apiserver-b8b79f554-vhwl6" podUID="7bc41b28-ad74-4dc3-b94f-b24620d3a9ff" Apr 24 23:37:12.407489 containerd[1519]: time="2026-04-24T23:37:12.407389293Z" level=error msg="Failed to destroy network for sandbox \"fa684ee016e7b8f4346eadb33b70fe00c046fd186c92942b1887a6a0d224c38c\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 24 23:37:12.407776 containerd[1519]: time="2026-04-24T23:37:12.407758782Z" level=error msg="encountered an error cleaning up failed sandbox \"fa684ee016e7b8f4346eadb33b70fe00c046fd186c92942b1887a6a0d224c38c\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 24 23:37:12.408903 containerd[1519]: time="2026-04-24T23:37:12.407834952Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7d764666f9-rvpdv,Uid:df938cae-ab41-409b-8afe-e6b3147b7b45,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"fa684ee016e7b8f4346eadb33b70fe00c046fd186c92942b1887a6a0d224c38c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 24 23:37:12.408950 kubelet[2565]: E0424 23:37:12.408005 2565 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fa684ee016e7b8f4346eadb33b70fe00c046fd186c92942b1887a6a0d224c38c\": 
plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 24 23:37:12.408950 kubelet[2565]: E0424 23:37:12.408055 2565 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fa684ee016e7b8f4346eadb33b70fe00c046fd186c92942b1887a6a0d224c38c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7d764666f9-rvpdv" Apr 24 23:37:12.408950 kubelet[2565]: E0424 23:37:12.408069 2565 kuberuntime_manager.go:1558] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fa684ee016e7b8f4346eadb33b70fe00c046fd186c92942b1887a6a0d224c38c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7d764666f9-rvpdv" Apr 24 23:37:12.409123 kubelet[2565]: E0424 23:37:12.408108 2565 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7d764666f9-rvpdv_kube-system(df938cae-ab41-409b-8afe-e6b3147b7b45)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7d764666f9-rvpdv_kube-system(df938cae-ab41-409b-8afe-e6b3147b7b45)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"fa684ee016e7b8f4346eadb33b70fe00c046fd186c92942b1887a6a0d224c38c\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7d764666f9-rvpdv" podUID="df938cae-ab41-409b-8afe-e6b3147b7b45" Apr 24 23:37:12.420858 containerd[1519]: 
time="2026-04-24T23:37:12.420800032Z" level=error msg="Failed to destroy network for sandbox \"580c99d506b6ae3bc232bba4e40ac5acbae980fdeae6181c18e333c2ea3ff51e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 24 23:37:12.421784 containerd[1519]: time="2026-04-24T23:37:12.421336821Z" level=error msg="encountered an error cleaning up failed sandbox \"580c99d506b6ae3bc232bba4e40ac5acbae980fdeae6181c18e333c2ea3ff51e\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 24 23:37:12.421784 containerd[1519]: time="2026-04-24T23:37:12.421384146Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5c6df7864b-wqqbk,Uid:7a6dcb56-0ab0-4a8e-b283-aa8220fb3da6,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"580c99d506b6ae3bc232bba4e40ac5acbae980fdeae6181c18e333c2ea3ff51e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 24 23:37:12.421784 containerd[1519]: time="2026-04-24T23:37:12.421711607Z" level=error msg="Failed to destroy network for sandbox \"e04ce8dbea7e66e205736c48b37a09b9550778b585d23e4a8f841cc6c602744b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 24 23:37:12.422161 containerd[1519]: time="2026-04-24T23:37:12.422143591Z" level=error msg="encountered an error cleaning up failed sandbox \"e04ce8dbea7e66e205736c48b37a09b9550778b585d23e4a8f841cc6c602744b\", marking sandbox state as 
SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 24 23:37:12.422250 containerd[1519]: time="2026-04-24T23:37:12.422233526Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-b8b79f554-9kfjb,Uid:31b59c84-3007-40dc-82d8-2444bb185840,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"e04ce8dbea7e66e205736c48b37a09b9550778b585d23e4a8f841cc6c602744b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 24 23:37:12.422820 kubelet[2565]: E0424 23:37:12.422773 2565 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e04ce8dbea7e66e205736c48b37a09b9550778b585d23e4a8f841cc6c602744b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 24 23:37:12.422904 kubelet[2565]: E0424 23:37:12.422836 2565 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e04ce8dbea7e66e205736c48b37a09b9550778b585d23e4a8f841cc6c602744b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-b8b79f554-9kfjb" Apr 24 23:37:12.422904 kubelet[2565]: E0424 23:37:12.422856 2565 kuberuntime_manager.go:1558] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e04ce8dbea7e66e205736c48b37a09b9550778b585d23e4a8f841cc6c602744b\": plugin type=\"calico\" 
failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-b8b79f554-9kfjb" Apr 24 23:37:12.422960 kubelet[2565]: E0424 23:37:12.422919 2565 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-b8b79f554-9kfjb_calico-system(31b59c84-3007-40dc-82d8-2444bb185840)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-b8b79f554-9kfjb_calico-system(31b59c84-3007-40dc-82d8-2444bb185840)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"e04ce8dbea7e66e205736c48b37a09b9550778b585d23e4a8f841cc6c602744b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-apiserver-b8b79f554-9kfjb" podUID="31b59c84-3007-40dc-82d8-2444bb185840" Apr 24 23:37:12.423284 kubelet[2565]: E0424 23:37:12.423164 2565 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"580c99d506b6ae3bc232bba4e40ac5acbae980fdeae6181c18e333c2ea3ff51e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 24 23:37:12.423284 kubelet[2565]: E0424 23:37:12.423195 2565 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"580c99d506b6ae3bc232bba4e40ac5acbae980fdeae6181c18e333c2ea3ff51e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-5c6df7864b-wqqbk" Apr 24 
23:37:12.423284 kubelet[2565]: E0424 23:37:12.423220 2565 kuberuntime_manager.go:1558] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"580c99d506b6ae3bc232bba4e40ac5acbae980fdeae6181c18e333c2ea3ff51e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-5c6df7864b-wqqbk" Apr 24 23:37:12.423367 kubelet[2565]: E0424 23:37:12.423257 2565 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-5c6df7864b-wqqbk_calico-system(7a6dcb56-0ab0-4a8e-b283-aa8220fb3da6)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-5c6df7864b-wqqbk_calico-system(7a6dcb56-0ab0-4a8e-b283-aa8220fb3da6)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"580c99d506b6ae3bc232bba4e40ac5acbae980fdeae6181c18e333c2ea3ff51e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-5c6df7864b-wqqbk" podUID="7a6dcb56-0ab0-4a8e-b283-aa8220fb3da6" Apr 24 23:37:12.447942 containerd[1519]: time="2026-04-24T23:37:12.446999463Z" level=error msg="Failed to destroy network for sandbox \"c22a8a4244023b68d3743ed8e5f6920d50390e91544dcef20cfa3f9f4f630120\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 24 23:37:12.448802 containerd[1519]: time="2026-04-24T23:37:12.448305078Z" level=error msg="encountered an error cleaning up failed sandbox \"c22a8a4244023b68d3743ed8e5f6920d50390e91544dcef20cfa3f9f4f630120\", marking sandbox state as 
SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 24 23:37:12.448802 containerd[1519]: time="2026-04-24T23:37:12.448357560Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-747cf497b9-tkkg6,Uid:3e8ae2af-1fd8-48e0-bd54-3c4f242bc500,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"c22a8a4244023b68d3743ed8e5f6920d50390e91544dcef20cfa3f9f4f630120\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 24 23:37:12.448934 kubelet[2565]: E0424 23:37:12.448585 2565 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c22a8a4244023b68d3743ed8e5f6920d50390e91544dcef20cfa3f9f4f630120\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 24 23:37:12.448934 kubelet[2565]: E0424 23:37:12.448718 2565 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c22a8a4244023b68d3743ed8e5f6920d50390e91544dcef20cfa3f9f4f630120\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-747cf497b9-tkkg6" Apr 24 23:37:12.448934 kubelet[2565]: E0424 23:37:12.448772 2565 kuberuntime_manager.go:1558] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c22a8a4244023b68d3743ed8e5f6920d50390e91544dcef20cfa3f9f4f630120\": plugin type=\"calico\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-747cf497b9-tkkg6" Apr 24 23:37:12.450058 kubelet[2565]: E0424 23:37:12.448820 2565 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-747cf497b9-tkkg6_calico-system(3e8ae2af-1fd8-48e0-bd54-3c4f242bc500)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-747cf497b9-tkkg6_calico-system(3e8ae2af-1fd8-48e0-bd54-3c4f242bc500)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"c22a8a4244023b68d3743ed8e5f6920d50390e91544dcef20cfa3f9f4f630120\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-747cf497b9-tkkg6" podUID="3e8ae2af-1fd8-48e0-bd54-3c4f242bc500" Apr 24 23:37:12.455486 containerd[1519]: time="2026-04-24T23:37:12.455407122Z" level=error msg="Failed to destroy network for sandbox \"ca9a9a8d8075ef0186a0d1ee248aa46080085846c9740d194dc8b6d6023cfeeb\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 24 23:37:12.455801 containerd[1519]: time="2026-04-24T23:37:12.455770748Z" level=error msg="encountered an error cleaning up failed sandbox \"ca9a9a8d8075ef0186a0d1ee248aa46080085846c9740d194dc8b6d6023cfeeb\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 24 23:37:12.455855 containerd[1519]: time="2026-04-24T23:37:12.455806410Z" level=error msg="RunPodSandbox for 
&PodSandboxMetadata{Name:goldmane-9f7667bb8-j6dz5,Uid:133141ef-2c84-4659-9f62-e85180cc6f03,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"ca9a9a8d8075ef0186a0d1ee248aa46080085846c9740d194dc8b6d6023cfeeb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 24 23:37:12.456018 kubelet[2565]: E0424 23:37:12.455973 2565 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ca9a9a8d8075ef0186a0d1ee248aa46080085846c9740d194dc8b6d6023cfeeb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 24 23:37:12.456075 kubelet[2565]: E0424 23:37:12.456027 2565 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ca9a9a8d8075ef0186a0d1ee248aa46080085846c9740d194dc8b6d6023cfeeb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-9f7667bb8-j6dz5" Apr 24 23:37:12.456075 kubelet[2565]: E0424 23:37:12.456041 2565 kuberuntime_manager.go:1558] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ca9a9a8d8075ef0186a0d1ee248aa46080085846c9740d194dc8b6d6023cfeeb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-9f7667bb8-j6dz5" Apr 24 23:37:12.456135 kubelet[2565]: E0424 23:37:12.456076 2565 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to 
\"CreatePodSandbox\" for \"goldmane-9f7667bb8-j6dz5_calico-system(133141ef-2c84-4659-9f62-e85180cc6f03)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-9f7667bb8-j6dz5_calico-system(133141ef-2c84-4659-9f62-e85180cc6f03)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ca9a9a8d8075ef0186a0d1ee248aa46080085846c9740d194dc8b6d6023cfeeb\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-9f7667bb8-j6dz5" podUID="133141ef-2c84-4659-9f62-e85180cc6f03" Apr 24 23:37:12.482696 systemd[1]: Created slice kubepods-besteffort-podb8636bb0_afe6_4f7c_8c11_499ada35dc8f.slice - libcontainer container kubepods-besteffort-podb8636bb0_afe6_4f7c_8c11_499ada35dc8f.slice. Apr 24 23:37:12.486350 containerd[1519]: time="2026-04-24T23:37:12.486314214Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-kfkvn,Uid:b8636bb0-afe6-4f7c-8c11-499ada35dc8f,Namespace:calico-system,Attempt:0,}" Apr 24 23:37:12.533705 containerd[1519]: time="2026-04-24T23:37:12.533636811Z" level=error msg="Failed to destroy network for sandbox \"b5a84b94e2911505d7c2e27ffa2fcc1aad33a354d0eb8f5e6d248f50bed41bc5\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 24 23:37:12.533994 containerd[1519]: time="2026-04-24T23:37:12.533967722Z" level=error msg="encountered an error cleaning up failed sandbox \"b5a84b94e2911505d7c2e27ffa2fcc1aad33a354d0eb8f5e6d248f50bed41bc5\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 24 23:37:12.534033 containerd[1519]: 
time="2026-04-24T23:37:12.534008853Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-kfkvn,Uid:b8636bb0-afe6-4f7c-8c11-499ada35dc8f,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"b5a84b94e2911505d7c2e27ffa2fcc1aad33a354d0eb8f5e6d248f50bed41bc5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 24 23:37:12.534572 kubelet[2565]: E0424 23:37:12.534205 2565 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b5a84b94e2911505d7c2e27ffa2fcc1aad33a354d0eb8f5e6d248f50bed41bc5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 24 23:37:12.534572 kubelet[2565]: E0424 23:37:12.534253 2565 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b5a84b94e2911505d7c2e27ffa2fcc1aad33a354d0eb8f5e6d248f50bed41bc5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-kfkvn" Apr 24 23:37:12.534572 kubelet[2565]: E0424 23:37:12.534270 2565 kuberuntime_manager.go:1558] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b5a84b94e2911505d7c2e27ffa2fcc1aad33a354d0eb8f5e6d248f50bed41bc5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-kfkvn" Apr 24 23:37:12.534669 kubelet[2565]: E0424 23:37:12.534313 2565 
pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-kfkvn_calico-system(b8636bb0-afe6-4f7c-8c11-499ada35dc8f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-kfkvn_calico-system(b8636bb0-afe6-4f7c-8c11-499ada35dc8f)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"b5a84b94e2911505d7c2e27ffa2fcc1aad33a354d0eb8f5e6d248f50bed41bc5\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-kfkvn" podUID="b8636bb0-afe6-4f7c-8c11-499ada35dc8f" Apr 24 23:37:12.605363 kubelet[2565]: I0424 23:37:12.605298 2565 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e04ce8dbea7e66e205736c48b37a09b9550778b585d23e4a8f841cc6c602744b" Apr 24 23:37:12.606751 containerd[1519]: time="2026-04-24T23:37:12.606671427Z" level=info msg="StopPodSandbox for \"e04ce8dbea7e66e205736c48b37a09b9550778b585d23e4a8f841cc6c602744b\"" Apr 24 23:37:12.607382 containerd[1519]: time="2026-04-24T23:37:12.606993440Z" level=info msg="Ensure that sandbox e04ce8dbea7e66e205736c48b37a09b9550778b585d23e4a8f841cc6c602744b in task-service has been cleanup successfully" Apr 24 23:37:12.612528 kubelet[2565]: I0424 23:37:12.612482 2565 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f8e54c4b6682d32363b4f32066817a94de3ced48e2849c1757984fba84e1469" Apr 24 23:37:12.614540 containerd[1519]: time="2026-04-24T23:37:12.613930516Z" level=info msg="StopPodSandbox for \"5f8e54c4b6682d32363b4f32066817a94de3ced48e2849c1757984fba84e1469\"" Apr 24 23:37:12.614540 containerd[1519]: time="2026-04-24T23:37:12.614187761Z" level=info msg="Ensure that sandbox 5f8e54c4b6682d32363b4f32066817a94de3ced48e2849c1757984fba84e1469 in task-service has been cleanup successfully" Apr 24 
23:37:12.631570 kubelet[2565]: I0424 23:37:12.631551 2565 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="580c99d506b6ae3bc232bba4e40ac5acbae980fdeae6181c18e333c2ea3ff51e" Apr 24 23:37:12.632493 containerd[1519]: time="2026-04-24T23:37:12.632474174Z" level=info msg="StopPodSandbox for \"580c99d506b6ae3bc232bba4e40ac5acbae980fdeae6181c18e333c2ea3ff51e\"" Apr 24 23:37:12.632909 containerd[1519]: time="2026-04-24T23:37:12.632894616Z" level=info msg="Ensure that sandbox 580c99d506b6ae3bc232bba4e40ac5acbae980fdeae6181c18e333c2ea3ff51e in task-service has been cleanup successfully" Apr 24 23:37:12.634517 kubelet[2565]: I0424 23:37:12.634124 2565 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fa684ee016e7b8f4346eadb33b70fe00c046fd186c92942b1887a6a0d224c38c" Apr 24 23:37:12.634851 containerd[1519]: time="2026-04-24T23:37:12.634835580Z" level=info msg="StopPodSandbox for \"fa684ee016e7b8f4346eadb33b70fe00c046fd186c92942b1887a6a0d224c38c\"" Apr 24 23:37:12.635917 containerd[1519]: time="2026-04-24T23:37:12.635866587Z" level=info msg="Ensure that sandbox fa684ee016e7b8f4346eadb33b70fe00c046fd186c92942b1887a6a0d224c38c in task-service has been cleanup successfully" Apr 24 23:37:12.646699 kubelet[2565]: I0424 23:37:12.644645 2565 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="436f63d90010a4b07bfa47b4249043414bf5ab13c65f0dbb63f7a35e4c91a1a7" Apr 24 23:37:12.646757 containerd[1519]: time="2026-04-24T23:37:12.646398053Z" level=info msg="StopPodSandbox for \"436f63d90010a4b07bfa47b4249043414bf5ab13c65f0dbb63f7a35e4c91a1a7\"" Apr 24 23:37:12.646757 containerd[1519]: time="2026-04-24T23:37:12.646515787Z" level=info msg="Ensure that sandbox 436f63d90010a4b07bfa47b4249043414bf5ab13c65f0dbb63f7a35e4c91a1a7 in task-service has been cleanup successfully" Apr 24 23:37:12.655997 kubelet[2565]: I0424 23:37:12.655972 2565 pod_container_deletor.go:80] "Container not found in pod's 
containers" containerID="b5a84b94e2911505d7c2e27ffa2fcc1aad33a354d0eb8f5e6d248f50bed41bc5" Apr 24 23:37:12.656893 containerd[1519]: time="2026-04-24T23:37:12.656861916Z" level=info msg="StopPodSandbox for \"b5a84b94e2911505d7c2e27ffa2fcc1aad33a354d0eb8f5e6d248f50bed41bc5\"" Apr 24 23:37:12.657046 containerd[1519]: time="2026-04-24T23:37:12.657012748Z" level=info msg="Ensure that sandbox b5a84b94e2911505d7c2e27ffa2fcc1aad33a354d0eb8f5e6d248f50bed41bc5 in task-service has been cleanup successfully" Apr 24 23:37:12.662905 containerd[1519]: time="2026-04-24T23:37:12.662854510Z" level=info msg="CreateContainer within sandbox \"6851cbeda857f9a0cb3a1669fb5671016e8e02f41f66d2981a1dd2b1b7914241\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Apr 24 23:37:12.665673 kubelet[2565]: I0424 23:37:12.665641 2565 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ca9a9a8d8075ef0186a0d1ee248aa46080085846c9740d194dc8b6d6023cfeeb" Apr 24 23:37:12.671571 containerd[1519]: time="2026-04-24T23:37:12.671541351Z" level=info msg="StopPodSandbox for \"ca9a9a8d8075ef0186a0d1ee248aa46080085846c9740d194dc8b6d6023cfeeb\"" Apr 24 23:37:12.671805 containerd[1519]: time="2026-04-24T23:37:12.671776637Z" level=info msg="Ensure that sandbox ca9a9a8d8075ef0186a0d1ee248aa46080085846c9740d194dc8b6d6023cfeeb in task-service has been cleanup successfully" Apr 24 23:37:12.682505 kubelet[2565]: I0424 23:37:12.682480 2565 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c22a8a4244023b68d3743ed8e5f6920d50390e91544dcef20cfa3f9f4f630120" Apr 24 23:37:12.684622 containerd[1519]: time="2026-04-24T23:37:12.684603243Z" level=info msg="StopPodSandbox for \"c22a8a4244023b68d3743ed8e5f6920d50390e91544dcef20cfa3f9f4f630120\"" Apr 24 23:37:12.684814 containerd[1519]: time="2026-04-24T23:37:12.684802626Z" level=info msg="Ensure that sandbox c22a8a4244023b68d3743ed8e5f6920d50390e91544dcef20cfa3f9f4f630120 in task-service has been cleanup 
successfully" Apr 24 23:37:12.712980 containerd[1519]: time="2026-04-24T23:37:12.712868469Z" level=error msg="StopPodSandbox for \"580c99d506b6ae3bc232bba4e40ac5acbae980fdeae6181c18e333c2ea3ff51e\" failed" error="failed to destroy network for sandbox \"580c99d506b6ae3bc232bba4e40ac5acbae980fdeae6181c18e333c2ea3ff51e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 24 23:37:12.713518 kubelet[2565]: E0424 23:37:12.713079 2565 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"580c99d506b6ae3bc232bba4e40ac5acbae980fdeae6181c18e333c2ea3ff51e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="580c99d506b6ae3bc232bba4e40ac5acbae980fdeae6181c18e333c2ea3ff51e" Apr 24 23:37:12.713518 kubelet[2565]: E0424 23:37:12.713117 2565 kuberuntime_manager.go:1881] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"580c99d506b6ae3bc232bba4e40ac5acbae980fdeae6181c18e333c2ea3ff51e"} Apr 24 23:37:12.713518 kubelet[2565]: E0424 23:37:12.713175 2565 kuberuntime_manager.go:1422] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"7a6dcb56-0ab0-4a8e-b283-aa8220fb3da6\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"580c99d506b6ae3bc232bba4e40ac5acbae980fdeae6181c18e333c2ea3ff51e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Apr 24 23:37:12.713518 kubelet[2565]: E0424 23:37:12.713193 2565 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for 
\"7a6dcb56-0ab0-4a8e-b283-aa8220fb3da6\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"580c99d506b6ae3bc232bba4e40ac5acbae980fdeae6181c18e333c2ea3ff51e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-5c6df7864b-wqqbk" podUID="7a6dcb56-0ab0-4a8e-b283-aa8220fb3da6" Apr 24 23:37:12.717050 containerd[1519]: time="2026-04-24T23:37:12.716959754Z" level=info msg="CreateContainer within sandbox \"6851cbeda857f9a0cb3a1669fb5671016e8e02f41f66d2981a1dd2b1b7914241\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"0456a6ae3f2d716bfb891c2c5b8c50f3627c5b8f50efed751a84bae026cc4854\"" Apr 24 23:37:12.718065 containerd[1519]: time="2026-04-24T23:37:12.717963203Z" level=info msg="StartContainer for \"0456a6ae3f2d716bfb891c2c5b8c50f3627c5b8f50efed751a84bae026cc4854\"" Apr 24 23:37:12.748536 containerd[1519]: time="2026-04-24T23:37:12.748494241Z" level=error msg="StopPodSandbox for \"436f63d90010a4b07bfa47b4249043414bf5ab13c65f0dbb63f7a35e4c91a1a7\" failed" error="failed to destroy network for sandbox \"436f63d90010a4b07bfa47b4249043414bf5ab13c65f0dbb63f7a35e4c91a1a7\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 24 23:37:12.748970 kubelet[2565]: E0424 23:37:12.748650 2565 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"436f63d90010a4b07bfa47b4249043414bf5ab13c65f0dbb63f7a35e4c91a1a7\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
podSandboxID="436f63d90010a4b07bfa47b4249043414bf5ab13c65f0dbb63f7a35e4c91a1a7" Apr 24 23:37:12.748970 kubelet[2565]: E0424 23:37:12.748689 2565 kuberuntime_manager.go:1881] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"436f63d90010a4b07bfa47b4249043414bf5ab13c65f0dbb63f7a35e4c91a1a7"} Apr 24 23:37:12.748970 kubelet[2565]: E0424 23:37:12.748711 2565 kuberuntime_manager.go:1422] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"7bc41b28-ad74-4dc3-b94f-b24620d3a9ff\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"436f63d90010a4b07bfa47b4249043414bf5ab13c65f0dbb63f7a35e4c91a1a7\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Apr 24 23:37:12.748970 kubelet[2565]: E0424 23:37:12.748730 2565 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"7bc41b28-ad74-4dc3-b94f-b24620d3a9ff\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"436f63d90010a4b07bfa47b4249043414bf5ab13c65f0dbb63f7a35e4c91a1a7\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-apiserver-b8b79f554-vhwl6" podUID="7bc41b28-ad74-4dc3-b94f-b24620d3a9ff" Apr 24 23:37:12.752368 containerd[1519]: time="2026-04-24T23:37:12.752289864Z" level=error msg="StopPodSandbox for \"ca9a9a8d8075ef0186a0d1ee248aa46080085846c9740d194dc8b6d6023cfeeb\" failed" error="failed to destroy network for sandbox \"ca9a9a8d8075ef0186a0d1ee248aa46080085846c9740d194dc8b6d6023cfeeb\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/" Apr 24 23:37:12.752617 kubelet[2565]: E0424 23:37:12.752510 2565 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ca9a9a8d8075ef0186a0d1ee248aa46080085846c9740d194dc8b6d6023cfeeb\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ca9a9a8d8075ef0186a0d1ee248aa46080085846c9740d194dc8b6d6023cfeeb" Apr 24 23:37:12.752617 kubelet[2565]: E0424 23:37:12.752562 2565 kuberuntime_manager.go:1881] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"ca9a9a8d8075ef0186a0d1ee248aa46080085846c9740d194dc8b6d6023cfeeb"} Apr 24 23:37:12.752617 kubelet[2565]: E0424 23:37:12.752581 2565 kuberuntime_manager.go:1422] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"133141ef-2c84-4659-9f62-e85180cc6f03\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ca9a9a8d8075ef0186a0d1ee248aa46080085846c9740d194dc8b6d6023cfeeb\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Apr 24 23:37:12.752617 kubelet[2565]: E0424 23:37:12.752598 2565 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"133141ef-2c84-4659-9f62-e85180cc6f03\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ca9a9a8d8075ef0186a0d1ee248aa46080085846c9740d194dc8b6d6023cfeeb\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-9f7667bb8-j6dz5" podUID="133141ef-2c84-4659-9f62-e85180cc6f03" Apr 24 
23:37:12.757527 containerd[1519]: time="2026-04-24T23:37:12.757492215Z" level=error msg="StopPodSandbox for \"e04ce8dbea7e66e205736c48b37a09b9550778b585d23e4a8f841cc6c602744b\" failed" error="failed to destroy network for sandbox \"e04ce8dbea7e66e205736c48b37a09b9550778b585d23e4a8f841cc6c602744b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 24 23:37:12.757994 kubelet[2565]: E0424 23:37:12.757617 2565 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"e04ce8dbea7e66e205736c48b37a09b9550778b585d23e4a8f841cc6c602744b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="e04ce8dbea7e66e205736c48b37a09b9550778b585d23e4a8f841cc6c602744b" Apr 24 23:37:12.757994 kubelet[2565]: E0424 23:37:12.757647 2565 kuberuntime_manager.go:1881] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"e04ce8dbea7e66e205736c48b37a09b9550778b585d23e4a8f841cc6c602744b"} Apr 24 23:37:12.757994 kubelet[2565]: E0424 23:37:12.757665 2565 kuberuntime_manager.go:1422] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"31b59c84-3007-40dc-82d8-2444bb185840\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"e04ce8dbea7e66e205736c48b37a09b9550778b585d23e4a8f841cc6c602744b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Apr 24 23:37:12.757994 kubelet[2565]: E0424 23:37:12.757686 2565 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"31b59c84-3007-40dc-82d8-2444bb185840\" with 
KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"e04ce8dbea7e66e205736c48b37a09b9550778b585d23e4a8f841cc6c602744b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-apiserver-b8b79f554-9kfjb" podUID="31b59c84-3007-40dc-82d8-2444bb185840" Apr 24 23:37:12.765815 containerd[1519]: time="2026-04-24T23:37:12.765774711Z" level=error msg="StopPodSandbox for \"c22a8a4244023b68d3743ed8e5f6920d50390e91544dcef20cfa3f9f4f630120\" failed" error="failed to destroy network for sandbox \"c22a8a4244023b68d3743ed8e5f6920d50390e91544dcef20cfa3f9f4f630120\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 24 23:37:12.765956 kubelet[2565]: E0424 23:37:12.765921 2565 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"c22a8a4244023b68d3743ed8e5f6920d50390e91544dcef20cfa3f9f4f630120\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="c22a8a4244023b68d3743ed8e5f6920d50390e91544dcef20cfa3f9f4f630120" Apr 24 23:37:12.765990 kubelet[2565]: E0424 23:37:12.765980 2565 kuberuntime_manager.go:1881] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"c22a8a4244023b68d3743ed8e5f6920d50390e91544dcef20cfa3f9f4f630120"} Apr 24 23:37:12.766062 kubelet[2565]: E0424 23:37:12.766002 2565 kuberuntime_manager.go:1422] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"3e8ae2af-1fd8-48e0-bd54-3c4f242bc500\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox 
\\\"c22a8a4244023b68d3743ed8e5f6920d50390e91544dcef20cfa3f9f4f630120\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Apr 24 23:37:12.766062 kubelet[2565]: E0424 23:37:12.766020 2565 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"3e8ae2af-1fd8-48e0-bd54-3c4f242bc500\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"c22a8a4244023b68d3743ed8e5f6920d50390e91544dcef20cfa3f9f4f630120\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-747cf497b9-tkkg6" podUID="3e8ae2af-1fd8-48e0-bd54-3c4f242bc500" Apr 24 23:37:12.769509 containerd[1519]: time="2026-04-24T23:37:12.769391090Z" level=error msg="StopPodSandbox for \"5f8e54c4b6682d32363b4f32066817a94de3ced48e2849c1757984fba84e1469\" failed" error="failed to destroy network for sandbox \"5f8e54c4b6682d32363b4f32066817a94de3ced48e2849c1757984fba84e1469\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 24 23:37:12.769917 kubelet[2565]: E0424 23:37:12.769654 2565 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"5f8e54c4b6682d32363b4f32066817a94de3ced48e2849c1757984fba84e1469\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="5f8e54c4b6682d32363b4f32066817a94de3ced48e2849c1757984fba84e1469" Apr 24 23:37:12.769917 kubelet[2565]: E0424 23:37:12.769681 2565 
kuberuntime_manager.go:1881] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"5f8e54c4b6682d32363b4f32066817a94de3ced48e2849c1757984fba84e1469"} Apr 24 23:37:12.769917 kubelet[2565]: E0424 23:37:12.769735 2565 kuberuntime_manager.go:1422] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"9e14ced5-8d3f-4116-8156-faf06229fa0d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5f8e54c4b6682d32363b4f32066817a94de3ced48e2849c1757984fba84e1469\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Apr 24 23:37:12.769917 kubelet[2565]: E0424 23:37:12.769751 2565 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"9e14ced5-8d3f-4116-8156-faf06229fa0d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5f8e54c4b6682d32363b4f32066817a94de3ced48e2849c1757984fba84e1469\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7d764666f9-dr55n" podUID="9e14ced5-8d3f-4116-8156-faf06229fa0d" Apr 24 23:37:12.770547 containerd[1519]: time="2026-04-24T23:37:12.770522186Z" level=error msg="StopPodSandbox for \"b5a84b94e2911505d7c2e27ffa2fcc1aad33a354d0eb8f5e6d248f50bed41bc5\" failed" error="failed to destroy network for sandbox \"b5a84b94e2911505d7c2e27ffa2fcc1aad33a354d0eb8f5e6d248f50bed41bc5\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 24 23:37:12.770640 kubelet[2565]: E0424 23:37:12.770619 2565 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = 
Unknown desc = failed to destroy network for sandbox \"b5a84b94e2911505d7c2e27ffa2fcc1aad33a354d0eb8f5e6d248f50bed41bc5\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="b5a84b94e2911505d7c2e27ffa2fcc1aad33a354d0eb8f5e6d248f50bed41bc5" Apr 24 23:37:12.770685 kubelet[2565]: E0424 23:37:12.770641 2565 kuberuntime_manager.go:1881] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"b5a84b94e2911505d7c2e27ffa2fcc1aad33a354d0eb8f5e6d248f50bed41bc5"} Apr 24 23:37:12.770685 kubelet[2565]: E0424 23:37:12.770659 2565 kuberuntime_manager.go:1422] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"b8636bb0-afe6-4f7c-8c11-499ada35dc8f\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b5a84b94e2911505d7c2e27ffa2fcc1aad33a354d0eb8f5e6d248f50bed41bc5\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Apr 24 23:37:12.770685 kubelet[2565]: E0424 23:37:12.770673 2565 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"b8636bb0-afe6-4f7c-8c11-499ada35dc8f\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b5a84b94e2911505d7c2e27ffa2fcc1aad33a354d0eb8f5e6d248f50bed41bc5\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-kfkvn" podUID="b8636bb0-afe6-4f7c-8c11-499ada35dc8f" Apr 24 23:37:12.770979 containerd[1519]: time="2026-04-24T23:37:12.770951737Z" level=error msg="StopPodSandbox for \"fa684ee016e7b8f4346eadb33b70fe00c046fd186c92942b1887a6a0d224c38c\" 
failed" error="failed to destroy network for sandbox \"fa684ee016e7b8f4346eadb33b70fe00c046fd186c92942b1887a6a0d224c38c\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Apr 24 23:37:12.771168 kubelet[2565]: E0424 23:37:12.771144 2565 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"fa684ee016e7b8f4346eadb33b70fe00c046fd186c92942b1887a6a0d224c38c\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="fa684ee016e7b8f4346eadb33b70fe00c046fd186c92942b1887a6a0d224c38c" Apr 24 23:37:12.771168 kubelet[2565]: E0424 23:37:12.771167 2565 kuberuntime_manager.go:1881] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"fa684ee016e7b8f4346eadb33b70fe00c046fd186c92942b1887a6a0d224c38c"} Apr 24 23:37:12.771238 kubelet[2565]: E0424 23:37:12.771191 2565 kuberuntime_manager.go:1422] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"df938cae-ab41-409b-8afe-e6b3147b7b45\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"fa684ee016e7b8f4346eadb33b70fe00c046fd186c92942b1887a6a0d224c38c\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Apr 24 23:37:12.771238 kubelet[2565]: E0424 23:37:12.771205 2565 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"df938cae-ab41-409b-8afe-e6b3147b7b45\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"fa684ee016e7b8f4346eadb33b70fe00c046fd186c92942b1887a6a0d224c38c\\\": plugin 
type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7d764666f9-rvpdv" podUID="df938cae-ab41-409b-8afe-e6b3147b7b45" Apr 24 23:37:12.781532 systemd[1]: Started cri-containerd-0456a6ae3f2d716bfb891c2c5b8c50f3627c5b8f50efed751a84bae026cc4854.scope - libcontainer container 0456a6ae3f2d716bfb891c2c5b8c50f3627c5b8f50efed751a84bae026cc4854. Apr 24 23:37:12.811147 containerd[1519]: time="2026-04-24T23:37:12.811120623Z" level=info msg="StartContainer for \"0456a6ae3f2d716bfb891c2c5b8c50f3627c5b8f50efed751a84bae026cc4854\" returns successfully" Apr 24 23:37:13.198156 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-e04ce8dbea7e66e205736c48b37a09b9550778b585d23e4a8f841cc6c602744b-shm.mount: Deactivated successfully. Apr 24 23:37:13.198696 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-580c99d506b6ae3bc232bba4e40ac5acbae980fdeae6181c18e333c2ea3ff51e-shm.mount: Deactivated successfully. Apr 24 23:37:13.198840 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-436f63d90010a4b07bfa47b4249043414bf5ab13c65f0dbb63f7a35e4c91a1a7-shm.mount: Deactivated successfully. Apr 24 23:37:13.198973 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-5f8e54c4b6682d32363b4f32066817a94de3ced48e2849c1757984fba84e1469-shm.mount: Deactivated successfully. 
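The burst of StopPodSandbox failures above all report the same root cause (a missing /var/lib/calico/nodename, i.e. the calico/node container was not yet up), differing only in which pod is affected. A minimal Python sketch, assuming the kubelet "Error syncing pod, skipping" format shown above (the regex and helper name are my own, not part of kubelet):

```python
import re

# Matches the pod reference and UID that kubelet appends to each
# "Error syncing pod, skipping" record, as seen in the log above.
POD_RE = re.compile(r'pod="(?P<pod>[^"]+)" podUID="(?P<uid>[0-9a-f-]{36})"')

def failing_pods(journal_text: str) -> dict:
    """Map pod name -> pod UID for every matching record in the excerpt."""
    return {m.group("pod"): m.group("uid") for m in POD_RE.finditer(journal_text)}

# Sample line copied from the log above.
sample = ('pod="calico-system/whisker-747cf497b9-tkkg6" '
          'podUID="3e8ae2af-1fd8-48e0-bd54-3c4f242bc500"')
print(failing_pods(sample))
```

Run over the full excerpt, this would deduplicate the repeated errors down to the handful of distinct pods stuck in teardown.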
Apr 24 23:37:13.690812 containerd[1519]: time="2026-04-24T23:37:13.689660414Z" level=info msg="StopPodSandbox for \"c22a8a4244023b68d3743ed8e5f6920d50390e91544dcef20cfa3f9f4f630120\"" Apr 24 23:37:13.770034 kubelet[2565]: I0424 23:37:13.769921 2565 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="calico-system/calico-node-h74r6" podStartSLOduration=2.126573021 podStartE2EDuration="15.769909347s" podCreationTimestamp="2026-04-24 23:36:58 +0000 UTC" firstStartedPulling="2026-04-24 23:36:58.984508176 +0000 UTC m=+17.596566981" lastFinishedPulling="2026-04-24 23:37:12.627844513 +0000 UTC m=+31.239903307" observedRunningTime="2026-04-24 23:37:13.746855914 +0000 UTC m=+32.358914718" watchObservedRunningTime="2026-04-24 23:37:13.769909347 +0000 UTC m=+32.381968151" Apr 24 23:37:13.800112 containerd[1519]: 2026-04-24 23:37:13.768 [INFO][3925] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="c22a8a4244023b68d3743ed8e5f6920d50390e91544dcef20cfa3f9f4f630120" Apr 24 23:37:13.800112 containerd[1519]: 2026-04-24 23:37:13.769 [INFO][3925] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="c22a8a4244023b68d3743ed8e5f6920d50390e91544dcef20cfa3f9f4f630120" iface="eth0" netns="/var/run/netns/cni-e3cbc41d-34b9-8b96-99d8-f39310a5c022" Apr 24 23:37:13.800112 containerd[1519]: 2026-04-24 23:37:13.769 [INFO][3925] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="c22a8a4244023b68d3743ed8e5f6920d50390e91544dcef20cfa3f9f4f630120" iface="eth0" netns="/var/run/netns/cni-e3cbc41d-34b9-8b96-99d8-f39310a5c022" Apr 24 23:37:13.800112 containerd[1519]: 2026-04-24 23:37:13.769 [INFO][3925] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="c22a8a4244023b68d3743ed8e5f6920d50390e91544dcef20cfa3f9f4f630120" iface="eth0" netns="/var/run/netns/cni-e3cbc41d-34b9-8b96-99d8-f39310a5c022" Apr 24 23:37:13.800112 containerd[1519]: 2026-04-24 23:37:13.769 [INFO][3925] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="c22a8a4244023b68d3743ed8e5f6920d50390e91544dcef20cfa3f9f4f630120" Apr 24 23:37:13.800112 containerd[1519]: 2026-04-24 23:37:13.769 [INFO][3925] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="c22a8a4244023b68d3743ed8e5f6920d50390e91544dcef20cfa3f9f4f630120" Apr 24 23:37:13.800112 containerd[1519]: 2026-04-24 23:37:13.789 [INFO][3933] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="c22a8a4244023b68d3743ed8e5f6920d50390e91544dcef20cfa3f9f4f630120" HandleID="k8s-pod-network.c22a8a4244023b68d3743ed8e5f6920d50390e91544dcef20cfa3f9f4f630120" Workload="ci--4081--3--6--n--6f01bfed3c-k8s-whisker--747cf497b9--tkkg6-eth0" Apr 24 23:37:13.800112 containerd[1519]: 2026-04-24 23:37:13.789 [INFO][3933] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 24 23:37:13.800112 containerd[1519]: 2026-04-24 23:37:13.789 [INFO][3933] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 24 23:37:13.800112 containerd[1519]: 2026-04-24 23:37:13.793 [WARNING][3933] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="c22a8a4244023b68d3743ed8e5f6920d50390e91544dcef20cfa3f9f4f630120" HandleID="k8s-pod-network.c22a8a4244023b68d3743ed8e5f6920d50390e91544dcef20cfa3f9f4f630120" Workload="ci--4081--3--6--n--6f01bfed3c-k8s-whisker--747cf497b9--tkkg6-eth0" Apr 24 23:37:13.800112 containerd[1519]: 2026-04-24 23:37:13.793 [INFO][3933] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="c22a8a4244023b68d3743ed8e5f6920d50390e91544dcef20cfa3f9f4f630120" HandleID="k8s-pod-network.c22a8a4244023b68d3743ed8e5f6920d50390e91544dcef20cfa3f9f4f630120" Workload="ci--4081--3--6--n--6f01bfed3c-k8s-whisker--747cf497b9--tkkg6-eth0" Apr 24 23:37:13.800112 containerd[1519]: 2026-04-24 23:37:13.795 [INFO][3933] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 24 23:37:13.800112 containerd[1519]: 2026-04-24 23:37:13.798 [INFO][3925] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="c22a8a4244023b68d3743ed8e5f6920d50390e91544dcef20cfa3f9f4f630120" Apr 24 23:37:13.802534 containerd[1519]: time="2026-04-24T23:37:13.802467089Z" level=info msg="TearDown network for sandbox \"c22a8a4244023b68d3743ed8e5f6920d50390e91544dcef20cfa3f9f4f630120\" successfully" Apr 24 23:37:13.802534 containerd[1519]: time="2026-04-24T23:37:13.802504635Z" level=info msg="StopPodSandbox for \"c22a8a4244023b68d3743ed8e5f6920d50390e91544dcef20cfa3f9f4f630120\" returns successfully" Apr 24 23:37:13.802916 systemd[1]: run-netns-cni\x2de3cbc41d\x2d34b9\x2d8b96\x2d99d8\x2df39310a5c022.mount: Deactivated successfully. 
Apr 24 23:37:13.919149 kubelet[2565]: I0424 23:37:13.919013 2565 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"kubernetes.io/projected/3e8ae2af-1fd8-48e0-bd54-3c4f242bc500-kube-api-access-t4s4q\" (UniqueName: \"kubernetes.io/projected/3e8ae2af-1fd8-48e0-bd54-3c4f242bc500-kube-api-access-t4s4q\") pod \"3e8ae2af-1fd8-48e0-bd54-3c4f242bc500\" (UID: \"3e8ae2af-1fd8-48e0-bd54-3c4f242bc500\") " Apr 24 23:37:13.919149 kubelet[2565]: I0424 23:37:13.919128 2565 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"kubernetes.io/configmap/3e8ae2af-1fd8-48e0-bd54-3c4f242bc500-whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3e8ae2af-1fd8-48e0-bd54-3c4f242bc500-whisker-ca-bundle\") pod \"3e8ae2af-1fd8-48e0-bd54-3c4f242bc500\" (UID: \"3e8ae2af-1fd8-48e0-bd54-3c4f242bc500\") " Apr 24 23:37:13.919407 kubelet[2565]: I0424 23:37:13.919174 2565 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"kubernetes.io/secret/3e8ae2af-1fd8-48e0-bd54-3c4f242bc500-whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/3e8ae2af-1fd8-48e0-bd54-3c4f242bc500-whisker-backend-key-pair\") pod \"3e8ae2af-1fd8-48e0-bd54-3c4f242bc500\" (UID: \"3e8ae2af-1fd8-48e0-bd54-3c4f242bc500\") " Apr 24 23:37:13.919407 kubelet[2565]: I0424 23:37:13.919218 2565 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"kubernetes.io/configmap/3e8ae2af-1fd8-48e0-bd54-3c4f242bc500-nginx-config\" (UniqueName: \"kubernetes.io/configmap/3e8ae2af-1fd8-48e0-bd54-3c4f242bc500-nginx-config\") pod \"3e8ae2af-1fd8-48e0-bd54-3c4f242bc500\" (UID: \"3e8ae2af-1fd8-48e0-bd54-3c4f242bc500\") " Apr 24 23:37:13.920827 kubelet[2565]: I0424 23:37:13.919975 2565 operation_generator.go:779] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e8ae2af-1fd8-48e0-bd54-3c4f242bc500-nginx-config" pod "3e8ae2af-1fd8-48e0-bd54-3c4f242bc500" (UID: "3e8ae2af-1fd8-48e0-bd54-3c4f242bc500"). 
InnerVolumeSpecName "nginx-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 23:37:13.920827 kubelet[2565]: I0424 23:37:13.920339 2565 operation_generator.go:779] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e8ae2af-1fd8-48e0-bd54-3c4f242bc500-whisker-ca-bundle" pod "3e8ae2af-1fd8-48e0-bd54-3c4f242bc500" (UID: "3e8ae2af-1fd8-48e0-bd54-3c4f242bc500"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 23:37:13.927129 kubelet[2565]: I0424 23:37:13.927090 2565 operation_generator.go:779] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e8ae2af-1fd8-48e0-bd54-3c4f242bc500-kube-api-access-t4s4q" pod "3e8ae2af-1fd8-48e0-bd54-3c4f242bc500" (UID: "3e8ae2af-1fd8-48e0-bd54-3c4f242bc500"). InnerVolumeSpecName "kube-api-access-t4s4q". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 23:37:13.929816 kubelet[2565]: I0424 23:37:13.929772 2565 operation_generator.go:779] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e8ae2af-1fd8-48e0-bd54-3c4f242bc500-whisker-backend-key-pair" pod "3e8ae2af-1fd8-48e0-bd54-3c4f242bc500" (UID: "3e8ae2af-1fd8-48e0-bd54-3c4f242bc500"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 23:37:13.932483 systemd[1]: var-lib-kubelet-pods-3e8ae2af\x2d1fd8\x2d48e0\x2dbd54\x2d3c4f242bc500-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dt4s4q.mount: Deactivated successfully. Apr 24 23:37:13.938924 systemd[1]: var-lib-kubelet-pods-3e8ae2af\x2d1fd8\x2d48e0\x2dbd54\x2d3c4f242bc500-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. 
Apr 24 23:37:14.020275 kubelet[2565]: I0424 23:37:14.020001 2565 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3e8ae2af-1fd8-48e0-bd54-3c4f242bc500-whisker-ca-bundle\") on node \"ci-4081-3-6-n-6f01bfed3c\" DevicePath \"\"" Apr 24 23:37:14.020275 kubelet[2565]: I0424 23:37:14.020059 2565 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/3e8ae2af-1fd8-48e0-bd54-3c4f242bc500-whisker-backend-key-pair\") on node \"ci-4081-3-6-n-6f01bfed3c\" DevicePath \"\"" Apr 24 23:37:14.020275 kubelet[2565]: I0424 23:37:14.020089 2565 reconciler_common.go:299] "Volume detached for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/3e8ae2af-1fd8-48e0-bd54-3c4f242bc500-nginx-config\") on node \"ci-4081-3-6-n-6f01bfed3c\" DevicePath \"\"" Apr 24 23:37:14.020275 kubelet[2565]: I0424 23:37:14.020112 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-t4s4q\" (UniqueName: \"kubernetes.io/projected/3e8ae2af-1fd8-48e0-bd54-3c4f242bc500-kube-api-access-t4s4q\") on node \"ci-4081-3-6-n-6f01bfed3c\" DevicePath \"\"" Apr 24 23:37:14.350446 kernel: calico-node[4027]: memfd_create() called without MFD_EXEC or MFD_NOEXEC_SEAL set Apr 24 23:37:14.716620 systemd[1]: Removed slice kubepods-besteffort-pod3e8ae2af_1fd8_48e0_bd54_3c4f242bc500.slice - libcontainer container kubepods-besteffort-pod3e8ae2af_1fd8_48e0_bd54_3c4f242bc500.slice. Apr 24 23:37:14.774501 systemd[1]: Created slice kubepods-besteffort-pode3fdf084_2fae_4548_8aba_1d7febec230a.slice - libcontainer container kubepods-besteffort-pode3fdf084_2fae_4548_8aba_1d7febec230a.slice. 
Apr 24 23:37:14.805401 systemd-networkd[1422]: vxlan.calico: Link UP Apr 24 23:37:14.805423 systemd-networkd[1422]: vxlan.calico: Gained carrier Apr 24 23:37:14.825225 kubelet[2565]: I0424 23:37:14.825071 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/e3fdf084-2fae-4548-8aba-1d7febec230a-nginx-config\") pod \"whisker-85fd66b966-lfn59\" (UID: \"e3fdf084-2fae-4548-8aba-1d7febec230a\") " pod="calico-system/whisker-85fd66b966-lfn59" Apr 24 23:37:14.825225 kubelet[2565]: I0424 23:37:14.825096 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-877dn\" (UniqueName: \"kubernetes.io/projected/e3fdf084-2fae-4548-8aba-1d7febec230a-kube-api-access-877dn\") pod \"whisker-85fd66b966-lfn59\" (UID: \"e3fdf084-2fae-4548-8aba-1d7febec230a\") " pod="calico-system/whisker-85fd66b966-lfn59" Apr 24 23:37:14.825225 kubelet[2565]: I0424 23:37:14.825114 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/e3fdf084-2fae-4548-8aba-1d7febec230a-whisker-backend-key-pair\") pod \"whisker-85fd66b966-lfn59\" (UID: \"e3fdf084-2fae-4548-8aba-1d7febec230a\") " pod="calico-system/whisker-85fd66b966-lfn59" Apr 24 23:37:14.825225 kubelet[2565]: I0424 23:37:14.825124 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e3fdf084-2fae-4548-8aba-1d7febec230a-whisker-ca-bundle\") pod \"whisker-85fd66b966-lfn59\" (UID: \"e3fdf084-2fae-4548-8aba-1d7febec230a\") " pod="calico-system/whisker-85fd66b966-lfn59" Apr 24 23:37:15.081165 containerd[1519]: time="2026-04-24T23:37:15.079632052Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:whisker-85fd66b966-lfn59,Uid:e3fdf084-2fae-4548-8aba-1d7febec230a,Namespace:calico-system,Attempt:0,}" Apr 24 23:37:15.192060 systemd-networkd[1422]: cali89cc691ee66: Link UP Apr 24 23:37:15.192278 systemd-networkd[1422]: cali89cc691ee66: Gained carrier Apr 24 23:37:15.211444 containerd[1519]: 2026-04-24 23:37:15.133 [INFO][4173] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081--3--6--n--6f01bfed3c-k8s-whisker--85fd66b966--lfn59-eth0 whisker-85fd66b966- calico-system e3fdf084-2fae-4548-8aba-1d7febec230a 890 0 2026-04-24 23:37:14 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:85fd66b966 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4081-3-6-n-6f01bfed3c whisker-85fd66b966-lfn59 eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali89cc691ee66 [] [] }} ContainerID="a576f9eaf07e286548141edabcae696ca2e39c8470a0f3db099de55c8a0567a3" Namespace="calico-system" Pod="whisker-85fd66b966-lfn59" WorkloadEndpoint="ci--4081--3--6--n--6f01bfed3c-k8s-whisker--85fd66b966--lfn59-" Apr 24 23:37:15.211444 containerd[1519]: 2026-04-24 23:37:15.133 [INFO][4173] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="a576f9eaf07e286548141edabcae696ca2e39c8470a0f3db099de55c8a0567a3" Namespace="calico-system" Pod="whisker-85fd66b966-lfn59" WorkloadEndpoint="ci--4081--3--6--n--6f01bfed3c-k8s-whisker--85fd66b966--lfn59-eth0" Apr 24 23:37:15.211444 containerd[1519]: 2026-04-24 23:37:15.154 [INFO][4188] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="a576f9eaf07e286548141edabcae696ca2e39c8470a0f3db099de55c8a0567a3" HandleID="k8s-pod-network.a576f9eaf07e286548141edabcae696ca2e39c8470a0f3db099de55c8a0567a3" Workload="ci--4081--3--6--n--6f01bfed3c-k8s-whisker--85fd66b966--lfn59-eth0" Apr 24 23:37:15.211444 
containerd[1519]: 2026-04-24 23:37:15.159 [INFO][4188] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="a576f9eaf07e286548141edabcae696ca2e39c8470a0f3db099de55c8a0567a3" HandleID="k8s-pod-network.a576f9eaf07e286548141edabcae696ca2e39c8470a0f3db099de55c8a0567a3" Workload="ci--4081--3--6--n--6f01bfed3c-k8s-whisker--85fd66b966--lfn59-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000277350), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4081-3-6-n-6f01bfed3c", "pod":"whisker-85fd66b966-lfn59", "timestamp":"2026-04-24 23:37:15.15469826 +0000 UTC"}, Hostname:"ci-4081-3-6-n-6f01bfed3c", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc00032b1e0)} Apr 24 23:37:15.211444 containerd[1519]: 2026-04-24 23:37:15.159 [INFO][4188] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 24 23:37:15.211444 containerd[1519]: 2026-04-24 23:37:15.159 [INFO][4188] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Apr 24 23:37:15.211444 containerd[1519]: 2026-04-24 23:37:15.159 [INFO][4188] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081-3-6-n-6f01bfed3c' Apr 24 23:37:15.211444 containerd[1519]: 2026-04-24 23:37:15.161 [INFO][4188] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.a576f9eaf07e286548141edabcae696ca2e39c8470a0f3db099de55c8a0567a3" host="ci-4081-3-6-n-6f01bfed3c" Apr 24 23:37:15.211444 containerd[1519]: 2026-04-24 23:37:15.165 [INFO][4188] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4081-3-6-n-6f01bfed3c" Apr 24 23:37:15.211444 containerd[1519]: 2026-04-24 23:37:15.168 [INFO][4188] ipam/ipam.go 526: Trying affinity for 192.168.35.64/26 host="ci-4081-3-6-n-6f01bfed3c" Apr 24 23:37:15.211444 containerd[1519]: 2026-04-24 23:37:15.169 [INFO][4188] ipam/ipam.go 160: Attempting to load block cidr=192.168.35.64/26 host="ci-4081-3-6-n-6f01bfed3c" Apr 24 23:37:15.211444 containerd[1519]: 2026-04-24 23:37:15.171 [INFO][4188] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.35.64/26 host="ci-4081-3-6-n-6f01bfed3c" Apr 24 23:37:15.211444 containerd[1519]: 2026-04-24 23:37:15.171 [INFO][4188] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.35.64/26 handle="k8s-pod-network.a576f9eaf07e286548141edabcae696ca2e39c8470a0f3db099de55c8a0567a3" host="ci-4081-3-6-n-6f01bfed3c" Apr 24 23:37:15.211444 containerd[1519]: 2026-04-24 23:37:15.172 [INFO][4188] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.a576f9eaf07e286548141edabcae696ca2e39c8470a0f3db099de55c8a0567a3 Apr 24 23:37:15.211444 containerd[1519]: 2026-04-24 23:37:15.175 [INFO][4188] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.35.64/26 handle="k8s-pod-network.a576f9eaf07e286548141edabcae696ca2e39c8470a0f3db099de55c8a0567a3" host="ci-4081-3-6-n-6f01bfed3c" Apr 24 23:37:15.211444 containerd[1519]: 2026-04-24 23:37:15.180 [INFO][4188] ipam/ipam.go 1288: 
Successfully claimed IPs: [192.168.35.65/26] block=192.168.35.64/26 handle="k8s-pod-network.a576f9eaf07e286548141edabcae696ca2e39c8470a0f3db099de55c8a0567a3" host="ci-4081-3-6-n-6f01bfed3c" Apr 24 23:37:15.211444 containerd[1519]: 2026-04-24 23:37:15.180 [INFO][4188] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.35.65/26] handle="k8s-pod-network.a576f9eaf07e286548141edabcae696ca2e39c8470a0f3db099de55c8a0567a3" host="ci-4081-3-6-n-6f01bfed3c" Apr 24 23:37:15.211444 containerd[1519]: 2026-04-24 23:37:15.180 [INFO][4188] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 24 23:37:15.211444 containerd[1519]: 2026-04-24 23:37:15.180 [INFO][4188] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.35.65/26] IPv6=[] ContainerID="a576f9eaf07e286548141edabcae696ca2e39c8470a0f3db099de55c8a0567a3" HandleID="k8s-pod-network.a576f9eaf07e286548141edabcae696ca2e39c8470a0f3db099de55c8a0567a3" Workload="ci--4081--3--6--n--6f01bfed3c-k8s-whisker--85fd66b966--lfn59-eth0" Apr 24 23:37:15.213018 containerd[1519]: 2026-04-24 23:37:15.182 [INFO][4173] cni-plugin/k8s.go 418: Populated endpoint ContainerID="a576f9eaf07e286548141edabcae696ca2e39c8470a0f3db099de55c8a0567a3" Namespace="calico-system" Pod="whisker-85fd66b966-lfn59" WorkloadEndpoint="ci--4081--3--6--n--6f01bfed3c-k8s-whisker--85fd66b966--lfn59-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--6f01bfed3c-k8s-whisker--85fd66b966--lfn59-eth0", GenerateName:"whisker-85fd66b966-", Namespace:"calico-system", SelfLink:"", UID:"e3fdf084-2fae-4548-8aba-1d7febec230a", ResourceVersion:"890", Generation:0, CreationTimestamp:time.Date(2026, time.April, 24, 23, 37, 14, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"85fd66b966", 
"projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-6f01bfed3c", ContainerID:"", Pod:"whisker-85fd66b966-lfn59", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.35.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali89cc691ee66", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 24 23:37:15.213018 containerd[1519]: 2026-04-24 23:37:15.182 [INFO][4173] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.35.65/32] ContainerID="a576f9eaf07e286548141edabcae696ca2e39c8470a0f3db099de55c8a0567a3" Namespace="calico-system" Pod="whisker-85fd66b966-lfn59" WorkloadEndpoint="ci--4081--3--6--n--6f01bfed3c-k8s-whisker--85fd66b966--lfn59-eth0" Apr 24 23:37:15.213018 containerd[1519]: 2026-04-24 23:37:15.182 [INFO][4173] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali89cc691ee66 ContainerID="a576f9eaf07e286548141edabcae696ca2e39c8470a0f3db099de55c8a0567a3" Namespace="calico-system" Pod="whisker-85fd66b966-lfn59" WorkloadEndpoint="ci--4081--3--6--n--6f01bfed3c-k8s-whisker--85fd66b966--lfn59-eth0" Apr 24 23:37:15.213018 containerd[1519]: 2026-04-24 23:37:15.190 [INFO][4173] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="a576f9eaf07e286548141edabcae696ca2e39c8470a0f3db099de55c8a0567a3" Namespace="calico-system" Pod="whisker-85fd66b966-lfn59" WorkloadEndpoint="ci--4081--3--6--n--6f01bfed3c-k8s-whisker--85fd66b966--lfn59-eth0" Apr 24 23:37:15.213018 containerd[1519]: 2026-04-24 23:37:15.190 [INFO][4173] cni-plugin/k8s.go 
446: Added Mac, interface name, and active container ID to endpoint ContainerID="a576f9eaf07e286548141edabcae696ca2e39c8470a0f3db099de55c8a0567a3" Namespace="calico-system" Pod="whisker-85fd66b966-lfn59" WorkloadEndpoint="ci--4081--3--6--n--6f01bfed3c-k8s-whisker--85fd66b966--lfn59-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--6f01bfed3c-k8s-whisker--85fd66b966--lfn59-eth0", GenerateName:"whisker-85fd66b966-", Namespace:"calico-system", SelfLink:"", UID:"e3fdf084-2fae-4548-8aba-1d7febec230a", ResourceVersion:"890", Generation:0, CreationTimestamp:time.Date(2026, time.April, 24, 23, 37, 14, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"85fd66b966", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-6f01bfed3c", ContainerID:"a576f9eaf07e286548141edabcae696ca2e39c8470a0f3db099de55c8a0567a3", Pod:"whisker-85fd66b966-lfn59", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.35.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali89cc691ee66", MAC:"96:8f:3e:f0:92:72", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 24 23:37:15.213018 containerd[1519]: 2026-04-24 23:37:15.207 [INFO][4173] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="a576f9eaf07e286548141edabcae696ca2e39c8470a0f3db099de55c8a0567a3" Namespace="calico-system" Pod="whisker-85fd66b966-lfn59" WorkloadEndpoint="ci--4081--3--6--n--6f01bfed3c-k8s-whisker--85fd66b966--lfn59-eth0" Apr 24 23:37:15.242337 containerd[1519]: time="2026-04-24T23:37:15.241985894Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Apr 24 23:37:15.242337 containerd[1519]: time="2026-04-24T23:37:15.242046323Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Apr 24 23:37:15.242337 containerd[1519]: time="2026-04-24T23:37:15.242056429Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 24 23:37:15.242337 containerd[1519]: time="2026-04-24T23:37:15.242137401Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 24 23:37:15.272586 systemd[1]: Started cri-containerd-a576f9eaf07e286548141edabcae696ca2e39c8470a0f3db099de55c8a0567a3.scope - libcontainer container a576f9eaf07e286548141edabcae696ca2e39c8470a0f3db099de55c8a0567a3. 
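The IPAM trace above shows Calico confirming the node's affinity for the block 192.168.35.64/26 and then claiming 192.168.35.65 from it. A small stdlib-only sketch of the containment relationship this block-affinity scheme relies on (the block and address values are taken from the log; the helper name is an assumption for illustration):

```python
import ipaddress

def in_affine_block(addr: str, block: str) -> bool:
    """True if addr falls inside the host's affine IPAM block."""
    return ipaddress.ip_address(addr) in ipaddress.ip_network(block)

# Values from the ipam trace above: the node's /26 block and the claimed IP.
print(in_affine_block("192.168.35.65", "192.168.35.64/26"))  # expect True
```

A /26 block gives each node 64 addresses (here 192.168.35.64 through 192.168.35.127), which is why the plugin first tries the node's existing affinity before allocating a new block.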
Apr 24 23:37:15.329885 containerd[1519]: time="2026-04-24T23:37:15.329840476Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-85fd66b966-lfn59,Uid:e3fdf084-2fae-4548-8aba-1d7febec230a,Namespace:calico-system,Attempt:0,} returns sandbox id \"a576f9eaf07e286548141edabcae696ca2e39c8470a0f3db099de55c8a0567a3\"" Apr 24 23:37:15.332715 containerd[1519]: time="2026-04-24T23:37:15.332522622Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.31.4\"" Apr 24 23:37:15.481596 kubelet[2565]: I0424 23:37:15.481501 2565 kubelet_volumes.go:161] "Cleaned up orphaned pod volumes dir" podUID="3e8ae2af-1fd8-48e0-bd54-3c4f242bc500" path="/var/lib/kubelet/pods/3e8ae2af-1fd8-48e0-bd54-3c4f242bc500/volumes" Apr 24 23:37:16.393713 systemd-networkd[1422]: vxlan.calico: Gained IPv6LL Apr 24 23:37:16.777933 systemd-networkd[1422]: cali89cc691ee66: Gained IPv6LL Apr 24 23:37:16.987935 containerd[1519]: time="2026-04-24T23:37:16.987888386Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 24 23:37:16.989028 containerd[1519]: time="2026-04-24T23:37:16.988932316Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.31.4: active requests=0, bytes read=6039889" Apr 24 23:37:16.990090 containerd[1519]: time="2026-04-24T23:37:16.990024861Z" level=info msg="ImageCreate event name:\"sha256:c02b0051502f3aa7f0815d838ea93b53dfb6bd13f185d229260e08200daf7cf7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 24 23:37:16.991822 containerd[1519]: time="2026-04-24T23:37:16.991794385Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:9690cd395efad501f2e0c40ce4969d87b736ae2e5ed454644e7b0fd8f756bfbc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 24 23:37:16.992349 containerd[1519]: time="2026-04-24T23:37:16.992214664Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.31.4\" with image id 
\"sha256:c02b0051502f3aa7f0815d838ea93b53dfb6bd13f185d229260e08200daf7cf7\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:9690cd395efad501f2e0c40ce4969d87b736ae2e5ed454644e7b0fd8f756bfbc\", size \"7595926\" in 1.659668921s" Apr 24 23:37:16.992349 containerd[1519]: time="2026-04-24T23:37:16.992236723Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.31.4\" returns image reference \"sha256:c02b0051502f3aa7f0815d838ea93b53dfb6bd13f185d229260e08200daf7cf7\"" Apr 24 23:37:16.995913 containerd[1519]: time="2026-04-24T23:37:16.995889361Z" level=info msg="CreateContainer within sandbox \"a576f9eaf07e286548141edabcae696ca2e39c8470a0f3db099de55c8a0567a3\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Apr 24 23:37:17.009178 containerd[1519]: time="2026-04-24T23:37:17.009159240Z" level=info msg="CreateContainer within sandbox \"a576f9eaf07e286548141edabcae696ca2e39c8470a0f3db099de55c8a0567a3\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"ac39f2dec9313c191d2d64f3dd4ca8886261629ad230c5771e0aae05edbb9b5d\"" Apr 24 23:37:17.010205 containerd[1519]: time="2026-04-24T23:37:17.009552880Z" level=info msg="StartContainer for \"ac39f2dec9313c191d2d64f3dd4ca8886261629ad230c5771e0aae05edbb9b5d\"" Apr 24 23:37:17.041532 systemd[1]: Started cri-containerd-ac39f2dec9313c191d2d64f3dd4ca8886261629ad230c5771e0aae05edbb9b5d.scope - libcontainer container ac39f2dec9313c191d2d64f3dd4ca8886261629ad230c5771e0aae05edbb9b5d. 
Apr 24 23:37:17.071674 containerd[1519]: time="2026-04-24T23:37:17.070949111Z" level=info msg="StartContainer for \"ac39f2dec9313c191d2d64f3dd4ca8886261629ad230c5771e0aae05edbb9b5d\" returns successfully" Apr 24 23:37:17.072307 containerd[1519]: time="2026-04-24T23:37:17.072158187Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\"" Apr 24 23:37:18.913070 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2945654385.mount: Deactivated successfully. Apr 24 23:37:18.927067 containerd[1519]: time="2026-04-24T23:37:18.927029146Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 24 23:37:18.931539 containerd[1519]: time="2026-04-24T23:37:18.931500778Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.31.4: active requests=0, bytes read=17609475" Apr 24 23:37:18.931904 containerd[1519]: time="2026-04-24T23:37:18.931882142Z" level=info msg="ImageCreate event name:\"sha256:0749e3da0398e8402eb119f09acf145e5dd9759adb6eb3802ad6dc1b9bbedf1c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 24 23:37:18.933955 containerd[1519]: time="2026-04-24T23:37:18.933918575Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:d252061aa298c4b17cf092517b5126af97cf95e0f56b21281b95a5f8702f15fc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 24 23:37:18.934542 containerd[1519]: time="2026-04-24T23:37:18.934325240Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\" with image id \"sha256:0749e3da0398e8402eb119f09acf145e5dd9759adb6eb3802ad6dc1b9bbedf1c\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:d252061aa298c4b17cf092517b5126af97cf95e0f56b21281b95a5f8702f15fc\", size \"17609305\" in 1.862142699s" Apr 24 23:37:18.934542 containerd[1519]: 
time="2026-04-24T23:37:18.934347444Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\" returns image reference \"sha256:0749e3da0398e8402eb119f09acf145e5dd9759adb6eb3802ad6dc1b9bbedf1c\"" Apr 24 23:37:18.937921 containerd[1519]: time="2026-04-24T23:37:18.937901303Z" level=info msg="CreateContainer within sandbox \"a576f9eaf07e286548141edabcae696ca2e39c8470a0f3db099de55c8a0567a3\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Apr 24 23:37:18.955181 containerd[1519]: time="2026-04-24T23:37:18.955146168Z" level=info msg="CreateContainer within sandbox \"a576f9eaf07e286548141edabcae696ca2e39c8470a0f3db099de55c8a0567a3\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"8fcf729bf3adcc21fba63dd3707712f85c25dfddbf1b8ddf9840491581bf7b53\"" Apr 24 23:37:18.955588 containerd[1519]: time="2026-04-24T23:37:18.955567869Z" level=info msg="StartContainer for \"8fcf729bf3adcc21fba63dd3707712f85c25dfddbf1b8ddf9840491581bf7b53\"" Apr 24 23:37:18.980564 systemd[1]: Started cri-containerd-8fcf729bf3adcc21fba63dd3707712f85c25dfddbf1b8ddf9840491581bf7b53.scope - libcontainer container 8fcf729bf3adcc21fba63dd3707712f85c25dfddbf1b8ddf9840491581bf7b53. 
Apr 24 23:37:19.012150 containerd[1519]: time="2026-04-24T23:37:19.012111660Z" level=info msg="StartContainer for \"8fcf729bf3adcc21fba63dd3707712f85c25dfddbf1b8ddf9840491581bf7b53\" returns successfully" Apr 24 23:37:25.480139 containerd[1519]: time="2026-04-24T23:37:25.479063257Z" level=info msg="StopPodSandbox for \"b5a84b94e2911505d7c2e27ffa2fcc1aad33a354d0eb8f5e6d248f50bed41bc5\"" Apr 24 23:37:25.482334 containerd[1519]: time="2026-04-24T23:37:25.481334863Z" level=info msg="StopPodSandbox for \"ca9a9a8d8075ef0186a0d1ee248aa46080085846c9740d194dc8b6d6023cfeeb\"" Apr 24 23:37:25.483225 containerd[1519]: time="2026-04-24T23:37:25.483152084Z" level=info msg="StopPodSandbox for \"5f8e54c4b6682d32363b4f32066817a94de3ced48e2849c1757984fba84e1469\"" Apr 24 23:37:25.566784 kubelet[2565]: I0424 23:37:25.565834 2565 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="calico-system/whisker-85fd66b966-lfn59" podStartSLOduration=7.9618950040000005 podStartE2EDuration="11.565821425s" podCreationTimestamp="2026-04-24 23:37:14 +0000 UTC" firstStartedPulling="2026-04-24 23:37:15.331245821 +0000 UTC m=+33.943304625" lastFinishedPulling="2026-04-24 23:37:18.935172242 +0000 UTC m=+37.547231046" observedRunningTime="2026-04-24 23:37:19.728743689 +0000 UTC m=+38.340802533" watchObservedRunningTime="2026-04-24 23:37:25.565821425 +0000 UTC m=+44.177880219" Apr 24 23:37:25.615591 containerd[1519]: 2026-04-24 23:37:25.574 [INFO][4443] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="b5a84b94e2911505d7c2e27ffa2fcc1aad33a354d0eb8f5e6d248f50bed41bc5" Apr 24 23:37:25.615591 containerd[1519]: 2026-04-24 23:37:25.574 [INFO][4443] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. 
ContainerID="b5a84b94e2911505d7c2e27ffa2fcc1aad33a354d0eb8f5e6d248f50bed41bc5" iface="eth0" netns="/var/run/netns/cni-9248ebf0-44fa-f161-2fb9-3ce9517c5765" Apr 24 23:37:25.615591 containerd[1519]: 2026-04-24 23:37:25.575 [INFO][4443] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="b5a84b94e2911505d7c2e27ffa2fcc1aad33a354d0eb8f5e6d248f50bed41bc5" iface="eth0" netns="/var/run/netns/cni-9248ebf0-44fa-f161-2fb9-3ce9517c5765" Apr 24 23:37:25.615591 containerd[1519]: 2026-04-24 23:37:25.575 [INFO][4443] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="b5a84b94e2911505d7c2e27ffa2fcc1aad33a354d0eb8f5e6d248f50bed41bc5" iface="eth0" netns="/var/run/netns/cni-9248ebf0-44fa-f161-2fb9-3ce9517c5765" Apr 24 23:37:25.615591 containerd[1519]: 2026-04-24 23:37:25.575 [INFO][4443] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="b5a84b94e2911505d7c2e27ffa2fcc1aad33a354d0eb8f5e6d248f50bed41bc5" Apr 24 23:37:25.615591 containerd[1519]: 2026-04-24 23:37:25.575 [INFO][4443] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="b5a84b94e2911505d7c2e27ffa2fcc1aad33a354d0eb8f5e6d248f50bed41bc5" Apr 24 23:37:25.615591 containerd[1519]: 2026-04-24 23:37:25.601 [INFO][4470] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="b5a84b94e2911505d7c2e27ffa2fcc1aad33a354d0eb8f5e6d248f50bed41bc5" HandleID="k8s-pod-network.b5a84b94e2911505d7c2e27ffa2fcc1aad33a354d0eb8f5e6d248f50bed41bc5" Workload="ci--4081--3--6--n--6f01bfed3c-k8s-csi--node--driver--kfkvn-eth0" Apr 24 23:37:25.615591 containerd[1519]: 2026-04-24 23:37:25.601 [INFO][4470] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 24 23:37:25.615591 containerd[1519]: 2026-04-24 23:37:25.601 [INFO][4470] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Apr 24 23:37:25.615591 containerd[1519]: 2026-04-24 23:37:25.608 [WARNING][4470] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. Ignoring ContainerID="b5a84b94e2911505d7c2e27ffa2fcc1aad33a354d0eb8f5e6d248f50bed41bc5" HandleID="k8s-pod-network.b5a84b94e2911505d7c2e27ffa2fcc1aad33a354d0eb8f5e6d248f50bed41bc5" Workload="ci--4081--3--6--n--6f01bfed3c-k8s-csi--node--driver--kfkvn-eth0" Apr 24 23:37:25.615591 containerd[1519]: 2026-04-24 23:37:25.608 [INFO][4470] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="b5a84b94e2911505d7c2e27ffa2fcc1aad33a354d0eb8f5e6d248f50bed41bc5" HandleID="k8s-pod-network.b5a84b94e2911505d7c2e27ffa2fcc1aad33a354d0eb8f5e6d248f50bed41bc5" Workload="ci--4081--3--6--n--6f01bfed3c-k8s-csi--node--driver--kfkvn-eth0" Apr 24 23:37:25.615591 containerd[1519]: 2026-04-24 23:37:25.610 [INFO][4470] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 24 23:37:25.615591 containerd[1519]: 2026-04-24 23:37:25.613 [INFO][4443] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="b5a84b94e2911505d7c2e27ffa2fcc1aad33a354d0eb8f5e6d248f50bed41bc5" Apr 24 23:37:25.617603 containerd[1519]: time="2026-04-24T23:37:25.617534267Z" level=info msg="TearDown network for sandbox \"b5a84b94e2911505d7c2e27ffa2fcc1aad33a354d0eb8f5e6d248f50bed41bc5\" successfully" Apr 24 23:37:25.617640 containerd[1519]: time="2026-04-24T23:37:25.617599560Z" level=info msg="StopPodSandbox for \"b5a84b94e2911505d7c2e27ffa2fcc1aad33a354d0eb8f5e6d248f50bed41bc5\" returns successfully" Apr 24 23:37:25.621664 systemd[1]: run-netns-cni\x2d9248ebf0\x2d44fa\x2df161\x2d2fb9\x2d3ce9517c5765.mount: Deactivated successfully. 
Apr 24 23:37:25.624431 containerd[1519]: time="2026-04-24T23:37:25.624374508Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-kfkvn,Uid:b8636bb0-afe6-4f7c-8c11-499ada35dc8f,Namespace:calico-system,Attempt:1,}" Apr 24 23:37:25.626996 containerd[1519]: 2026-04-24 23:37:25.564 [INFO][4441] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="5f8e54c4b6682d32363b4f32066817a94de3ced48e2849c1757984fba84e1469" Apr 24 23:37:25.626996 containerd[1519]: 2026-04-24 23:37:25.565 [INFO][4441] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="5f8e54c4b6682d32363b4f32066817a94de3ced48e2849c1757984fba84e1469" iface="eth0" netns="/var/run/netns/cni-96997954-d7f1-7ad4-987c-1c22872d1eb5" Apr 24 23:37:25.626996 containerd[1519]: 2026-04-24 23:37:25.567 [INFO][4441] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="5f8e54c4b6682d32363b4f32066817a94de3ced48e2849c1757984fba84e1469" iface="eth0" netns="/var/run/netns/cni-96997954-d7f1-7ad4-987c-1c22872d1eb5" Apr 24 23:37:25.626996 containerd[1519]: 2026-04-24 23:37:25.568 [INFO][4441] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="5f8e54c4b6682d32363b4f32066817a94de3ced48e2849c1757984fba84e1469" iface="eth0" netns="/var/run/netns/cni-96997954-d7f1-7ad4-987c-1c22872d1eb5" Apr 24 23:37:25.626996 containerd[1519]: 2026-04-24 23:37:25.568 [INFO][4441] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="5f8e54c4b6682d32363b4f32066817a94de3ced48e2849c1757984fba84e1469" Apr 24 23:37:25.626996 containerd[1519]: 2026-04-24 23:37:25.568 [INFO][4441] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="5f8e54c4b6682d32363b4f32066817a94de3ced48e2849c1757984fba84e1469" Apr 24 23:37:25.626996 containerd[1519]: 2026-04-24 23:37:25.607 [INFO][4462] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="5f8e54c4b6682d32363b4f32066817a94de3ced48e2849c1757984fba84e1469" HandleID="k8s-pod-network.5f8e54c4b6682d32363b4f32066817a94de3ced48e2849c1757984fba84e1469" Workload="ci--4081--3--6--n--6f01bfed3c-k8s-coredns--7d764666f9--dr55n-eth0" Apr 24 23:37:25.626996 containerd[1519]: 2026-04-24 23:37:25.607 [INFO][4462] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 24 23:37:25.626996 containerd[1519]: 2026-04-24 23:37:25.610 [INFO][4462] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 24 23:37:25.626996 containerd[1519]: 2026-04-24 23:37:25.616 [WARNING][4462] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="5f8e54c4b6682d32363b4f32066817a94de3ced48e2849c1757984fba84e1469" HandleID="k8s-pod-network.5f8e54c4b6682d32363b4f32066817a94de3ced48e2849c1757984fba84e1469" Workload="ci--4081--3--6--n--6f01bfed3c-k8s-coredns--7d764666f9--dr55n-eth0" Apr 24 23:37:25.626996 containerd[1519]: 2026-04-24 23:37:25.616 [INFO][4462] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="5f8e54c4b6682d32363b4f32066817a94de3ced48e2849c1757984fba84e1469" HandleID="k8s-pod-network.5f8e54c4b6682d32363b4f32066817a94de3ced48e2849c1757984fba84e1469" Workload="ci--4081--3--6--n--6f01bfed3c-k8s-coredns--7d764666f9--dr55n-eth0" Apr 24 23:37:25.626996 containerd[1519]: 2026-04-24 23:37:25.619 [INFO][4462] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 24 23:37:25.626996 containerd[1519]: 2026-04-24 23:37:25.624 [INFO][4441] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="5f8e54c4b6682d32363b4f32066817a94de3ced48e2849c1757984fba84e1469" Apr 24 23:37:25.627835 containerd[1519]: time="2026-04-24T23:37:25.627453602Z" level=info msg="TearDown network for sandbox \"5f8e54c4b6682d32363b4f32066817a94de3ced48e2849c1757984fba84e1469\" successfully" Apr 24 23:37:25.627835 containerd[1519]: time="2026-04-24T23:37:25.627471499Z" level=info msg="StopPodSandbox for \"5f8e54c4b6682d32363b4f32066817a94de3ced48e2849c1757984fba84e1469\" returns successfully" Apr 24 23:37:25.631592 containerd[1519]: time="2026-04-24T23:37:25.631066196Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7d764666f9-dr55n,Uid:9e14ced5-8d3f-4116-8156-faf06229fa0d,Namespace:kube-system,Attempt:1,}" Apr 24 23:37:25.632943 systemd[1]: run-netns-cni\x2d96997954\x2dd7f1\x2d7ad4\x2d987c\x2d1c22872d1eb5.mount: Deactivated successfully. 
Apr 24 23:37:25.638505 containerd[1519]: 2026-04-24 23:37:25.566 [INFO][4442] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="ca9a9a8d8075ef0186a0d1ee248aa46080085846c9740d194dc8b6d6023cfeeb" Apr 24 23:37:25.638505 containerd[1519]: 2026-04-24 23:37:25.566 [INFO][4442] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="ca9a9a8d8075ef0186a0d1ee248aa46080085846c9740d194dc8b6d6023cfeeb" iface="eth0" netns="/var/run/netns/cni-d97ae6db-78ae-9ef0-0122-1bfb0c6f3bb1" Apr 24 23:37:25.638505 containerd[1519]: 2026-04-24 23:37:25.567 [INFO][4442] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="ca9a9a8d8075ef0186a0d1ee248aa46080085846c9740d194dc8b6d6023cfeeb" iface="eth0" netns="/var/run/netns/cni-d97ae6db-78ae-9ef0-0122-1bfb0c6f3bb1" Apr 24 23:37:25.638505 containerd[1519]: 2026-04-24 23:37:25.567 [INFO][4442] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="ca9a9a8d8075ef0186a0d1ee248aa46080085846c9740d194dc8b6d6023cfeeb" iface="eth0" netns="/var/run/netns/cni-d97ae6db-78ae-9ef0-0122-1bfb0c6f3bb1" Apr 24 23:37:25.638505 containerd[1519]: 2026-04-24 23:37:25.567 [INFO][4442] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="ca9a9a8d8075ef0186a0d1ee248aa46080085846c9740d194dc8b6d6023cfeeb" Apr 24 23:37:25.638505 containerd[1519]: 2026-04-24 23:37:25.567 [INFO][4442] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="ca9a9a8d8075ef0186a0d1ee248aa46080085846c9740d194dc8b6d6023cfeeb" Apr 24 23:37:25.638505 containerd[1519]: 2026-04-24 23:37:25.609 [INFO][4460] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="ca9a9a8d8075ef0186a0d1ee248aa46080085846c9740d194dc8b6d6023cfeeb" HandleID="k8s-pod-network.ca9a9a8d8075ef0186a0d1ee248aa46080085846c9740d194dc8b6d6023cfeeb" Workload="ci--4081--3--6--n--6f01bfed3c-k8s-goldmane--9f7667bb8--j6dz5-eth0" Apr 24 23:37:25.638505 containerd[1519]: 2026-04-24 23:37:25.609 
[INFO][4460] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 24 23:37:25.638505 containerd[1519]: 2026-04-24 23:37:25.619 [INFO][4460] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 24 23:37:25.638505 containerd[1519]: 2026-04-24 23:37:25.626 [WARNING][4460] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. Ignoring ContainerID="ca9a9a8d8075ef0186a0d1ee248aa46080085846c9740d194dc8b6d6023cfeeb" HandleID="k8s-pod-network.ca9a9a8d8075ef0186a0d1ee248aa46080085846c9740d194dc8b6d6023cfeeb" Workload="ci--4081--3--6--n--6f01bfed3c-k8s-goldmane--9f7667bb8--j6dz5-eth0" Apr 24 23:37:25.638505 containerd[1519]: 2026-04-24 23:37:25.626 [INFO][4460] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="ca9a9a8d8075ef0186a0d1ee248aa46080085846c9740d194dc8b6d6023cfeeb" HandleID="k8s-pod-network.ca9a9a8d8075ef0186a0d1ee248aa46080085846c9740d194dc8b6d6023cfeeb" Workload="ci--4081--3--6--n--6f01bfed3c-k8s-goldmane--9f7667bb8--j6dz5-eth0" Apr 24 23:37:25.638505 containerd[1519]: 2026-04-24 23:37:25.632 [INFO][4460] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 24 23:37:25.638505 containerd[1519]: 2026-04-24 23:37:25.635 [INFO][4442] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="ca9a9a8d8075ef0186a0d1ee248aa46080085846c9740d194dc8b6d6023cfeeb" Apr 24 23:37:25.640329 containerd[1519]: time="2026-04-24T23:37:25.640306874Z" level=info msg="TearDown network for sandbox \"ca9a9a8d8075ef0186a0d1ee248aa46080085846c9740d194dc8b6d6023cfeeb\" successfully" Apr 24 23:37:25.640329 containerd[1519]: time="2026-04-24T23:37:25.640328004Z" level=info msg="StopPodSandbox for \"ca9a9a8d8075ef0186a0d1ee248aa46080085846c9740d194dc8b6d6023cfeeb\" returns successfully" Apr 24 23:37:25.641054 systemd[1]: run-netns-cni\x2dd97ae6db\x2d78ae\x2d9ef0\x2d0122\x2d1bfb0c6f3bb1.mount: Deactivated successfully. 
Apr 24 23:37:25.642434 containerd[1519]: time="2026-04-24T23:37:25.642363315Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-9f7667bb8-j6dz5,Uid:133141ef-2c84-4659-9f62-e85180cc6f03,Namespace:calico-system,Attempt:1,}" Apr 24 23:37:25.769538 systemd-networkd[1422]: cali4a65f30faf4: Link UP Apr 24 23:37:25.770049 systemd-networkd[1422]: cali4a65f30faf4: Gained carrier Apr 24 23:37:25.784287 containerd[1519]: 2026-04-24 23:37:25.685 [INFO][4482] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081--3--6--n--6f01bfed3c-k8s-csi--node--driver--kfkvn-eth0 csi-node-driver- calico-system b8636bb0-afe6-4f7c-8c11-499ada35dc8f 937 0 2026-04-24 23:36:58 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:589b8b8d94 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4081-3-6-n-6f01bfed3c csi-node-driver-kfkvn eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali4a65f30faf4 [] [] }} ContainerID="4240fcaef762194107e0245631c641a0ff453eba7640d45f73bbcc2947358682" Namespace="calico-system" Pod="csi-node-driver-kfkvn" WorkloadEndpoint="ci--4081--3--6--n--6f01bfed3c-k8s-csi--node--driver--kfkvn-" Apr 24 23:37:25.784287 containerd[1519]: 2026-04-24 23:37:25.685 [INFO][4482] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="4240fcaef762194107e0245631c641a0ff453eba7640d45f73bbcc2947358682" Namespace="calico-system" Pod="csi-node-driver-kfkvn" WorkloadEndpoint="ci--4081--3--6--n--6f01bfed3c-k8s-csi--node--driver--kfkvn-eth0" Apr 24 23:37:25.784287 containerd[1519]: 2026-04-24 23:37:25.731 [INFO][4514] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="4240fcaef762194107e0245631c641a0ff453eba7640d45f73bbcc2947358682" 
HandleID="k8s-pod-network.4240fcaef762194107e0245631c641a0ff453eba7640d45f73bbcc2947358682" Workload="ci--4081--3--6--n--6f01bfed3c-k8s-csi--node--driver--kfkvn-eth0" Apr 24 23:37:25.784287 containerd[1519]: 2026-04-24 23:37:25.736 [INFO][4514] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="4240fcaef762194107e0245631c641a0ff453eba7640d45f73bbcc2947358682" HandleID="k8s-pod-network.4240fcaef762194107e0245631c641a0ff453eba7640d45f73bbcc2947358682" Workload="ci--4081--3--6--n--6f01bfed3c-k8s-csi--node--driver--kfkvn-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000380090), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4081-3-6-n-6f01bfed3c", "pod":"csi-node-driver-kfkvn", "timestamp":"2026-04-24 23:37:25.731254896 +0000 UTC"}, Hostname:"ci-4081-3-6-n-6f01bfed3c", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc000377ce0)} Apr 24 23:37:25.784287 containerd[1519]: 2026-04-24 23:37:25.736 [INFO][4514] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 24 23:37:25.784287 containerd[1519]: 2026-04-24 23:37:25.736 [INFO][4514] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Apr 24 23:37:25.784287 containerd[1519]: 2026-04-24 23:37:25.737 [INFO][4514] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081-3-6-n-6f01bfed3c' Apr 24 23:37:25.784287 containerd[1519]: 2026-04-24 23:37:25.739 [INFO][4514] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.4240fcaef762194107e0245631c641a0ff453eba7640d45f73bbcc2947358682" host="ci-4081-3-6-n-6f01bfed3c" Apr 24 23:37:25.784287 containerd[1519]: 2026-04-24 23:37:25.742 [INFO][4514] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4081-3-6-n-6f01bfed3c" Apr 24 23:37:25.784287 containerd[1519]: 2026-04-24 23:37:25.746 [INFO][4514] ipam/ipam.go 526: Trying affinity for 192.168.35.64/26 host="ci-4081-3-6-n-6f01bfed3c" Apr 24 23:37:25.784287 containerd[1519]: 2026-04-24 23:37:25.749 [INFO][4514] ipam/ipam.go 160: Attempting to load block cidr=192.168.35.64/26 host="ci-4081-3-6-n-6f01bfed3c" Apr 24 23:37:25.784287 containerd[1519]: 2026-04-24 23:37:25.751 [INFO][4514] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.35.64/26 host="ci-4081-3-6-n-6f01bfed3c" Apr 24 23:37:25.784287 containerd[1519]: 2026-04-24 23:37:25.751 [INFO][4514] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.35.64/26 handle="k8s-pod-network.4240fcaef762194107e0245631c641a0ff453eba7640d45f73bbcc2947358682" host="ci-4081-3-6-n-6f01bfed3c" Apr 24 23:37:25.784287 containerd[1519]: 2026-04-24 23:37:25.752 [INFO][4514] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.4240fcaef762194107e0245631c641a0ff453eba7640d45f73bbcc2947358682 Apr 24 23:37:25.784287 containerd[1519]: 2026-04-24 23:37:25.756 [INFO][4514] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.35.64/26 handle="k8s-pod-network.4240fcaef762194107e0245631c641a0ff453eba7640d45f73bbcc2947358682" host="ci-4081-3-6-n-6f01bfed3c" Apr 24 23:37:25.784287 containerd[1519]: 2026-04-24 23:37:25.761 [INFO][4514] ipam/ipam.go 1288: 
Successfully claimed IPs: [192.168.35.66/26] block=192.168.35.64/26 handle="k8s-pod-network.4240fcaef762194107e0245631c641a0ff453eba7640d45f73bbcc2947358682" host="ci-4081-3-6-n-6f01bfed3c" Apr 24 23:37:25.784287 containerd[1519]: 2026-04-24 23:37:25.761 [INFO][4514] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.35.66/26] handle="k8s-pod-network.4240fcaef762194107e0245631c641a0ff453eba7640d45f73bbcc2947358682" host="ci-4081-3-6-n-6f01bfed3c" Apr 24 23:37:25.784287 containerd[1519]: 2026-04-24 23:37:25.761 [INFO][4514] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 24 23:37:25.784287 containerd[1519]: 2026-04-24 23:37:25.761 [INFO][4514] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.35.66/26] IPv6=[] ContainerID="4240fcaef762194107e0245631c641a0ff453eba7640d45f73bbcc2947358682" HandleID="k8s-pod-network.4240fcaef762194107e0245631c641a0ff453eba7640d45f73bbcc2947358682" Workload="ci--4081--3--6--n--6f01bfed3c-k8s-csi--node--driver--kfkvn-eth0" Apr 24 23:37:25.785322 containerd[1519]: 2026-04-24 23:37:25.765 [INFO][4482] cni-plugin/k8s.go 418: Populated endpoint ContainerID="4240fcaef762194107e0245631c641a0ff453eba7640d45f73bbcc2947358682" Namespace="calico-system" Pod="csi-node-driver-kfkvn" WorkloadEndpoint="ci--4081--3--6--n--6f01bfed3c-k8s-csi--node--driver--kfkvn-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--6f01bfed3c-k8s-csi--node--driver--kfkvn-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"b8636bb0-afe6-4f7c-8c11-499ada35dc8f", ResourceVersion:"937", Generation:0, CreationTimestamp:time.Date(2026, time.April, 24, 23, 36, 58, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"589b8b8d94", "k8s-app":"csi-node-driver", 
"name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-6f01bfed3c", ContainerID:"", Pod:"csi-node-driver-kfkvn", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.35.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali4a65f30faf4", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 24 23:37:25.785322 containerd[1519]: 2026-04-24 23:37:25.765 [INFO][4482] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.35.66/32] ContainerID="4240fcaef762194107e0245631c641a0ff453eba7640d45f73bbcc2947358682" Namespace="calico-system" Pod="csi-node-driver-kfkvn" WorkloadEndpoint="ci--4081--3--6--n--6f01bfed3c-k8s-csi--node--driver--kfkvn-eth0" Apr 24 23:37:25.785322 containerd[1519]: 2026-04-24 23:37:25.766 [INFO][4482] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali4a65f30faf4 ContainerID="4240fcaef762194107e0245631c641a0ff453eba7640d45f73bbcc2947358682" Namespace="calico-system" Pod="csi-node-driver-kfkvn" WorkloadEndpoint="ci--4081--3--6--n--6f01bfed3c-k8s-csi--node--driver--kfkvn-eth0" Apr 24 23:37:25.785322 containerd[1519]: 2026-04-24 23:37:25.771 [INFO][4482] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="4240fcaef762194107e0245631c641a0ff453eba7640d45f73bbcc2947358682" Namespace="calico-system" Pod="csi-node-driver-kfkvn" WorkloadEndpoint="ci--4081--3--6--n--6f01bfed3c-k8s-csi--node--driver--kfkvn-eth0" Apr 24 23:37:25.785322 
containerd[1519]: 2026-04-24 23:37:25.772 [INFO][4482] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="4240fcaef762194107e0245631c641a0ff453eba7640d45f73bbcc2947358682" Namespace="calico-system" Pod="csi-node-driver-kfkvn" WorkloadEndpoint="ci--4081--3--6--n--6f01bfed3c-k8s-csi--node--driver--kfkvn-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--6f01bfed3c-k8s-csi--node--driver--kfkvn-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"b8636bb0-afe6-4f7c-8c11-499ada35dc8f", ResourceVersion:"937", Generation:0, CreationTimestamp:time.Date(2026, time.April, 24, 23, 36, 58, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"589b8b8d94", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-6f01bfed3c", ContainerID:"4240fcaef762194107e0245631c641a0ff453eba7640d45f73bbcc2947358682", Pod:"csi-node-driver-kfkvn", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.35.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali4a65f30faf4", MAC:"76:41:57:ea:10:fd", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 24 23:37:25.785322 containerd[1519]: 
2026-04-24 23:37:25.781 [INFO][4482] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="4240fcaef762194107e0245631c641a0ff453eba7640d45f73bbcc2947358682" Namespace="calico-system" Pod="csi-node-driver-kfkvn" WorkloadEndpoint="ci--4081--3--6--n--6f01bfed3c-k8s-csi--node--driver--kfkvn-eth0" Apr 24 23:37:25.801584 containerd[1519]: time="2026-04-24T23:37:25.801512365Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Apr 24 23:37:25.801682 containerd[1519]: time="2026-04-24T23:37:25.801558539Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Apr 24 23:37:25.801682 containerd[1519]: time="2026-04-24T23:37:25.801656252Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 24 23:37:25.801877 containerd[1519]: time="2026-04-24T23:37:25.801838506Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 24 23:37:25.823526 systemd[1]: Started cri-containerd-4240fcaef762194107e0245631c641a0ff453eba7640d45f73bbcc2947358682.scope - libcontainer container 4240fcaef762194107e0245631c641a0ff453eba7640d45f73bbcc2947358682. 
Apr 24 23:37:25.843233 containerd[1519]: time="2026-04-24T23:37:25.843001723Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-kfkvn,Uid:b8636bb0-afe6-4f7c-8c11-499ada35dc8f,Namespace:calico-system,Attempt:1,} returns sandbox id \"4240fcaef762194107e0245631c641a0ff453eba7640d45f73bbcc2947358682\"" Apr 24 23:37:25.846889 containerd[1519]: time="2026-04-24T23:37:25.846859971Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.31.4\"" Apr 24 23:37:25.869817 systemd-networkd[1422]: cali90cc7c488ec: Link UP Apr 24 23:37:25.869986 systemd-networkd[1422]: cali90cc7c488ec: Gained carrier Apr 24 23:37:25.887786 containerd[1519]: 2026-04-24 23:37:25.703 [INFO][4496] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081--3--6--n--6f01bfed3c-k8s-goldmane--9f7667bb8--j6dz5-eth0 goldmane-9f7667bb8- calico-system 133141ef-2c84-4659-9f62-e85180cc6f03 936 0 2026-04-24 23:36:58 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:9f7667bb8 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4081-3-6-n-6f01bfed3c goldmane-9f7667bb8-j6dz5 eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali90cc7c488ec [] [] }} ContainerID="53d875b5d4c1419ca5cd643dcf2b8a913dd08faec3d8e1bfc9352c1227b794a8" Namespace="calico-system" Pod="goldmane-9f7667bb8-j6dz5" WorkloadEndpoint="ci--4081--3--6--n--6f01bfed3c-k8s-goldmane--9f7667bb8--j6dz5-" Apr 24 23:37:25.887786 containerd[1519]: 2026-04-24 23:37:25.703 [INFO][4496] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="53d875b5d4c1419ca5cd643dcf2b8a913dd08faec3d8e1bfc9352c1227b794a8" Namespace="calico-system" Pod="goldmane-9f7667bb8-j6dz5" WorkloadEndpoint="ci--4081--3--6--n--6f01bfed3c-k8s-goldmane--9f7667bb8--j6dz5-eth0" Apr 24 23:37:25.887786 containerd[1519]: 2026-04-24 23:37:25.759 [INFO][4522] 
ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="53d875b5d4c1419ca5cd643dcf2b8a913dd08faec3d8e1bfc9352c1227b794a8" HandleID="k8s-pod-network.53d875b5d4c1419ca5cd643dcf2b8a913dd08faec3d8e1bfc9352c1227b794a8" Workload="ci--4081--3--6--n--6f01bfed3c-k8s-goldmane--9f7667bb8--j6dz5-eth0" Apr 24 23:37:25.887786 containerd[1519]: 2026-04-24 23:37:25.770 [INFO][4522] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="53d875b5d4c1419ca5cd643dcf2b8a913dd08faec3d8e1bfc9352c1227b794a8" HandleID="k8s-pod-network.53d875b5d4c1419ca5cd643dcf2b8a913dd08faec3d8e1bfc9352c1227b794a8" Workload="ci--4081--3--6--n--6f01bfed3c-k8s-goldmane--9f7667bb8--j6dz5-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004e5f0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4081-3-6-n-6f01bfed3c", "pod":"goldmane-9f7667bb8-j6dz5", "timestamp":"2026-04-24 23:37:25.759808102 +0000 UTC"}, Hostname:"ci-4081-3-6-n-6f01bfed3c", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc000378580)} Apr 24 23:37:25.887786 containerd[1519]: 2026-04-24 23:37:25.770 [INFO][4522] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 24 23:37:25.887786 containerd[1519]: 2026-04-24 23:37:25.770 [INFO][4522] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Apr 24 23:37:25.887786 containerd[1519]: 2026-04-24 23:37:25.770 [INFO][4522] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081-3-6-n-6f01bfed3c' Apr 24 23:37:25.887786 containerd[1519]: 2026-04-24 23:37:25.840 [INFO][4522] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.53d875b5d4c1419ca5cd643dcf2b8a913dd08faec3d8e1bfc9352c1227b794a8" host="ci-4081-3-6-n-6f01bfed3c" Apr 24 23:37:25.887786 containerd[1519]: 2026-04-24 23:37:25.847 [INFO][4522] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4081-3-6-n-6f01bfed3c" Apr 24 23:37:25.887786 containerd[1519]: 2026-04-24 23:37:25.851 [INFO][4522] ipam/ipam.go 526: Trying affinity for 192.168.35.64/26 host="ci-4081-3-6-n-6f01bfed3c" Apr 24 23:37:25.887786 containerd[1519]: 2026-04-24 23:37:25.853 [INFO][4522] ipam/ipam.go 160: Attempting to load block cidr=192.168.35.64/26 host="ci-4081-3-6-n-6f01bfed3c" Apr 24 23:37:25.887786 containerd[1519]: 2026-04-24 23:37:25.854 [INFO][4522] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.35.64/26 host="ci-4081-3-6-n-6f01bfed3c" Apr 24 23:37:25.887786 containerd[1519]: 2026-04-24 23:37:25.854 [INFO][4522] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.35.64/26 handle="k8s-pod-network.53d875b5d4c1419ca5cd643dcf2b8a913dd08faec3d8e1bfc9352c1227b794a8" host="ci-4081-3-6-n-6f01bfed3c" Apr 24 23:37:25.887786 containerd[1519]: 2026-04-24 23:37:25.855 [INFO][4522] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.53d875b5d4c1419ca5cd643dcf2b8a913dd08faec3d8e1bfc9352c1227b794a8 Apr 24 23:37:25.887786 containerd[1519]: 2026-04-24 23:37:25.858 [INFO][4522] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.35.64/26 handle="k8s-pod-network.53d875b5d4c1419ca5cd643dcf2b8a913dd08faec3d8e1bfc9352c1227b794a8" host="ci-4081-3-6-n-6f01bfed3c" Apr 24 23:37:25.887786 containerd[1519]: 2026-04-24 23:37:25.863 [INFO][4522] ipam/ipam.go 1288: 
Successfully claimed IPs: [192.168.35.67/26] block=192.168.35.64/26 handle="k8s-pod-network.53d875b5d4c1419ca5cd643dcf2b8a913dd08faec3d8e1bfc9352c1227b794a8" host="ci-4081-3-6-n-6f01bfed3c" Apr 24 23:37:25.887786 containerd[1519]: 2026-04-24 23:37:25.863 [INFO][4522] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.35.67/26] handle="k8s-pod-network.53d875b5d4c1419ca5cd643dcf2b8a913dd08faec3d8e1bfc9352c1227b794a8" host="ci-4081-3-6-n-6f01bfed3c" Apr 24 23:37:25.887786 containerd[1519]: 2026-04-24 23:37:25.863 [INFO][4522] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 24 23:37:25.887786 containerd[1519]: 2026-04-24 23:37:25.863 [INFO][4522] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.35.67/26] IPv6=[] ContainerID="53d875b5d4c1419ca5cd643dcf2b8a913dd08faec3d8e1bfc9352c1227b794a8" HandleID="k8s-pod-network.53d875b5d4c1419ca5cd643dcf2b8a913dd08faec3d8e1bfc9352c1227b794a8" Workload="ci--4081--3--6--n--6f01bfed3c-k8s-goldmane--9f7667bb8--j6dz5-eth0" Apr 24 23:37:25.888616 containerd[1519]: 2026-04-24 23:37:25.866 [INFO][4496] cni-plugin/k8s.go 418: Populated endpoint ContainerID="53d875b5d4c1419ca5cd643dcf2b8a913dd08faec3d8e1bfc9352c1227b794a8" Namespace="calico-system" Pod="goldmane-9f7667bb8-j6dz5" WorkloadEndpoint="ci--4081--3--6--n--6f01bfed3c-k8s-goldmane--9f7667bb8--j6dz5-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--6f01bfed3c-k8s-goldmane--9f7667bb8--j6dz5-eth0", GenerateName:"goldmane-9f7667bb8-", Namespace:"calico-system", SelfLink:"", UID:"133141ef-2c84-4659-9f62-e85180cc6f03", ResourceVersion:"936", Generation:0, CreationTimestamp:time.Date(2026, time.April, 24, 23, 36, 58, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"9f7667bb8", 
"projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-6f01bfed3c", ContainerID:"", Pod:"goldmane-9f7667bb8-j6dz5", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.35.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali90cc7c488ec", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 24 23:37:25.888616 containerd[1519]: 2026-04-24 23:37:25.866 [INFO][4496] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.35.67/32] ContainerID="53d875b5d4c1419ca5cd643dcf2b8a913dd08faec3d8e1bfc9352c1227b794a8" Namespace="calico-system" Pod="goldmane-9f7667bb8-j6dz5" WorkloadEndpoint="ci--4081--3--6--n--6f01bfed3c-k8s-goldmane--9f7667bb8--j6dz5-eth0" Apr 24 23:37:25.888616 containerd[1519]: 2026-04-24 23:37:25.866 [INFO][4496] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali90cc7c488ec ContainerID="53d875b5d4c1419ca5cd643dcf2b8a913dd08faec3d8e1bfc9352c1227b794a8" Namespace="calico-system" Pod="goldmane-9f7667bb8-j6dz5" WorkloadEndpoint="ci--4081--3--6--n--6f01bfed3c-k8s-goldmane--9f7667bb8--j6dz5-eth0" Apr 24 23:37:25.888616 containerd[1519]: 2026-04-24 23:37:25.871 [INFO][4496] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="53d875b5d4c1419ca5cd643dcf2b8a913dd08faec3d8e1bfc9352c1227b794a8" Namespace="calico-system" Pod="goldmane-9f7667bb8-j6dz5" WorkloadEndpoint="ci--4081--3--6--n--6f01bfed3c-k8s-goldmane--9f7667bb8--j6dz5-eth0" Apr 24 23:37:25.888616 containerd[1519]: 2026-04-24 23:37:25.873 [INFO][4496] 
cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="53d875b5d4c1419ca5cd643dcf2b8a913dd08faec3d8e1bfc9352c1227b794a8" Namespace="calico-system" Pod="goldmane-9f7667bb8-j6dz5" WorkloadEndpoint="ci--4081--3--6--n--6f01bfed3c-k8s-goldmane--9f7667bb8--j6dz5-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--6f01bfed3c-k8s-goldmane--9f7667bb8--j6dz5-eth0", GenerateName:"goldmane-9f7667bb8-", Namespace:"calico-system", SelfLink:"", UID:"133141ef-2c84-4659-9f62-e85180cc6f03", ResourceVersion:"936", Generation:0, CreationTimestamp:time.Date(2026, time.April, 24, 23, 36, 58, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"9f7667bb8", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-6f01bfed3c", ContainerID:"53d875b5d4c1419ca5cd643dcf2b8a913dd08faec3d8e1bfc9352c1227b794a8", Pod:"goldmane-9f7667bb8-j6dz5", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.35.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali90cc7c488ec", MAC:"a6:25:c6:05:90:e2", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 24 23:37:25.888616 containerd[1519]: 2026-04-24 23:37:25.883 [INFO][4496] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="53d875b5d4c1419ca5cd643dcf2b8a913dd08faec3d8e1bfc9352c1227b794a8" Namespace="calico-system" Pod="goldmane-9f7667bb8-j6dz5" WorkloadEndpoint="ci--4081--3--6--n--6f01bfed3c-k8s-goldmane--9f7667bb8--j6dz5-eth0" Apr 24 23:37:25.911728 containerd[1519]: time="2026-04-24T23:37:25.910486019Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Apr 24 23:37:25.911728 containerd[1519]: time="2026-04-24T23:37:25.910540941Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Apr 24 23:37:25.911728 containerd[1519]: time="2026-04-24T23:37:25.910551001Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 24 23:37:25.911728 containerd[1519]: time="2026-04-24T23:37:25.910614882Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 24 23:37:25.931548 systemd[1]: Started cri-containerd-53d875b5d4c1419ca5cd643dcf2b8a913dd08faec3d8e1bfc9352c1227b794a8.scope - libcontainer container 53d875b5d4c1419ca5cd643dcf2b8a913dd08faec3d8e1bfc9352c1227b794a8. 
Apr 24 23:37:25.977849 systemd-networkd[1422]: calif0e53319469: Link UP Apr 24 23:37:25.979642 systemd-networkd[1422]: calif0e53319469: Gained carrier Apr 24 23:37:25.995457 containerd[1519]: 2026-04-24 23:37:25.716 [INFO][4491] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081--3--6--n--6f01bfed3c-k8s-coredns--7d764666f9--dr55n-eth0 coredns-7d764666f9- kube-system 9e14ced5-8d3f-4116-8156-faf06229fa0d 935 0 2026-04-24 23:36:48 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7d764666f9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4081-3-6-n-6f01bfed3c coredns-7d764666f9-dr55n eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calif0e53319469 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 } {liveness-probe TCP 8080 0 } {readiness-probe TCP 8181 0 }] [] }} ContainerID="bb8279ef9f632eaf74b26a7aa1f295f21002bb30f9e96dec6b0418e27edaf079" Namespace="kube-system" Pod="coredns-7d764666f9-dr55n" WorkloadEndpoint="ci--4081--3--6--n--6f01bfed3c-k8s-coredns--7d764666f9--dr55n-" Apr 24 23:37:25.995457 containerd[1519]: 2026-04-24 23:37:25.717 [INFO][4491] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="bb8279ef9f632eaf74b26a7aa1f295f21002bb30f9e96dec6b0418e27edaf079" Namespace="kube-system" Pod="coredns-7d764666f9-dr55n" WorkloadEndpoint="ci--4081--3--6--n--6f01bfed3c-k8s-coredns--7d764666f9--dr55n-eth0" Apr 24 23:37:25.995457 containerd[1519]: 2026-04-24 23:37:25.763 [INFO][4527] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="bb8279ef9f632eaf74b26a7aa1f295f21002bb30f9e96dec6b0418e27edaf079" HandleID="k8s-pod-network.bb8279ef9f632eaf74b26a7aa1f295f21002bb30f9e96dec6b0418e27edaf079" Workload="ci--4081--3--6--n--6f01bfed3c-k8s-coredns--7d764666f9--dr55n-eth0" Apr 24 23:37:25.995457 containerd[1519]: 2026-04-24 23:37:25.771 [INFO][4527] 
ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="bb8279ef9f632eaf74b26a7aa1f295f21002bb30f9e96dec6b0418e27edaf079" HandleID="k8s-pod-network.bb8279ef9f632eaf74b26a7aa1f295f21002bb30f9e96dec6b0418e27edaf079" Workload="ci--4081--3--6--n--6f01bfed3c-k8s-coredns--7d764666f9--dr55n-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000277e80), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4081-3-6-n-6f01bfed3c", "pod":"coredns-7d764666f9-dr55n", "timestamp":"2026-04-24 23:37:25.763145213 +0000 UTC"}, Hostname:"ci-4081-3-6-n-6f01bfed3c", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc000256f20)} Apr 24 23:37:25.995457 containerd[1519]: 2026-04-24 23:37:25.771 [INFO][4527] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 24 23:37:25.995457 containerd[1519]: 2026-04-24 23:37:25.863 [INFO][4527] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Apr 24 23:37:25.995457 containerd[1519]: 2026-04-24 23:37:25.864 [INFO][4527] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081-3-6-n-6f01bfed3c' Apr 24 23:37:25.995457 containerd[1519]: 2026-04-24 23:37:25.941 [INFO][4527] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.bb8279ef9f632eaf74b26a7aa1f295f21002bb30f9e96dec6b0418e27edaf079" host="ci-4081-3-6-n-6f01bfed3c" Apr 24 23:37:25.995457 containerd[1519]: 2026-04-24 23:37:25.946 [INFO][4527] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4081-3-6-n-6f01bfed3c" Apr 24 23:37:25.995457 containerd[1519]: 2026-04-24 23:37:25.952 [INFO][4527] ipam/ipam.go 526: Trying affinity for 192.168.35.64/26 host="ci-4081-3-6-n-6f01bfed3c" Apr 24 23:37:25.995457 containerd[1519]: 2026-04-24 23:37:25.954 [INFO][4527] ipam/ipam.go 160: Attempting to load block cidr=192.168.35.64/26 host="ci-4081-3-6-n-6f01bfed3c" Apr 24 23:37:25.995457 containerd[1519]: 2026-04-24 23:37:25.955 [INFO][4527] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.35.64/26 host="ci-4081-3-6-n-6f01bfed3c" Apr 24 23:37:25.995457 containerd[1519]: 2026-04-24 23:37:25.955 [INFO][4527] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.35.64/26 handle="k8s-pod-network.bb8279ef9f632eaf74b26a7aa1f295f21002bb30f9e96dec6b0418e27edaf079" host="ci-4081-3-6-n-6f01bfed3c" Apr 24 23:37:25.995457 containerd[1519]: 2026-04-24 23:37:25.956 [INFO][4527] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.bb8279ef9f632eaf74b26a7aa1f295f21002bb30f9e96dec6b0418e27edaf079 Apr 24 23:37:25.995457 containerd[1519]: 2026-04-24 23:37:25.963 [INFO][4527] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.35.64/26 handle="k8s-pod-network.bb8279ef9f632eaf74b26a7aa1f295f21002bb30f9e96dec6b0418e27edaf079" host="ci-4081-3-6-n-6f01bfed3c" Apr 24 23:37:25.995457 containerd[1519]: 2026-04-24 23:37:25.969 [INFO][4527] ipam/ipam.go 1288: 
Successfully claimed IPs: [192.168.35.68/26] block=192.168.35.64/26 handle="k8s-pod-network.bb8279ef9f632eaf74b26a7aa1f295f21002bb30f9e96dec6b0418e27edaf079" host="ci-4081-3-6-n-6f01bfed3c" Apr 24 23:37:25.995457 containerd[1519]: 2026-04-24 23:37:25.969 [INFO][4527] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.35.68/26] handle="k8s-pod-network.bb8279ef9f632eaf74b26a7aa1f295f21002bb30f9e96dec6b0418e27edaf079" host="ci-4081-3-6-n-6f01bfed3c" Apr 24 23:37:25.995457 containerd[1519]: 2026-04-24 23:37:25.969 [INFO][4527] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 24 23:37:25.995457 containerd[1519]: 2026-04-24 23:37:25.969 [INFO][4527] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.35.68/26] IPv6=[] ContainerID="bb8279ef9f632eaf74b26a7aa1f295f21002bb30f9e96dec6b0418e27edaf079" HandleID="k8s-pod-network.bb8279ef9f632eaf74b26a7aa1f295f21002bb30f9e96dec6b0418e27edaf079" Workload="ci--4081--3--6--n--6f01bfed3c-k8s-coredns--7d764666f9--dr55n-eth0" Apr 24 23:37:25.995913 containerd[1519]: 2026-04-24 23:37:25.972 [INFO][4491] cni-plugin/k8s.go 418: Populated endpoint ContainerID="bb8279ef9f632eaf74b26a7aa1f295f21002bb30f9e96dec6b0418e27edaf079" Namespace="kube-system" Pod="coredns-7d764666f9-dr55n" WorkloadEndpoint="ci--4081--3--6--n--6f01bfed3c-k8s-coredns--7d764666f9--dr55n-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--6f01bfed3c-k8s-coredns--7d764666f9--dr55n-eth0", GenerateName:"coredns-7d764666f9-", Namespace:"kube-system", SelfLink:"", UID:"9e14ced5-8d3f-4116-8156-faf06229fa0d", ResourceVersion:"935", Generation:0, CreationTimestamp:time.Date(2026, time.April, 24, 23, 36, 48, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7d764666f9", "projectcalico.org/namespace":"kube-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-6f01bfed3c", ContainerID:"", Pod:"coredns-7d764666f9-dr55n", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.35.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calif0e53319469", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 24 23:37:25.995913 containerd[1519]: 2026-04-24 23:37:25.972 [INFO][4491] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.35.68/32] ContainerID="bb8279ef9f632eaf74b26a7aa1f295f21002bb30f9e96dec6b0418e27edaf079" Namespace="kube-system" Pod="coredns-7d764666f9-dr55n" WorkloadEndpoint="ci--4081--3--6--n--6f01bfed3c-k8s-coredns--7d764666f9--dr55n-eth0" Apr 24 23:37:25.995913 containerd[1519]: 2026-04-24 23:37:25.972 [INFO][4491] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calif0e53319469 
ContainerID="bb8279ef9f632eaf74b26a7aa1f295f21002bb30f9e96dec6b0418e27edaf079" Namespace="kube-system" Pod="coredns-7d764666f9-dr55n" WorkloadEndpoint="ci--4081--3--6--n--6f01bfed3c-k8s-coredns--7d764666f9--dr55n-eth0" Apr 24 23:37:25.995913 containerd[1519]: 2026-04-24 23:37:25.978 [INFO][4491] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="bb8279ef9f632eaf74b26a7aa1f295f21002bb30f9e96dec6b0418e27edaf079" Namespace="kube-system" Pod="coredns-7d764666f9-dr55n" WorkloadEndpoint="ci--4081--3--6--n--6f01bfed3c-k8s-coredns--7d764666f9--dr55n-eth0" Apr 24 23:37:25.995913 containerd[1519]: 2026-04-24 23:37:25.979 [INFO][4491] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="bb8279ef9f632eaf74b26a7aa1f295f21002bb30f9e96dec6b0418e27edaf079" Namespace="kube-system" Pod="coredns-7d764666f9-dr55n" WorkloadEndpoint="ci--4081--3--6--n--6f01bfed3c-k8s-coredns--7d764666f9--dr55n-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--6f01bfed3c-k8s-coredns--7d764666f9--dr55n-eth0", GenerateName:"coredns-7d764666f9-", Namespace:"kube-system", SelfLink:"", UID:"9e14ced5-8d3f-4116-8156-faf06229fa0d", ResourceVersion:"935", Generation:0, CreationTimestamp:time.Date(2026, time.April, 24, 23, 36, 48, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7d764666f9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-6f01bfed3c", 
ContainerID:"bb8279ef9f632eaf74b26a7aa1f295f21002bb30f9e96dec6b0418e27edaf079", Pod:"coredns-7d764666f9-dr55n", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.35.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calif0e53319469", MAC:"c6:13:00:34:3c:a8", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 24 23:37:25.996070 containerd[1519]: 2026-04-24 23:37:25.991 [INFO][4491] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="bb8279ef9f632eaf74b26a7aa1f295f21002bb30f9e96dec6b0418e27edaf079" Namespace="kube-system" Pod="coredns-7d764666f9-dr55n" WorkloadEndpoint="ci--4081--3--6--n--6f01bfed3c-k8s-coredns--7d764666f9--dr55n-eth0" Apr 24 23:37:25.999331 containerd[1519]: time="2026-04-24T23:37:25.999288625Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-9f7667bb8-j6dz5,Uid:133141ef-2c84-4659-9f62-e85180cc6f03,Namespace:calico-system,Attempt:1,} returns sandbox id \"53d875b5d4c1419ca5cd643dcf2b8a913dd08faec3d8e1bfc9352c1227b794a8\"" Apr 24 23:37:26.017284 containerd[1519]: time="2026-04-24T23:37:26.016877071Z" level=info 
msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Apr 24 23:37:26.017284 containerd[1519]: time="2026-04-24T23:37:26.016952865Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Apr 24 23:37:26.017284 containerd[1519]: time="2026-04-24T23:37:26.016984611Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 24 23:37:26.017284 containerd[1519]: time="2026-04-24T23:37:26.017069692Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 24 23:37:26.036623 systemd[1]: Started cri-containerd-bb8279ef9f632eaf74b26a7aa1f295f21002bb30f9e96dec6b0418e27edaf079.scope - libcontainer container bb8279ef9f632eaf74b26a7aa1f295f21002bb30f9e96dec6b0418e27edaf079. Apr 24 23:37:26.070259 containerd[1519]: time="2026-04-24T23:37:26.070211506Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7d764666f9-dr55n,Uid:9e14ced5-8d3f-4116-8156-faf06229fa0d,Namespace:kube-system,Attempt:1,} returns sandbox id \"bb8279ef9f632eaf74b26a7aa1f295f21002bb30f9e96dec6b0418e27edaf079\"" Apr 24 23:37:26.075312 containerd[1519]: time="2026-04-24T23:37:26.075283157Z" level=info msg="CreateContainer within sandbox \"bb8279ef9f632eaf74b26a7aa1f295f21002bb30f9e96dec6b0418e27edaf079\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Apr 24 23:37:26.089009 containerd[1519]: time="2026-04-24T23:37:26.088969825Z" level=info msg="CreateContainer within sandbox \"bb8279ef9f632eaf74b26a7aa1f295f21002bb30f9e96dec6b0418e27edaf079\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"15a3c854e5f498e9f176c36b8e4a7c7e42bc91cbd188789b0def17203253d478\"" Apr 24 23:37:26.090367 containerd[1519]: time="2026-04-24T23:37:26.089804962Z" level=info msg="StartContainer for 
\"15a3c854e5f498e9f176c36b8e4a7c7e42bc91cbd188789b0def17203253d478\"" Apr 24 23:37:26.115539 systemd[1]: Started cri-containerd-15a3c854e5f498e9f176c36b8e4a7c7e42bc91cbd188789b0def17203253d478.scope - libcontainer container 15a3c854e5f498e9f176c36b8e4a7c7e42bc91cbd188789b0def17203253d478. Apr 24 23:37:26.137319 containerd[1519]: time="2026-04-24T23:37:26.137284893Z" level=info msg="StartContainer for \"15a3c854e5f498e9f176c36b8e4a7c7e42bc91cbd188789b0def17203253d478\" returns successfully" Apr 24 23:37:26.752049 kubelet[2565]: I0424 23:37:26.751955 2565 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="kube-system/coredns-7d764666f9-dr55n" podStartSLOduration=38.751937878 podStartE2EDuration="38.751937878s" podCreationTimestamp="2026-04-24 23:36:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 23:37:26.75144827 +0000 UTC m=+45.363507114" watchObservedRunningTime="2026-04-24 23:37:26.751937878 +0000 UTC m=+45.363996712" Apr 24 23:37:27.081657 systemd-networkd[1422]: cali4a65f30faf4: Gained IPv6LL Apr 24 23:37:27.337848 systemd-networkd[1422]: cali90cc7c488ec: Gained IPv6LL Apr 24 23:37:27.480461 containerd[1519]: time="2026-04-24T23:37:27.478716052Z" level=info msg="StopPodSandbox for \"fa684ee016e7b8f4346eadb33b70fe00c046fd186c92942b1887a6a0d224c38c\"" Apr 24 23:37:27.480461 containerd[1519]: time="2026-04-24T23:37:27.479791297Z" level=info msg="StopPodSandbox for \"e04ce8dbea7e66e205736c48b37a09b9550778b585d23e4a8f841cc6c602744b\"" Apr 24 23:37:27.485106 containerd[1519]: time="2026-04-24T23:37:27.484788746Z" level=info msg="StopPodSandbox for \"580c99d506b6ae3bc232bba4e40ac5acbae980fdeae6181c18e333c2ea3ff51e\"" Apr 24 23:37:27.625811 containerd[1519]: 2026-04-24 23:37:27.564 [INFO][4798] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="580c99d506b6ae3bc232bba4e40ac5acbae980fdeae6181c18e333c2ea3ff51e" Apr 24 23:37:27.625811 
containerd[1519]: 2026-04-24 23:37:27.565 [INFO][4798] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="580c99d506b6ae3bc232bba4e40ac5acbae980fdeae6181c18e333c2ea3ff51e" iface="eth0" netns="/var/run/netns/cni-580ab76c-3777-f5a3-886b-3db4bdb5b60f" Apr 24 23:37:27.625811 containerd[1519]: 2026-04-24 23:37:27.565 [INFO][4798] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="580c99d506b6ae3bc232bba4e40ac5acbae980fdeae6181c18e333c2ea3ff51e" iface="eth0" netns="/var/run/netns/cni-580ab76c-3777-f5a3-886b-3db4bdb5b60f" Apr 24 23:37:27.625811 containerd[1519]: 2026-04-24 23:37:27.566 [INFO][4798] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="580c99d506b6ae3bc232bba4e40ac5acbae980fdeae6181c18e333c2ea3ff51e" iface="eth0" netns="/var/run/netns/cni-580ab76c-3777-f5a3-886b-3db4bdb5b60f" Apr 24 23:37:27.625811 containerd[1519]: 2026-04-24 23:37:27.566 [INFO][4798] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="580c99d506b6ae3bc232bba4e40ac5acbae980fdeae6181c18e333c2ea3ff51e" Apr 24 23:37:27.625811 containerd[1519]: 2026-04-24 23:37:27.566 [INFO][4798] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="580c99d506b6ae3bc232bba4e40ac5acbae980fdeae6181c18e333c2ea3ff51e" Apr 24 23:37:27.625811 containerd[1519]: 2026-04-24 23:37:27.608 [INFO][4817] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="580c99d506b6ae3bc232bba4e40ac5acbae980fdeae6181c18e333c2ea3ff51e" HandleID="k8s-pod-network.580c99d506b6ae3bc232bba4e40ac5acbae980fdeae6181c18e333c2ea3ff51e" Workload="ci--4081--3--6--n--6f01bfed3c-k8s-calico--kube--controllers--5c6df7864b--wqqbk-eth0" Apr 24 23:37:27.625811 containerd[1519]: 2026-04-24 23:37:27.608 [INFO][4817] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 24 23:37:27.625811 containerd[1519]: 2026-04-24 23:37:27.608 [INFO][4817] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Apr 24 23:37:27.625811 containerd[1519]: 2026-04-24 23:37:27.612 [WARNING][4817] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. Ignoring ContainerID="580c99d506b6ae3bc232bba4e40ac5acbae980fdeae6181c18e333c2ea3ff51e" HandleID="k8s-pod-network.580c99d506b6ae3bc232bba4e40ac5acbae980fdeae6181c18e333c2ea3ff51e" Workload="ci--4081--3--6--n--6f01bfed3c-k8s-calico--kube--controllers--5c6df7864b--wqqbk-eth0" Apr 24 23:37:27.625811 containerd[1519]: 2026-04-24 23:37:27.612 [INFO][4817] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="580c99d506b6ae3bc232bba4e40ac5acbae980fdeae6181c18e333c2ea3ff51e" HandleID="k8s-pod-network.580c99d506b6ae3bc232bba4e40ac5acbae980fdeae6181c18e333c2ea3ff51e" Workload="ci--4081--3--6--n--6f01bfed3c-k8s-calico--kube--controllers--5c6df7864b--wqqbk-eth0" Apr 24 23:37:27.625811 containerd[1519]: 2026-04-24 23:37:27.614 [INFO][4817] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 24 23:37:27.625811 containerd[1519]: 2026-04-24 23:37:27.619 [INFO][4798] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="580c99d506b6ae3bc232bba4e40ac5acbae980fdeae6181c18e333c2ea3ff51e" Apr 24 23:37:27.628474 systemd[1]: run-netns-cni\x2d580ab76c\x2d3777\x2df5a3\x2d886b\x2d3db4bdb5b60f.mount: Deactivated successfully. 
Apr 24 23:37:27.632034 containerd[1519]: time="2026-04-24T23:37:27.630860791Z" level=info msg="TearDown network for sandbox \"580c99d506b6ae3bc232bba4e40ac5acbae980fdeae6181c18e333c2ea3ff51e\" successfully" Apr 24 23:37:27.632034 containerd[1519]: time="2026-04-24T23:37:27.630893105Z" level=info msg="StopPodSandbox for \"580c99d506b6ae3bc232bba4e40ac5acbae980fdeae6181c18e333c2ea3ff51e\" returns successfully" Apr 24 23:37:27.634121 containerd[1519]: time="2026-04-24T23:37:27.634087457Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5c6df7864b-wqqbk,Uid:7a6dcb56-0ab0-4a8e-b283-aa8220fb3da6,Namespace:calico-system,Attempt:1,}" Apr 24 23:37:27.648764 containerd[1519]: 2026-04-24 23:37:27.574 [INFO][4797] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="e04ce8dbea7e66e205736c48b37a09b9550778b585d23e4a8f841cc6c602744b" Apr 24 23:37:27.648764 containerd[1519]: 2026-04-24 23:37:27.575 [INFO][4797] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="e04ce8dbea7e66e205736c48b37a09b9550778b585d23e4a8f841cc6c602744b" iface="eth0" netns="/var/run/netns/cni-4b2b9ca8-ecf4-d2c1-d0ae-88c0d06de311" Apr 24 23:37:27.648764 containerd[1519]: 2026-04-24 23:37:27.576 [INFO][4797] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="e04ce8dbea7e66e205736c48b37a09b9550778b585d23e4a8f841cc6c602744b" iface="eth0" netns="/var/run/netns/cni-4b2b9ca8-ecf4-d2c1-d0ae-88c0d06de311" Apr 24 23:37:27.648764 containerd[1519]: 2026-04-24 23:37:27.577 [INFO][4797] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="e04ce8dbea7e66e205736c48b37a09b9550778b585d23e4a8f841cc6c602744b" iface="eth0" netns="/var/run/netns/cni-4b2b9ca8-ecf4-d2c1-d0ae-88c0d06de311" Apr 24 23:37:27.648764 containerd[1519]: 2026-04-24 23:37:27.577 [INFO][4797] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="e04ce8dbea7e66e205736c48b37a09b9550778b585d23e4a8f841cc6c602744b" Apr 24 23:37:27.648764 containerd[1519]: 2026-04-24 23:37:27.577 [INFO][4797] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="e04ce8dbea7e66e205736c48b37a09b9550778b585d23e4a8f841cc6c602744b" Apr 24 23:37:27.648764 containerd[1519]: 2026-04-24 23:37:27.626 [INFO][4822] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="e04ce8dbea7e66e205736c48b37a09b9550778b585d23e4a8f841cc6c602744b" HandleID="k8s-pod-network.e04ce8dbea7e66e205736c48b37a09b9550778b585d23e4a8f841cc6c602744b" Workload="ci--4081--3--6--n--6f01bfed3c-k8s-calico--apiserver--b8b79f554--9kfjb-eth0" Apr 24 23:37:27.648764 containerd[1519]: 2026-04-24 23:37:27.626 [INFO][4822] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 24 23:37:27.648764 containerd[1519]: 2026-04-24 23:37:27.626 [INFO][4822] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 24 23:37:27.648764 containerd[1519]: 2026-04-24 23:37:27.635 [WARNING][4822] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="e04ce8dbea7e66e205736c48b37a09b9550778b585d23e4a8f841cc6c602744b" HandleID="k8s-pod-network.e04ce8dbea7e66e205736c48b37a09b9550778b585d23e4a8f841cc6c602744b" Workload="ci--4081--3--6--n--6f01bfed3c-k8s-calico--apiserver--b8b79f554--9kfjb-eth0" Apr 24 23:37:27.648764 containerd[1519]: 2026-04-24 23:37:27.635 [INFO][4822] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="e04ce8dbea7e66e205736c48b37a09b9550778b585d23e4a8f841cc6c602744b" HandleID="k8s-pod-network.e04ce8dbea7e66e205736c48b37a09b9550778b585d23e4a8f841cc6c602744b" Workload="ci--4081--3--6--n--6f01bfed3c-k8s-calico--apiserver--b8b79f554--9kfjb-eth0" Apr 24 23:37:27.648764 containerd[1519]: 2026-04-24 23:37:27.636 [INFO][4822] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 24 23:37:27.648764 containerd[1519]: 2026-04-24 23:37:27.640 [INFO][4797] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="e04ce8dbea7e66e205736c48b37a09b9550778b585d23e4a8f841cc6c602744b" Apr 24 23:37:27.655166 systemd[1]: run-netns-cni\x2d4b2b9ca8\x2decf4\x2dd2c1\x2dd0ae\x2d88c0d06de311.mount: Deactivated successfully. Apr 24 23:37:27.656230 containerd[1519]: time="2026-04-24T23:37:27.655686647Z" level=info msg="TearDown network for sandbox \"e04ce8dbea7e66e205736c48b37a09b9550778b585d23e4a8f841cc6c602744b\" successfully" Apr 24 23:37:27.656309 containerd[1519]: time="2026-04-24T23:37:27.656294361Z" level=info msg="StopPodSandbox for \"e04ce8dbea7e66e205736c48b37a09b9550778b585d23e4a8f841cc6c602744b\" returns successfully" Apr 24 23:37:27.658315 containerd[1519]: 2026-04-24 23:37:27.579 [INFO][4793] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="fa684ee016e7b8f4346eadb33b70fe00c046fd186c92942b1887a6a0d224c38c" Apr 24 23:37:27.658315 containerd[1519]: 2026-04-24 23:37:27.579 [INFO][4793] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. 
ContainerID="fa684ee016e7b8f4346eadb33b70fe00c046fd186c92942b1887a6a0d224c38c" iface="eth0" netns="/var/run/netns/cni-83e4edb8-f96b-2855-af4e-cfef7f3c743b" Apr 24 23:37:27.658315 containerd[1519]: 2026-04-24 23:37:27.579 [INFO][4793] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="fa684ee016e7b8f4346eadb33b70fe00c046fd186c92942b1887a6a0d224c38c" iface="eth0" netns="/var/run/netns/cni-83e4edb8-f96b-2855-af4e-cfef7f3c743b" Apr 24 23:37:27.658315 containerd[1519]: 2026-04-24 23:37:27.580 [INFO][4793] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="fa684ee016e7b8f4346eadb33b70fe00c046fd186c92942b1887a6a0d224c38c" iface="eth0" netns="/var/run/netns/cni-83e4edb8-f96b-2855-af4e-cfef7f3c743b" Apr 24 23:37:27.658315 containerd[1519]: 2026-04-24 23:37:27.580 [INFO][4793] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="fa684ee016e7b8f4346eadb33b70fe00c046fd186c92942b1887a6a0d224c38c" Apr 24 23:37:27.658315 containerd[1519]: 2026-04-24 23:37:27.580 [INFO][4793] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="fa684ee016e7b8f4346eadb33b70fe00c046fd186c92942b1887a6a0d224c38c" Apr 24 23:37:27.658315 containerd[1519]: 2026-04-24 23:37:27.629 [INFO][4826] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="fa684ee016e7b8f4346eadb33b70fe00c046fd186c92942b1887a6a0d224c38c" HandleID="k8s-pod-network.fa684ee016e7b8f4346eadb33b70fe00c046fd186c92942b1887a6a0d224c38c" Workload="ci--4081--3--6--n--6f01bfed3c-k8s-coredns--7d764666f9--rvpdv-eth0" Apr 24 23:37:27.658315 containerd[1519]: 2026-04-24 23:37:27.630 [INFO][4826] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 24 23:37:27.658315 containerd[1519]: 2026-04-24 23:37:27.636 [INFO][4826] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Apr 24 23:37:27.658315 containerd[1519]: 2026-04-24 23:37:27.643 [WARNING][4826] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. Ignoring ContainerID="fa684ee016e7b8f4346eadb33b70fe00c046fd186c92942b1887a6a0d224c38c" HandleID="k8s-pod-network.fa684ee016e7b8f4346eadb33b70fe00c046fd186c92942b1887a6a0d224c38c" Workload="ci--4081--3--6--n--6f01bfed3c-k8s-coredns--7d764666f9--rvpdv-eth0" Apr 24 23:37:27.658315 containerd[1519]: 2026-04-24 23:37:27.643 [INFO][4826] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="fa684ee016e7b8f4346eadb33b70fe00c046fd186c92942b1887a6a0d224c38c" HandleID="k8s-pod-network.fa684ee016e7b8f4346eadb33b70fe00c046fd186c92942b1887a6a0d224c38c" Workload="ci--4081--3--6--n--6f01bfed3c-k8s-coredns--7d764666f9--rvpdv-eth0" Apr 24 23:37:27.658315 containerd[1519]: 2026-04-24 23:37:27.644 [INFO][4826] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 24 23:37:27.658315 containerd[1519]: 2026-04-24 23:37:27.650 [INFO][4793] cni-plugin/k8s.go 665: Teardown processing complete. 
ContainerID="fa684ee016e7b8f4346eadb33b70fe00c046fd186c92942b1887a6a0d224c38c" Apr 24 23:37:27.661508 containerd[1519]: time="2026-04-24T23:37:27.658480077Z" level=info msg="TearDown network for sandbox \"fa684ee016e7b8f4346eadb33b70fe00c046fd186c92942b1887a6a0d224c38c\" successfully" Apr 24 23:37:27.661508 containerd[1519]: time="2026-04-24T23:37:27.658492146Z" level=info msg="StopPodSandbox for \"fa684ee016e7b8f4346eadb33b70fe00c046fd186c92942b1887a6a0d224c38c\" returns successfully" Apr 24 23:37:27.661508 containerd[1519]: time="2026-04-24T23:37:27.660040086Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-b8b79f554-9kfjb,Uid:31b59c84-3007-40dc-82d8-2444bb185840,Namespace:calico-system,Attempt:1,}" Apr 24 23:37:27.661613 systemd-networkd[1422]: calif0e53319469: Gained IPv6LL Apr 24 23:37:27.663155 systemd[1]: run-netns-cni\x2d83e4edb8\x2df96b\x2d2855\x2daf4e\x2dcfef7f3c743b.mount: Deactivated successfully. Apr 24 23:37:27.663362 containerd[1519]: time="2026-04-24T23:37:27.663346731Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7d764666f9-rvpdv,Uid:df938cae-ab41-409b-8afe-e6b3147b7b45,Namespace:kube-system,Attempt:1,}" Apr 24 23:37:27.823003 systemd-networkd[1422]: cali3d2bab2982d: Link UP Apr 24 23:37:27.824517 systemd-networkd[1422]: cali3d2bab2982d: Gained carrier Apr 24 23:37:27.843085 containerd[1519]: 2026-04-24 23:37:27.712 [INFO][4842] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081--3--6--n--6f01bfed3c-k8s-calico--kube--controllers--5c6df7864b--wqqbk-eth0 calico-kube-controllers-5c6df7864b- calico-system 7a6dcb56-0ab0-4a8e-b283-aa8220fb3da6 966 0 2026-04-24 23:36:58 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:5c6df7864b projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s 
ci-4081-3-6-n-6f01bfed3c calico-kube-controllers-5c6df7864b-wqqbk eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali3d2bab2982d [] [] }} ContainerID="723800165301cd2f6e76d6f6bd32bf5db518de1461fb343c46809649a645c586" Namespace="calico-system" Pod="calico-kube-controllers-5c6df7864b-wqqbk" WorkloadEndpoint="ci--4081--3--6--n--6f01bfed3c-k8s-calico--kube--controllers--5c6df7864b--wqqbk-" Apr 24 23:37:27.843085 containerd[1519]: 2026-04-24 23:37:27.712 [INFO][4842] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="723800165301cd2f6e76d6f6bd32bf5db518de1461fb343c46809649a645c586" Namespace="calico-system" Pod="calico-kube-controllers-5c6df7864b-wqqbk" WorkloadEndpoint="ci--4081--3--6--n--6f01bfed3c-k8s-calico--kube--controllers--5c6df7864b--wqqbk-eth0" Apr 24 23:37:27.843085 containerd[1519]: 2026-04-24 23:37:27.756 [INFO][4880] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="723800165301cd2f6e76d6f6bd32bf5db518de1461fb343c46809649a645c586" HandleID="k8s-pod-network.723800165301cd2f6e76d6f6bd32bf5db518de1461fb343c46809649a645c586" Workload="ci--4081--3--6--n--6f01bfed3c-k8s-calico--kube--controllers--5c6df7864b--wqqbk-eth0" Apr 24 23:37:27.843085 containerd[1519]: 2026-04-24 23:37:27.763 [INFO][4880] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="723800165301cd2f6e76d6f6bd32bf5db518de1461fb343c46809649a645c586" HandleID="k8s-pod-network.723800165301cd2f6e76d6f6bd32bf5db518de1461fb343c46809649a645c586" Workload="ci--4081--3--6--n--6f01bfed3c-k8s-calico--kube--controllers--5c6df7864b--wqqbk-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000277ac0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4081-3-6-n-6f01bfed3c", "pod":"calico-kube-controllers-5c6df7864b-wqqbk", "timestamp":"2026-04-24 23:37:27.756604863 +0000 UTC"}, Hostname:"ci-4081-3-6-n-6f01bfed3c", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, 
MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc00026cc60)} Apr 24 23:37:27.843085 containerd[1519]: 2026-04-24 23:37:27.763 [INFO][4880] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 24 23:37:27.843085 containerd[1519]: 2026-04-24 23:37:27.763 [INFO][4880] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 24 23:37:27.843085 containerd[1519]: 2026-04-24 23:37:27.763 [INFO][4880] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081-3-6-n-6f01bfed3c' Apr 24 23:37:27.843085 containerd[1519]: 2026-04-24 23:37:27.767 [INFO][4880] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.723800165301cd2f6e76d6f6bd32bf5db518de1461fb343c46809649a645c586" host="ci-4081-3-6-n-6f01bfed3c" Apr 24 23:37:27.843085 containerd[1519]: 2026-04-24 23:37:27.774 [INFO][4880] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4081-3-6-n-6f01bfed3c" Apr 24 23:37:27.843085 containerd[1519]: 2026-04-24 23:37:27.778 [INFO][4880] ipam/ipam.go 526: Trying affinity for 192.168.35.64/26 host="ci-4081-3-6-n-6f01bfed3c" Apr 24 23:37:27.843085 containerd[1519]: 2026-04-24 23:37:27.780 [INFO][4880] ipam/ipam.go 160: Attempting to load block cidr=192.168.35.64/26 host="ci-4081-3-6-n-6f01bfed3c" Apr 24 23:37:27.843085 containerd[1519]: 2026-04-24 23:37:27.782 [INFO][4880] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.35.64/26 host="ci-4081-3-6-n-6f01bfed3c" Apr 24 23:37:27.843085 containerd[1519]: 2026-04-24 23:37:27.782 [INFO][4880] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.35.64/26 handle="k8s-pod-network.723800165301cd2f6e76d6f6bd32bf5db518de1461fb343c46809649a645c586" host="ci-4081-3-6-n-6f01bfed3c" Apr 24 23:37:27.843085 containerd[1519]: 2026-04-24 23:37:27.786 [INFO][4880] ipam/ipam.go 1806: Creating new handle: 
k8s-pod-network.723800165301cd2f6e76d6f6bd32bf5db518de1461fb343c46809649a645c586 Apr 24 23:37:27.843085 containerd[1519]: 2026-04-24 23:37:27.791 [INFO][4880] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.35.64/26 handle="k8s-pod-network.723800165301cd2f6e76d6f6bd32bf5db518de1461fb343c46809649a645c586" host="ci-4081-3-6-n-6f01bfed3c" Apr 24 23:37:27.843085 containerd[1519]: 2026-04-24 23:37:27.802 [INFO][4880] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.35.69/26] block=192.168.35.64/26 handle="k8s-pod-network.723800165301cd2f6e76d6f6bd32bf5db518de1461fb343c46809649a645c586" host="ci-4081-3-6-n-6f01bfed3c" Apr 24 23:37:27.843085 containerd[1519]: 2026-04-24 23:37:27.802 [INFO][4880] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.35.69/26] handle="k8s-pod-network.723800165301cd2f6e76d6f6bd32bf5db518de1461fb343c46809649a645c586" host="ci-4081-3-6-n-6f01bfed3c" Apr 24 23:37:27.843085 containerd[1519]: 2026-04-24 23:37:27.803 [INFO][4880] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. 
Apr 24 23:37:27.843085 containerd[1519]: 2026-04-24 23:37:27.803 [INFO][4880] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.35.69/26] IPv6=[] ContainerID="723800165301cd2f6e76d6f6bd32bf5db518de1461fb343c46809649a645c586" HandleID="k8s-pod-network.723800165301cd2f6e76d6f6bd32bf5db518de1461fb343c46809649a645c586" Workload="ci--4081--3--6--n--6f01bfed3c-k8s-calico--kube--controllers--5c6df7864b--wqqbk-eth0" Apr 24 23:37:27.843554 containerd[1519]: 2026-04-24 23:37:27.808 [INFO][4842] cni-plugin/k8s.go 418: Populated endpoint ContainerID="723800165301cd2f6e76d6f6bd32bf5db518de1461fb343c46809649a645c586" Namespace="calico-system" Pod="calico-kube-controllers-5c6df7864b-wqqbk" WorkloadEndpoint="ci--4081--3--6--n--6f01bfed3c-k8s-calico--kube--controllers--5c6df7864b--wqqbk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--6f01bfed3c-k8s-calico--kube--controllers--5c6df7864b--wqqbk-eth0", GenerateName:"calico-kube-controllers-5c6df7864b-", Namespace:"calico-system", SelfLink:"", UID:"7a6dcb56-0ab0-4a8e-b283-aa8220fb3da6", ResourceVersion:"966", Generation:0, CreationTimestamp:time.Date(2026, time.April, 24, 23, 36, 58, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"5c6df7864b", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-6f01bfed3c", ContainerID:"", Pod:"calico-kube-controllers-5c6df7864b-wqqbk", Endpoint:"eth0", 
ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.35.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali3d2bab2982d", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 24 23:37:27.843554 containerd[1519]: 2026-04-24 23:37:27.809 [INFO][4842] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.35.69/32] ContainerID="723800165301cd2f6e76d6f6bd32bf5db518de1461fb343c46809649a645c586" Namespace="calico-system" Pod="calico-kube-controllers-5c6df7864b-wqqbk" WorkloadEndpoint="ci--4081--3--6--n--6f01bfed3c-k8s-calico--kube--controllers--5c6df7864b--wqqbk-eth0" Apr 24 23:37:27.843554 containerd[1519]: 2026-04-24 23:37:27.809 [INFO][4842] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali3d2bab2982d ContainerID="723800165301cd2f6e76d6f6bd32bf5db518de1461fb343c46809649a645c586" Namespace="calico-system" Pod="calico-kube-controllers-5c6df7864b-wqqbk" WorkloadEndpoint="ci--4081--3--6--n--6f01bfed3c-k8s-calico--kube--controllers--5c6df7864b--wqqbk-eth0" Apr 24 23:37:27.843554 containerd[1519]: 2026-04-24 23:37:27.826 [INFO][4842] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="723800165301cd2f6e76d6f6bd32bf5db518de1461fb343c46809649a645c586" Namespace="calico-system" Pod="calico-kube-controllers-5c6df7864b-wqqbk" WorkloadEndpoint="ci--4081--3--6--n--6f01bfed3c-k8s-calico--kube--controllers--5c6df7864b--wqqbk-eth0" Apr 24 23:37:27.843554 containerd[1519]: 2026-04-24 23:37:27.827 [INFO][4842] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="723800165301cd2f6e76d6f6bd32bf5db518de1461fb343c46809649a645c586" Namespace="calico-system" Pod="calico-kube-controllers-5c6df7864b-wqqbk" 
WorkloadEndpoint="ci--4081--3--6--n--6f01bfed3c-k8s-calico--kube--controllers--5c6df7864b--wqqbk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--6f01bfed3c-k8s-calico--kube--controllers--5c6df7864b--wqqbk-eth0", GenerateName:"calico-kube-controllers-5c6df7864b-", Namespace:"calico-system", SelfLink:"", UID:"7a6dcb56-0ab0-4a8e-b283-aa8220fb3da6", ResourceVersion:"966", Generation:0, CreationTimestamp:time.Date(2026, time.April, 24, 23, 36, 58, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"5c6df7864b", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-6f01bfed3c", ContainerID:"723800165301cd2f6e76d6f6bd32bf5db518de1461fb343c46809649a645c586", Pod:"calico-kube-controllers-5c6df7864b-wqqbk", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.35.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali3d2bab2982d", MAC:"f2:f5:11:4b:9d:e5", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 24 23:37:27.843554 containerd[1519]: 2026-04-24 23:37:27.838 [INFO][4842] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="723800165301cd2f6e76d6f6bd32bf5db518de1461fb343c46809649a645c586" Namespace="calico-system" 
Pod="calico-kube-controllers-5c6df7864b-wqqbk" WorkloadEndpoint="ci--4081--3--6--n--6f01bfed3c-k8s-calico--kube--controllers--5c6df7864b--wqqbk-eth0" Apr 24 23:37:27.889691 containerd[1519]: time="2026-04-24T23:37:27.888606134Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Apr 24 23:37:27.889691 containerd[1519]: time="2026-04-24T23:37:27.888652057Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Apr 24 23:37:27.889691 containerd[1519]: time="2026-04-24T23:37:27.888659943Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 24 23:37:27.889691 containerd[1519]: time="2026-04-24T23:37:27.888734457Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 24 23:37:27.920970 systemd-networkd[1422]: cali84144e67349: Link UP Apr 24 23:37:27.924084 systemd-networkd[1422]: cali84144e67349: Gained carrier Apr 24 23:37:27.938729 systemd[1]: Started cri-containerd-723800165301cd2f6e76d6f6bd32bf5db518de1461fb343c46809649a645c586.scope - libcontainer container 723800165301cd2f6e76d6f6bd32bf5db518de1461fb343c46809649a645c586. 
Apr 24 23:37:27.947296 containerd[1519]: 2026-04-24 23:37:27.758 [INFO][4855] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081--3--6--n--6f01bfed3c-k8s-coredns--7d764666f9--rvpdv-eth0 coredns-7d764666f9- kube-system df938cae-ab41-409b-8afe-e6b3147b7b45 968 0 2026-04-24 23:36:48 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7d764666f9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4081-3-6-n-6f01bfed3c coredns-7d764666f9-rvpdv eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali84144e67349 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 } {liveness-probe TCP 8080 0 } {readiness-probe TCP 8181 0 }] [] }} ContainerID="68112725d8ccc659c2a807c2feae3b6eb012640c8ef52b9dbd5a998af67ddb01" Namespace="kube-system" Pod="coredns-7d764666f9-rvpdv" WorkloadEndpoint="ci--4081--3--6--n--6f01bfed3c-k8s-coredns--7d764666f9--rvpdv-" Apr 24 23:37:27.947296 containerd[1519]: 2026-04-24 23:37:27.759 [INFO][4855] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="68112725d8ccc659c2a807c2feae3b6eb012640c8ef52b9dbd5a998af67ddb01" Namespace="kube-system" Pod="coredns-7d764666f9-rvpdv" WorkloadEndpoint="ci--4081--3--6--n--6f01bfed3c-k8s-coredns--7d764666f9--rvpdv-eth0" Apr 24 23:37:27.947296 containerd[1519]: 2026-04-24 23:37:27.802 [INFO][4893] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="68112725d8ccc659c2a807c2feae3b6eb012640c8ef52b9dbd5a998af67ddb01" HandleID="k8s-pod-network.68112725d8ccc659c2a807c2feae3b6eb012640c8ef52b9dbd5a998af67ddb01" Workload="ci--4081--3--6--n--6f01bfed3c-k8s-coredns--7d764666f9--rvpdv-eth0" Apr 24 23:37:27.947296 containerd[1519]: 2026-04-24 23:37:27.818 [INFO][4893] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="68112725d8ccc659c2a807c2feae3b6eb012640c8ef52b9dbd5a998af67ddb01" 
HandleID="k8s-pod-network.68112725d8ccc659c2a807c2feae3b6eb012640c8ef52b9dbd5a998af67ddb01" Workload="ci--4081--3--6--n--6f01bfed3c-k8s-coredns--7d764666f9--rvpdv-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000380650), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4081-3-6-n-6f01bfed3c", "pod":"coredns-7d764666f9-rvpdv", "timestamp":"2026-04-24 23:37:27.802953124 +0000 UTC"}, Hostname:"ci-4081-3-6-n-6f01bfed3c", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc0001886e0)} Apr 24 23:37:27.947296 containerd[1519]: 2026-04-24 23:37:27.818 [INFO][4893] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 24 23:37:27.947296 containerd[1519]: 2026-04-24 23:37:27.818 [INFO][4893] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 24 23:37:27.947296 containerd[1519]: 2026-04-24 23:37:27.818 [INFO][4893] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081-3-6-n-6f01bfed3c' Apr 24 23:37:27.947296 containerd[1519]: 2026-04-24 23:37:27.867 [INFO][4893] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.68112725d8ccc659c2a807c2feae3b6eb012640c8ef52b9dbd5a998af67ddb01" host="ci-4081-3-6-n-6f01bfed3c" Apr 24 23:37:27.947296 containerd[1519]: 2026-04-24 23:37:27.873 [INFO][4893] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4081-3-6-n-6f01bfed3c" Apr 24 23:37:27.947296 containerd[1519]: 2026-04-24 23:37:27.880 [INFO][4893] ipam/ipam.go 526: Trying affinity for 192.168.35.64/26 host="ci-4081-3-6-n-6f01bfed3c" Apr 24 23:37:27.947296 containerd[1519]: 2026-04-24 23:37:27.882 [INFO][4893] ipam/ipam.go 160: Attempting to load block cidr=192.168.35.64/26 host="ci-4081-3-6-n-6f01bfed3c" Apr 24 23:37:27.947296 containerd[1519]: 2026-04-24 23:37:27.885 [INFO][4893] ipam/ipam.go 
237: Affinity is confirmed and block has been loaded cidr=192.168.35.64/26 host="ci-4081-3-6-n-6f01bfed3c" Apr 24 23:37:27.947296 containerd[1519]: 2026-04-24 23:37:27.886 [INFO][4893] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.35.64/26 handle="k8s-pod-network.68112725d8ccc659c2a807c2feae3b6eb012640c8ef52b9dbd5a998af67ddb01" host="ci-4081-3-6-n-6f01bfed3c" Apr 24 23:37:27.947296 containerd[1519]: 2026-04-24 23:37:27.888 [INFO][4893] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.68112725d8ccc659c2a807c2feae3b6eb012640c8ef52b9dbd5a998af67ddb01 Apr 24 23:37:27.947296 containerd[1519]: 2026-04-24 23:37:27.894 [INFO][4893] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.35.64/26 handle="k8s-pod-network.68112725d8ccc659c2a807c2feae3b6eb012640c8ef52b9dbd5a998af67ddb01" host="ci-4081-3-6-n-6f01bfed3c" Apr 24 23:37:27.947296 containerd[1519]: 2026-04-24 23:37:27.902 [INFO][4893] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.35.70/26] block=192.168.35.64/26 handle="k8s-pod-network.68112725d8ccc659c2a807c2feae3b6eb012640c8ef52b9dbd5a998af67ddb01" host="ci-4081-3-6-n-6f01bfed3c" Apr 24 23:37:27.947296 containerd[1519]: 2026-04-24 23:37:27.902 [INFO][4893] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.35.70/26] handle="k8s-pod-network.68112725d8ccc659c2a807c2feae3b6eb012640c8ef52b9dbd5a998af67ddb01" host="ci-4081-3-6-n-6f01bfed3c" Apr 24 23:37:27.947296 containerd[1519]: 2026-04-24 23:37:27.902 [INFO][4893] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. 
Apr 24 23:37:27.947296 containerd[1519]: 2026-04-24 23:37:27.903 [INFO][4893] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.35.70/26] IPv6=[] ContainerID="68112725d8ccc659c2a807c2feae3b6eb012640c8ef52b9dbd5a998af67ddb01" HandleID="k8s-pod-network.68112725d8ccc659c2a807c2feae3b6eb012640c8ef52b9dbd5a998af67ddb01" Workload="ci--4081--3--6--n--6f01bfed3c-k8s-coredns--7d764666f9--rvpdv-eth0" Apr 24 23:37:27.947800 containerd[1519]: 2026-04-24 23:37:27.908 [INFO][4855] cni-plugin/k8s.go 418: Populated endpoint ContainerID="68112725d8ccc659c2a807c2feae3b6eb012640c8ef52b9dbd5a998af67ddb01" Namespace="kube-system" Pod="coredns-7d764666f9-rvpdv" WorkloadEndpoint="ci--4081--3--6--n--6f01bfed3c-k8s-coredns--7d764666f9--rvpdv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--6f01bfed3c-k8s-coredns--7d764666f9--rvpdv-eth0", GenerateName:"coredns-7d764666f9-", Namespace:"kube-system", SelfLink:"", UID:"df938cae-ab41-409b-8afe-e6b3147b7b45", ResourceVersion:"968", Generation:0, CreationTimestamp:time.Date(2026, time.April, 24, 23, 36, 48, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7d764666f9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-6f01bfed3c", ContainerID:"", Pod:"coredns-7d764666f9-rvpdv", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.35.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, 
InterfaceName:"cali84144e67349", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 24 23:37:27.947800 containerd[1519]: 2026-04-24 23:37:27.909 [INFO][4855] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.35.70/32] ContainerID="68112725d8ccc659c2a807c2feae3b6eb012640c8ef52b9dbd5a998af67ddb01" Namespace="kube-system" Pod="coredns-7d764666f9-rvpdv" WorkloadEndpoint="ci--4081--3--6--n--6f01bfed3c-k8s-coredns--7d764666f9--rvpdv-eth0" Apr 24 23:37:27.947800 containerd[1519]: 2026-04-24 23:37:27.909 [INFO][4855] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali84144e67349 ContainerID="68112725d8ccc659c2a807c2feae3b6eb012640c8ef52b9dbd5a998af67ddb01" Namespace="kube-system" Pod="coredns-7d764666f9-rvpdv" WorkloadEndpoint="ci--4081--3--6--n--6f01bfed3c-k8s-coredns--7d764666f9--rvpdv-eth0" Apr 24 23:37:27.947800 containerd[1519]: 2026-04-24 23:37:27.923 [INFO][4855] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="68112725d8ccc659c2a807c2feae3b6eb012640c8ef52b9dbd5a998af67ddb01" Namespace="kube-system" Pod="coredns-7d764666f9-rvpdv" WorkloadEndpoint="ci--4081--3--6--n--6f01bfed3c-k8s-coredns--7d764666f9--rvpdv-eth0" Apr 24 
23:37:27.947800 containerd[1519]: 2026-04-24 23:37:27.925 [INFO][4855] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="68112725d8ccc659c2a807c2feae3b6eb012640c8ef52b9dbd5a998af67ddb01" Namespace="kube-system" Pod="coredns-7d764666f9-rvpdv" WorkloadEndpoint="ci--4081--3--6--n--6f01bfed3c-k8s-coredns--7d764666f9--rvpdv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--6f01bfed3c-k8s-coredns--7d764666f9--rvpdv-eth0", GenerateName:"coredns-7d764666f9-", Namespace:"kube-system", SelfLink:"", UID:"df938cae-ab41-409b-8afe-e6b3147b7b45", ResourceVersion:"968", Generation:0, CreationTimestamp:time.Date(2026, time.April, 24, 23, 36, 48, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7d764666f9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-6f01bfed3c", ContainerID:"68112725d8ccc659c2a807c2feae3b6eb012640c8ef52b9dbd5a998af67ddb01", Pod:"coredns-7d764666f9-rvpdv", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.35.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali84144e67349", MAC:"4a:05:74:e6:00:e0", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, 
Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 24 23:37:27.947933 containerd[1519]: 2026-04-24 23:37:27.944 [INFO][4855] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="68112725d8ccc659c2a807c2feae3b6eb012640c8ef52b9dbd5a998af67ddb01" Namespace="kube-system" Pod="coredns-7d764666f9-rvpdv" WorkloadEndpoint="ci--4081--3--6--n--6f01bfed3c-k8s-coredns--7d764666f9--rvpdv-eth0" Apr 24 23:37:27.950660 containerd[1519]: time="2026-04-24T23:37:27.949968377Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 24 23:37:27.951404 containerd[1519]: time="2026-04-24T23:37:27.951370582Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.31.4: active requests=0, bytes read=8792502" Apr 24 23:37:27.952243 containerd[1519]: time="2026-04-24T23:37:27.952175869Z" level=info msg="ImageCreate event name:\"sha256:4c8cd7d0b10a4df64a5bd90e9845e9d1edbe0e37c2ebfc171bb28698e07abf72\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 24 23:37:27.955557 containerd[1519]: time="2026-04-24T23:37:27.955522963Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:ab57dd6f8423ef7b3ff382bf4ca5ace6063bdca77d441d852c75ec58847dd280\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 24 23:37:27.956428 containerd[1519]: time="2026-04-24T23:37:27.956398013Z" level=info msg="Pulled image 
\"ghcr.io/flatcar/calico/csi:v3.31.4\" with image id \"sha256:4c8cd7d0b10a4df64a5bd90e9845e9d1edbe0e37c2ebfc171bb28698e07abf72\", repo tag \"ghcr.io/flatcar/calico/csi:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:ab57dd6f8423ef7b3ff382bf4ca5ace6063bdca77d441d852c75ec58847dd280\", size \"10348547\" in 2.109413625s" Apr 24 23:37:27.956495 containerd[1519]: time="2026-04-24T23:37:27.956484586Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.31.4\" returns image reference \"sha256:4c8cd7d0b10a4df64a5bd90e9845e9d1edbe0e37c2ebfc171bb28698e07abf72\"" Apr 24 23:37:27.958575 containerd[1519]: time="2026-04-24T23:37:27.958557309Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.31.4\"" Apr 24 23:37:27.960579 containerd[1519]: time="2026-04-24T23:37:27.960561462Z" level=info msg="CreateContainer within sandbox \"4240fcaef762194107e0245631c641a0ff453eba7640d45f73bbcc2947358682\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Apr 24 23:37:27.992489 containerd[1519]: time="2026-04-24T23:37:27.990708214Z" level=info msg="CreateContainer within sandbox \"4240fcaef762194107e0245631c641a0ff453eba7640d45f73bbcc2947358682\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"cd000ba1dc2e990e3d5d2d92e56f03de796ebd8114283a43da771c1e9bf80a04\"" Apr 24 23:37:27.992853 containerd[1519]: time="2026-04-24T23:37:27.992828272Z" level=info msg="StartContainer for \"cd000ba1dc2e990e3d5d2d92e56f03de796ebd8114283a43da771c1e9bf80a04\"" Apr 24 23:37:28.002023 containerd[1519]: time="2026-04-24T23:37:28.001920985Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Apr 24 23:37:28.002224 containerd[1519]: time="2026-04-24T23:37:28.002089392Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Apr 24 23:37:28.003644 containerd[1519]: time="2026-04-24T23:37:28.002125475Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 24 23:37:28.003644 containerd[1519]: time="2026-04-24T23:37:28.003111215Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 24 23:37:28.027186 systemd-networkd[1422]: calib3bbacbc562: Link UP Apr 24 23:37:28.033370 systemd-networkd[1422]: calib3bbacbc562: Gained carrier Apr 24 23:37:28.049100 systemd[1]: Started cri-containerd-cd000ba1dc2e990e3d5d2d92e56f03de796ebd8114283a43da771c1e9bf80a04.scope - libcontainer container cd000ba1dc2e990e3d5d2d92e56f03de796ebd8114283a43da771c1e9bf80a04. Apr 24 23:37:28.069808 systemd[1]: Started cri-containerd-68112725d8ccc659c2a807c2feae3b6eb012640c8ef52b9dbd5a998af67ddb01.scope - libcontainer container 68112725d8ccc659c2a807c2feae3b6eb012640c8ef52b9dbd5a998af67ddb01. 
Apr 24 23:37:28.076393 containerd[1519]: 2026-04-24 23:37:27.757 [INFO][4856] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081--3--6--n--6f01bfed3c-k8s-calico--apiserver--b8b79f554--9kfjb-eth0 calico-apiserver-b8b79f554- calico-system 31b59c84-3007-40dc-82d8-2444bb185840 967 0 2026-04-24 23:36:57 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:b8b79f554 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4081-3-6-n-6f01bfed3c calico-apiserver-b8b79f554-9kfjb eth0 calico-apiserver [] [] [kns.calico-system ksa.calico-system.calico-apiserver] calib3bbacbc562 [] [] }} ContainerID="d47185e88446e6d54550302d8610a3b3b106b6ac5353fad3cc18276ea9c746d4" Namespace="calico-system" Pod="calico-apiserver-b8b79f554-9kfjb" WorkloadEndpoint="ci--4081--3--6--n--6f01bfed3c-k8s-calico--apiserver--b8b79f554--9kfjb-" Apr 24 23:37:28.076393 containerd[1519]: 2026-04-24 23:37:27.758 [INFO][4856] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="d47185e88446e6d54550302d8610a3b3b106b6ac5353fad3cc18276ea9c746d4" Namespace="calico-system" Pod="calico-apiserver-b8b79f554-9kfjb" WorkloadEndpoint="ci--4081--3--6--n--6f01bfed3c-k8s-calico--apiserver--b8b79f554--9kfjb-eth0" Apr 24 23:37:28.076393 containerd[1519]: 2026-04-24 23:37:27.809 [INFO][4891] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="d47185e88446e6d54550302d8610a3b3b106b6ac5353fad3cc18276ea9c746d4" HandleID="k8s-pod-network.d47185e88446e6d54550302d8610a3b3b106b6ac5353fad3cc18276ea9c746d4" Workload="ci--4081--3--6--n--6f01bfed3c-k8s-calico--apiserver--b8b79f554--9kfjb-eth0" Apr 24 23:37:28.076393 containerd[1519]: 2026-04-24 23:37:27.825 [INFO][4891] ipam/ipam_plugin.go 301: Auto assigning IP 
ContainerID="d47185e88446e6d54550302d8610a3b3b106b6ac5353fad3cc18276ea9c746d4" HandleID="k8s-pod-network.d47185e88446e6d54550302d8610a3b3b106b6ac5353fad3cc18276ea9c746d4" Workload="ci--4081--3--6--n--6f01bfed3c-k8s-calico--apiserver--b8b79f554--9kfjb-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004e490), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4081-3-6-n-6f01bfed3c", "pod":"calico-apiserver-b8b79f554-9kfjb", "timestamp":"2026-04-24 23:37:27.809235091 +0000 UTC"}, Hostname:"ci-4081-3-6-n-6f01bfed3c", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc0002e4dc0)} Apr 24 23:37:28.076393 containerd[1519]: 2026-04-24 23:37:27.825 [INFO][4891] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 24 23:37:28.076393 containerd[1519]: 2026-04-24 23:37:27.903 [INFO][4891] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Apr 24 23:37:28.076393 containerd[1519]: 2026-04-24 23:37:27.903 [INFO][4891] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081-3-6-n-6f01bfed3c' Apr 24 23:37:28.076393 containerd[1519]: 2026-04-24 23:37:27.967 [INFO][4891] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.d47185e88446e6d54550302d8610a3b3b106b6ac5353fad3cc18276ea9c746d4" host="ci-4081-3-6-n-6f01bfed3c" Apr 24 23:37:28.076393 containerd[1519]: 2026-04-24 23:37:27.975 [INFO][4891] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4081-3-6-n-6f01bfed3c" Apr 24 23:37:28.076393 containerd[1519]: 2026-04-24 23:37:27.981 [INFO][4891] ipam/ipam.go 526: Trying affinity for 192.168.35.64/26 host="ci-4081-3-6-n-6f01bfed3c" Apr 24 23:37:28.076393 containerd[1519]: 2026-04-24 23:37:27.983 [INFO][4891] ipam/ipam.go 160: Attempting to load block cidr=192.168.35.64/26 host="ci-4081-3-6-n-6f01bfed3c" Apr 24 23:37:28.076393 containerd[1519]: 2026-04-24 23:37:27.986 [INFO][4891] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.35.64/26 host="ci-4081-3-6-n-6f01bfed3c" Apr 24 23:37:28.076393 containerd[1519]: 2026-04-24 23:37:27.987 [INFO][4891] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.35.64/26 handle="k8s-pod-network.d47185e88446e6d54550302d8610a3b3b106b6ac5353fad3cc18276ea9c746d4" host="ci-4081-3-6-n-6f01bfed3c" Apr 24 23:37:28.076393 containerd[1519]: 2026-04-24 23:37:27.989 [INFO][4891] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.d47185e88446e6d54550302d8610a3b3b106b6ac5353fad3cc18276ea9c746d4 Apr 24 23:37:28.076393 containerd[1519]: 2026-04-24 23:37:27.997 [INFO][4891] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.35.64/26 handle="k8s-pod-network.d47185e88446e6d54550302d8610a3b3b106b6ac5353fad3cc18276ea9c746d4" host="ci-4081-3-6-n-6f01bfed3c" Apr 24 23:37:28.076393 containerd[1519]: 2026-04-24 23:37:28.005 [INFO][4891] ipam/ipam.go 1288: 
Successfully claimed IPs: [192.168.35.71/26] block=192.168.35.64/26 handle="k8s-pod-network.d47185e88446e6d54550302d8610a3b3b106b6ac5353fad3cc18276ea9c746d4" host="ci-4081-3-6-n-6f01bfed3c" Apr 24 23:37:28.076393 containerd[1519]: 2026-04-24 23:37:28.005 [INFO][4891] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.35.71/26] handle="k8s-pod-network.d47185e88446e6d54550302d8610a3b3b106b6ac5353fad3cc18276ea9c746d4" host="ci-4081-3-6-n-6f01bfed3c" Apr 24 23:37:28.076393 containerd[1519]: 2026-04-24 23:37:28.005 [INFO][4891] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 24 23:37:28.076393 containerd[1519]: 2026-04-24 23:37:28.005 [INFO][4891] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.35.71/26] IPv6=[] ContainerID="d47185e88446e6d54550302d8610a3b3b106b6ac5353fad3cc18276ea9c746d4" HandleID="k8s-pod-network.d47185e88446e6d54550302d8610a3b3b106b6ac5353fad3cc18276ea9c746d4" Workload="ci--4081--3--6--n--6f01bfed3c-k8s-calico--apiserver--b8b79f554--9kfjb-eth0" Apr 24 23:37:28.076930 containerd[1519]: 2026-04-24 23:37:28.019 [INFO][4856] cni-plugin/k8s.go 418: Populated endpoint ContainerID="d47185e88446e6d54550302d8610a3b3b106b6ac5353fad3cc18276ea9c746d4" Namespace="calico-system" Pod="calico-apiserver-b8b79f554-9kfjb" WorkloadEndpoint="ci--4081--3--6--n--6f01bfed3c-k8s-calico--apiserver--b8b79f554--9kfjb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--6f01bfed3c-k8s-calico--apiserver--b8b79f554--9kfjb-eth0", GenerateName:"calico-apiserver-b8b79f554-", Namespace:"calico-system", SelfLink:"", UID:"31b59c84-3007-40dc-82d8-2444bb185840", ResourceVersion:"967", Generation:0, CreationTimestamp:time.Date(2026, time.April, 24, 23, 36, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", 
"k8s-app":"calico-apiserver", "pod-template-hash":"b8b79f554", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-6f01bfed3c", ContainerID:"", Pod:"calico-apiserver-b8b79f554-9kfjb", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.35.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"calib3bbacbc562", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 24 23:37:28.076930 containerd[1519]: 2026-04-24 23:37:28.019 [INFO][4856] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.35.71/32] ContainerID="d47185e88446e6d54550302d8610a3b3b106b6ac5353fad3cc18276ea9c746d4" Namespace="calico-system" Pod="calico-apiserver-b8b79f554-9kfjb" WorkloadEndpoint="ci--4081--3--6--n--6f01bfed3c-k8s-calico--apiserver--b8b79f554--9kfjb-eth0" Apr 24 23:37:28.076930 containerd[1519]: 2026-04-24 23:37:28.019 [INFO][4856] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calib3bbacbc562 ContainerID="d47185e88446e6d54550302d8610a3b3b106b6ac5353fad3cc18276ea9c746d4" Namespace="calico-system" Pod="calico-apiserver-b8b79f554-9kfjb" WorkloadEndpoint="ci--4081--3--6--n--6f01bfed3c-k8s-calico--apiserver--b8b79f554--9kfjb-eth0" Apr 24 23:37:28.076930 containerd[1519]: 2026-04-24 23:37:28.035 [INFO][4856] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="d47185e88446e6d54550302d8610a3b3b106b6ac5353fad3cc18276ea9c746d4" Namespace="calico-system" Pod="calico-apiserver-b8b79f554-9kfjb" 
WorkloadEndpoint="ci--4081--3--6--n--6f01bfed3c-k8s-calico--apiserver--b8b79f554--9kfjb-eth0" Apr 24 23:37:28.076930 containerd[1519]: 2026-04-24 23:37:28.044 [INFO][4856] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="d47185e88446e6d54550302d8610a3b3b106b6ac5353fad3cc18276ea9c746d4" Namespace="calico-system" Pod="calico-apiserver-b8b79f554-9kfjb" WorkloadEndpoint="ci--4081--3--6--n--6f01bfed3c-k8s-calico--apiserver--b8b79f554--9kfjb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--6f01bfed3c-k8s-calico--apiserver--b8b79f554--9kfjb-eth0", GenerateName:"calico-apiserver-b8b79f554-", Namespace:"calico-system", SelfLink:"", UID:"31b59c84-3007-40dc-82d8-2444bb185840", ResourceVersion:"967", Generation:0, CreationTimestamp:time.Date(2026, time.April, 24, 23, 36, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"b8b79f554", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-6f01bfed3c", ContainerID:"d47185e88446e6d54550302d8610a3b3b106b6ac5353fad3cc18276ea9c746d4", Pod:"calico-apiserver-b8b79f554-9kfjb", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.35.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"calib3bbacbc562", MAC:"8a:c2:6d:a4:72:d2", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 24 23:37:28.076930 containerd[1519]: 2026-04-24 23:37:28.062 [INFO][4856] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="d47185e88446e6d54550302d8610a3b3b106b6ac5353fad3cc18276ea9c746d4" Namespace="calico-system" Pod="calico-apiserver-b8b79f554-9kfjb" WorkloadEndpoint="ci--4081--3--6--n--6f01bfed3c-k8s-calico--apiserver--b8b79f554--9kfjb-eth0" Apr 24 23:37:28.094093 containerd[1519]: time="2026-04-24T23:37:28.092490052Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5c6df7864b-wqqbk,Uid:7a6dcb56-0ab0-4a8e-b283-aa8220fb3da6,Namespace:calico-system,Attempt:1,} returns sandbox id \"723800165301cd2f6e76d6f6bd32bf5db518de1461fb343c46809649a645c586\"" Apr 24 23:37:28.118776 containerd[1519]: time="2026-04-24T23:37:28.118732612Z" level=info msg="StartContainer for \"cd000ba1dc2e990e3d5d2d92e56f03de796ebd8114283a43da771c1e9bf80a04\" returns successfully" Apr 24 23:37:28.127734 containerd[1519]: time="2026-04-24T23:37:28.127654173Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Apr 24 23:37:28.127734 containerd[1519]: time="2026-04-24T23:37:28.127698201Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Apr 24 23:37:28.127734 containerd[1519]: time="2026-04-24T23:37:28.127708568Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 24 23:37:28.128334 containerd[1519]: time="2026-04-24T23:37:28.128285507Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 24 23:37:28.144774 containerd[1519]: time="2026-04-24T23:37:28.144629141Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7d764666f9-rvpdv,Uid:df938cae-ab41-409b-8afe-e6b3147b7b45,Namespace:kube-system,Attempt:1,} returns sandbox id \"68112725d8ccc659c2a807c2feae3b6eb012640c8ef52b9dbd5a998af67ddb01\"" Apr 24 23:37:28.153948 systemd[1]: Started cri-containerd-d47185e88446e6d54550302d8610a3b3b106b6ac5353fad3cc18276ea9c746d4.scope - libcontainer container d47185e88446e6d54550302d8610a3b3b106b6ac5353fad3cc18276ea9c746d4. Apr 24 23:37:28.154991 containerd[1519]: time="2026-04-24T23:37:28.154764818Z" level=info msg="CreateContainer within sandbox \"68112725d8ccc659c2a807c2feae3b6eb012640c8ef52b9dbd5a998af67ddb01\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Apr 24 23:37:28.169370 containerd[1519]: time="2026-04-24T23:37:28.169340793Z" level=info msg="CreateContainer within sandbox \"68112725d8ccc659c2a807c2feae3b6eb012640c8ef52b9dbd5a998af67ddb01\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"b73e3e9befc1d68e71ace7baced28cfbfc2e7249e97ec5dce4429d4a30aabfea\"" Apr 24 23:37:28.170773 containerd[1519]: time="2026-04-24T23:37:28.170740647Z" level=info msg="StartContainer for \"b73e3e9befc1d68e71ace7baced28cfbfc2e7249e97ec5dce4429d4a30aabfea\"" Apr 24 23:37:28.213062 systemd[1]: Started cri-containerd-b73e3e9befc1d68e71ace7baced28cfbfc2e7249e97ec5dce4429d4a30aabfea.scope - libcontainer container b73e3e9befc1d68e71ace7baced28cfbfc2e7249e97ec5dce4429d4a30aabfea. 
Apr 24 23:37:28.220035 containerd[1519]: time="2026-04-24T23:37:28.219945642Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-b8b79f554-9kfjb,Uid:31b59c84-3007-40dc-82d8-2444bb185840,Namespace:calico-system,Attempt:1,} returns sandbox id \"d47185e88446e6d54550302d8610a3b3b106b6ac5353fad3cc18276ea9c746d4\"" Apr 24 23:37:28.238730 containerd[1519]: time="2026-04-24T23:37:28.238670848Z" level=info msg="StartContainer for \"b73e3e9befc1d68e71ace7baced28cfbfc2e7249e97ec5dce4429d4a30aabfea\" returns successfully" Apr 24 23:37:28.477946 containerd[1519]: time="2026-04-24T23:37:28.477844095Z" level=info msg="StopPodSandbox for \"436f63d90010a4b07bfa47b4249043414bf5ab13c65f0dbb63f7a35e4c91a1a7\"" Apr 24 23:37:28.580950 containerd[1519]: 2026-04-24 23:37:28.541 [INFO][5157] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="436f63d90010a4b07bfa47b4249043414bf5ab13c65f0dbb63f7a35e4c91a1a7" Apr 24 23:37:28.580950 containerd[1519]: 2026-04-24 23:37:28.541 [INFO][5157] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="436f63d90010a4b07bfa47b4249043414bf5ab13c65f0dbb63f7a35e4c91a1a7" iface="eth0" netns="/var/run/netns/cni-d2d0ef64-7958-693a-2dd1-2a2ccf4d0d94" Apr 24 23:37:28.580950 containerd[1519]: 2026-04-24 23:37:28.542 [INFO][5157] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="436f63d90010a4b07bfa47b4249043414bf5ab13c65f0dbb63f7a35e4c91a1a7" iface="eth0" netns="/var/run/netns/cni-d2d0ef64-7958-693a-2dd1-2a2ccf4d0d94" Apr 24 23:37:28.580950 containerd[1519]: 2026-04-24 23:37:28.543 [INFO][5157] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="436f63d90010a4b07bfa47b4249043414bf5ab13c65f0dbb63f7a35e4c91a1a7" iface="eth0" netns="/var/run/netns/cni-d2d0ef64-7958-693a-2dd1-2a2ccf4d0d94" Apr 24 23:37:28.580950 containerd[1519]: 2026-04-24 23:37:28.543 [INFO][5157] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="436f63d90010a4b07bfa47b4249043414bf5ab13c65f0dbb63f7a35e4c91a1a7" Apr 24 23:37:28.580950 containerd[1519]: 2026-04-24 23:37:28.543 [INFO][5157] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="436f63d90010a4b07bfa47b4249043414bf5ab13c65f0dbb63f7a35e4c91a1a7" Apr 24 23:37:28.580950 containerd[1519]: 2026-04-24 23:37:28.569 [INFO][5164] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="436f63d90010a4b07bfa47b4249043414bf5ab13c65f0dbb63f7a35e4c91a1a7" HandleID="k8s-pod-network.436f63d90010a4b07bfa47b4249043414bf5ab13c65f0dbb63f7a35e4c91a1a7" Workload="ci--4081--3--6--n--6f01bfed3c-k8s-calico--apiserver--b8b79f554--vhwl6-eth0" Apr 24 23:37:28.580950 containerd[1519]: 2026-04-24 23:37:28.569 [INFO][5164] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 24 23:37:28.580950 containerd[1519]: 2026-04-24 23:37:28.569 [INFO][5164] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 24 23:37:28.580950 containerd[1519]: 2026-04-24 23:37:28.575 [WARNING][5164] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="436f63d90010a4b07bfa47b4249043414bf5ab13c65f0dbb63f7a35e4c91a1a7" HandleID="k8s-pod-network.436f63d90010a4b07bfa47b4249043414bf5ab13c65f0dbb63f7a35e4c91a1a7" Workload="ci--4081--3--6--n--6f01bfed3c-k8s-calico--apiserver--b8b79f554--vhwl6-eth0" Apr 24 23:37:28.580950 containerd[1519]: 2026-04-24 23:37:28.575 [INFO][5164] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="436f63d90010a4b07bfa47b4249043414bf5ab13c65f0dbb63f7a35e4c91a1a7" HandleID="k8s-pod-network.436f63d90010a4b07bfa47b4249043414bf5ab13c65f0dbb63f7a35e4c91a1a7" Workload="ci--4081--3--6--n--6f01bfed3c-k8s-calico--apiserver--b8b79f554--vhwl6-eth0" Apr 24 23:37:28.580950 containerd[1519]: 2026-04-24 23:37:28.576 [INFO][5164] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 24 23:37:28.580950 containerd[1519]: 2026-04-24 23:37:28.578 [INFO][5157] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="436f63d90010a4b07bfa47b4249043414bf5ab13c65f0dbb63f7a35e4c91a1a7" Apr 24 23:37:28.581648 containerd[1519]: time="2026-04-24T23:37:28.581608443Z" level=info msg="TearDown network for sandbox \"436f63d90010a4b07bfa47b4249043414bf5ab13c65f0dbb63f7a35e4c91a1a7\" successfully" Apr 24 23:37:28.581648 containerd[1519]: time="2026-04-24T23:37:28.581633421Z" level=info msg="StopPodSandbox for \"436f63d90010a4b07bfa47b4249043414bf5ab13c65f0dbb63f7a35e4c91a1a7\" returns successfully" Apr 24 23:37:28.583881 containerd[1519]: time="2026-04-24T23:37:28.583854142Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-b8b79f554-vhwl6,Uid:7bc41b28-ad74-4dc3-b94f-b24620d3a9ff,Namespace:calico-system,Attempt:1,}" Apr 24 23:37:28.632874 systemd[1]: run-netns-cni\x2dd2d0ef64\x2d7958\x2d693a\x2d2dd1\x2d2a2ccf4d0d94.mount: Deactivated successfully. 
Apr 24 23:37:28.679152 systemd-networkd[1422]: cali6740876887e: Link UP Apr 24 23:37:28.681745 systemd-networkd[1422]: cali6740876887e: Gained carrier Apr 24 23:37:28.697838 containerd[1519]: 2026-04-24 23:37:28.619 [INFO][5170] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081--3--6--n--6f01bfed3c-k8s-calico--apiserver--b8b79f554--vhwl6-eth0 calico-apiserver-b8b79f554- calico-system 7bc41b28-ad74-4dc3-b94f-b24620d3a9ff 989 0 2026-04-24 23:36:57 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:b8b79f554 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4081-3-6-n-6f01bfed3c calico-apiserver-b8b79f554-vhwl6 eth0 calico-apiserver [] [] [kns.calico-system ksa.calico-system.calico-apiserver] cali6740876887e [] [] }} ContainerID="239aaa0ebe297dae632dcfa1e9fc6044076d42ea0b7eb1b966e95e775dea887e" Namespace="calico-system" Pod="calico-apiserver-b8b79f554-vhwl6" WorkloadEndpoint="ci--4081--3--6--n--6f01bfed3c-k8s-calico--apiserver--b8b79f554--vhwl6-" Apr 24 23:37:28.697838 containerd[1519]: 2026-04-24 23:37:28.619 [INFO][5170] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="239aaa0ebe297dae632dcfa1e9fc6044076d42ea0b7eb1b966e95e775dea887e" Namespace="calico-system" Pod="calico-apiserver-b8b79f554-vhwl6" WorkloadEndpoint="ci--4081--3--6--n--6f01bfed3c-k8s-calico--apiserver--b8b79f554--vhwl6-eth0" Apr 24 23:37:28.697838 containerd[1519]: 2026-04-24 23:37:28.648 [INFO][5182] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="239aaa0ebe297dae632dcfa1e9fc6044076d42ea0b7eb1b966e95e775dea887e" HandleID="k8s-pod-network.239aaa0ebe297dae632dcfa1e9fc6044076d42ea0b7eb1b966e95e775dea887e" Workload="ci--4081--3--6--n--6f01bfed3c-k8s-calico--apiserver--b8b79f554--vhwl6-eth0" Apr 24 23:37:28.697838 
containerd[1519]: 2026-04-24 23:37:28.653 [INFO][5182] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="239aaa0ebe297dae632dcfa1e9fc6044076d42ea0b7eb1b966e95e775dea887e" HandleID="k8s-pod-network.239aaa0ebe297dae632dcfa1e9fc6044076d42ea0b7eb1b966e95e775dea887e" Workload="ci--4081--3--6--n--6f01bfed3c-k8s-calico--apiserver--b8b79f554--vhwl6-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002fd110), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4081-3-6-n-6f01bfed3c", "pod":"calico-apiserver-b8b79f554-vhwl6", "timestamp":"2026-04-24 23:37:28.648528279 +0000 UTC"}, Hostname:"ci-4081-3-6-n-6f01bfed3c", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc00022a420)} Apr 24 23:37:28.697838 containerd[1519]: 2026-04-24 23:37:28.653 [INFO][5182] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 24 23:37:28.697838 containerd[1519]: 2026-04-24 23:37:28.653 [INFO][5182] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Apr 24 23:37:28.697838 containerd[1519]: 2026-04-24 23:37:28.653 [INFO][5182] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081-3-6-n-6f01bfed3c' Apr 24 23:37:28.697838 containerd[1519]: 2026-04-24 23:37:28.655 [INFO][5182] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.239aaa0ebe297dae632dcfa1e9fc6044076d42ea0b7eb1b966e95e775dea887e" host="ci-4081-3-6-n-6f01bfed3c" Apr 24 23:37:28.697838 containerd[1519]: 2026-04-24 23:37:28.658 [INFO][5182] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4081-3-6-n-6f01bfed3c" Apr 24 23:37:28.697838 containerd[1519]: 2026-04-24 23:37:28.661 [INFO][5182] ipam/ipam.go 526: Trying affinity for 192.168.35.64/26 host="ci-4081-3-6-n-6f01bfed3c" Apr 24 23:37:28.697838 containerd[1519]: 2026-04-24 23:37:28.663 [INFO][5182] ipam/ipam.go 160: Attempting to load block cidr=192.168.35.64/26 host="ci-4081-3-6-n-6f01bfed3c" Apr 24 23:37:28.697838 containerd[1519]: 2026-04-24 23:37:28.664 [INFO][5182] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.35.64/26 host="ci-4081-3-6-n-6f01bfed3c" Apr 24 23:37:28.697838 containerd[1519]: 2026-04-24 23:37:28.664 [INFO][5182] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.35.64/26 handle="k8s-pod-network.239aaa0ebe297dae632dcfa1e9fc6044076d42ea0b7eb1b966e95e775dea887e" host="ci-4081-3-6-n-6f01bfed3c" Apr 24 23:37:28.697838 containerd[1519]: 2026-04-24 23:37:28.665 [INFO][5182] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.239aaa0ebe297dae632dcfa1e9fc6044076d42ea0b7eb1b966e95e775dea887e Apr 24 23:37:28.697838 containerd[1519]: 2026-04-24 23:37:28.669 [INFO][5182] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.35.64/26 handle="k8s-pod-network.239aaa0ebe297dae632dcfa1e9fc6044076d42ea0b7eb1b966e95e775dea887e" host="ci-4081-3-6-n-6f01bfed3c" Apr 24 23:37:28.697838 containerd[1519]: 2026-04-24 23:37:28.674 [INFO][5182] ipam/ipam.go 1288: 
Successfully claimed IPs: [192.168.35.72/26] block=192.168.35.64/26 handle="k8s-pod-network.239aaa0ebe297dae632dcfa1e9fc6044076d42ea0b7eb1b966e95e775dea887e" host="ci-4081-3-6-n-6f01bfed3c" Apr 24 23:37:28.697838 containerd[1519]: 2026-04-24 23:37:28.674 [INFO][5182] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.35.72/26] handle="k8s-pod-network.239aaa0ebe297dae632dcfa1e9fc6044076d42ea0b7eb1b966e95e775dea887e" host="ci-4081-3-6-n-6f01bfed3c" Apr 24 23:37:28.697838 containerd[1519]: 2026-04-24 23:37:28.674 [INFO][5182] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 24 23:37:28.697838 containerd[1519]: 2026-04-24 23:37:28.674 [INFO][5182] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.35.72/26] IPv6=[] ContainerID="239aaa0ebe297dae632dcfa1e9fc6044076d42ea0b7eb1b966e95e775dea887e" HandleID="k8s-pod-network.239aaa0ebe297dae632dcfa1e9fc6044076d42ea0b7eb1b966e95e775dea887e" Workload="ci--4081--3--6--n--6f01bfed3c-k8s-calico--apiserver--b8b79f554--vhwl6-eth0" Apr 24 23:37:28.698270 containerd[1519]: 2026-04-24 23:37:28.676 [INFO][5170] cni-plugin/k8s.go 418: Populated endpoint ContainerID="239aaa0ebe297dae632dcfa1e9fc6044076d42ea0b7eb1b966e95e775dea887e" Namespace="calico-system" Pod="calico-apiserver-b8b79f554-vhwl6" WorkloadEndpoint="ci--4081--3--6--n--6f01bfed3c-k8s-calico--apiserver--b8b79f554--vhwl6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--6f01bfed3c-k8s-calico--apiserver--b8b79f554--vhwl6-eth0", GenerateName:"calico-apiserver-b8b79f554-", Namespace:"calico-system", SelfLink:"", UID:"7bc41b28-ad74-4dc3-b94f-b24620d3a9ff", ResourceVersion:"989", Generation:0, CreationTimestamp:time.Date(2026, time.April, 24, 23, 36, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", 
"k8s-app":"calico-apiserver", "pod-template-hash":"b8b79f554", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-6f01bfed3c", ContainerID:"", Pod:"calico-apiserver-b8b79f554-vhwl6", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.35.72/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali6740876887e", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 24 23:37:28.698270 containerd[1519]: 2026-04-24 23:37:28.676 [INFO][5170] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.35.72/32] ContainerID="239aaa0ebe297dae632dcfa1e9fc6044076d42ea0b7eb1b966e95e775dea887e" Namespace="calico-system" Pod="calico-apiserver-b8b79f554-vhwl6" WorkloadEndpoint="ci--4081--3--6--n--6f01bfed3c-k8s-calico--apiserver--b8b79f554--vhwl6-eth0" Apr 24 23:37:28.698270 containerd[1519]: 2026-04-24 23:37:28.676 [INFO][5170] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali6740876887e ContainerID="239aaa0ebe297dae632dcfa1e9fc6044076d42ea0b7eb1b966e95e775dea887e" Namespace="calico-system" Pod="calico-apiserver-b8b79f554-vhwl6" WorkloadEndpoint="ci--4081--3--6--n--6f01bfed3c-k8s-calico--apiserver--b8b79f554--vhwl6-eth0" Apr 24 23:37:28.698270 containerd[1519]: 2026-04-24 23:37:28.679 [INFO][5170] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="239aaa0ebe297dae632dcfa1e9fc6044076d42ea0b7eb1b966e95e775dea887e" Namespace="calico-system" Pod="calico-apiserver-b8b79f554-vhwl6" 
WorkloadEndpoint="ci--4081--3--6--n--6f01bfed3c-k8s-calico--apiserver--b8b79f554--vhwl6-eth0" Apr 24 23:37:28.698270 containerd[1519]: 2026-04-24 23:37:28.679 [INFO][5170] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="239aaa0ebe297dae632dcfa1e9fc6044076d42ea0b7eb1b966e95e775dea887e" Namespace="calico-system" Pod="calico-apiserver-b8b79f554-vhwl6" WorkloadEndpoint="ci--4081--3--6--n--6f01bfed3c-k8s-calico--apiserver--b8b79f554--vhwl6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--6f01bfed3c-k8s-calico--apiserver--b8b79f554--vhwl6-eth0", GenerateName:"calico-apiserver-b8b79f554-", Namespace:"calico-system", SelfLink:"", UID:"7bc41b28-ad74-4dc3-b94f-b24620d3a9ff", ResourceVersion:"989", Generation:0, CreationTimestamp:time.Date(2026, time.April, 24, 23, 36, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"b8b79f554", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-6f01bfed3c", ContainerID:"239aaa0ebe297dae632dcfa1e9fc6044076d42ea0b7eb1b966e95e775dea887e", Pod:"calico-apiserver-b8b79f554-vhwl6", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.35.72/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali6740876887e", MAC:"2a:8e:25:f9:5e:35", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 24 23:37:28.698270 containerd[1519]: 2026-04-24 23:37:28.691 [INFO][5170] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="239aaa0ebe297dae632dcfa1e9fc6044076d42ea0b7eb1b966e95e775dea887e" Namespace="calico-system" Pod="calico-apiserver-b8b79f554-vhwl6" WorkloadEndpoint="ci--4081--3--6--n--6f01bfed3c-k8s-calico--apiserver--b8b79f554--vhwl6-eth0" Apr 24 23:37:28.717614 containerd[1519]: time="2026-04-24T23:37:28.717298467Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Apr 24 23:37:28.717614 containerd[1519]: time="2026-04-24T23:37:28.717346551Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Apr 24 23:37:28.717614 containerd[1519]: time="2026-04-24T23:37:28.717381257Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 24 23:37:28.717614 containerd[1519]: time="2026-04-24T23:37:28.717479027Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 24 23:37:28.749618 systemd[1]: Started cri-containerd-239aaa0ebe297dae632dcfa1e9fc6044076d42ea0b7eb1b966e95e775dea887e.scope - libcontainer container 239aaa0ebe297dae632dcfa1e9fc6044076d42ea0b7eb1b966e95e775dea887e. 
Apr 24 23:37:28.775089 kubelet[2565]: I0424 23:37:28.775024 2565 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="kube-system/coredns-7d764666f9-rvpdv" podStartSLOduration=40.775012105 podStartE2EDuration="40.775012105s" podCreationTimestamp="2026-04-24 23:36:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 23:37:28.762923277 +0000 UTC m=+47.374982082" watchObservedRunningTime="2026-04-24 23:37:28.775012105 +0000 UTC m=+47.387070909" Apr 24 23:37:28.811814 containerd[1519]: time="2026-04-24T23:37:28.809667250Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-b8b79f554-vhwl6,Uid:7bc41b28-ad74-4dc3-b94f-b24620d3a9ff,Namespace:calico-system,Attempt:1,} returns sandbox id \"239aaa0ebe297dae632dcfa1e9fc6044076d42ea0b7eb1b966e95e775dea887e\"" Apr 24 23:37:29.321891 systemd-networkd[1422]: cali84144e67349: Gained IPv6LL Apr 24 23:37:29.450459 systemd-networkd[1422]: calib3bbacbc562: Gained IPv6LL Apr 24 23:37:29.513570 systemd-networkd[1422]: cali3d2bab2982d: Gained IPv6LL Apr 24 23:37:29.981425 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4282207575.mount: Deactivated successfully. 
Apr 24 23:37:30.025752 systemd-networkd[1422]: cali6740876887e: Gained IPv6LL Apr 24 23:37:30.280066 containerd[1519]: time="2026-04-24T23:37:30.279699701Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 24 23:37:30.281074 containerd[1519]: time="2026-04-24T23:37:30.280948918Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.31.4: active requests=0, bytes read=55623386" Apr 24 23:37:30.281974 containerd[1519]: time="2026-04-24T23:37:30.281931320Z" level=info msg="ImageCreate event name:\"sha256:714983e5e920bbe810fab04d9f06bd16ef4e552b0d2deffd7ab2b2c4a001acbb\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 24 23:37:30.284264 containerd[1519]: time="2026-04-24T23:37:30.284225765Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:44395ca5ebfe88f21ed51acfbec5fc0f31d2762966e2007a0a2eb9b30e35fc4d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 24 23:37:30.285006 containerd[1519]: time="2026-04-24T23:37:30.284973653Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.31.4\" with image id \"sha256:714983e5e920bbe810fab04d9f06bd16ef4e552b0d2deffd7ab2b2c4a001acbb\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:44395ca5ebfe88f21ed51acfbec5fc0f31d2762966e2007a0a2eb9b30e35fc4d\", size \"55623232\" in 2.326312708s" Apr 24 23:37:30.285006 containerd[1519]: time="2026-04-24T23:37:30.285000283Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.31.4\" returns image reference \"sha256:714983e5e920bbe810fab04d9f06bd16ef4e552b0d2deffd7ab2b2c4a001acbb\"" Apr 24 23:37:30.286360 containerd[1519]: time="2026-04-24T23:37:30.286336504Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\"" Apr 24 23:37:30.289284 containerd[1519]: time="2026-04-24T23:37:30.289247092Z" level=info 
msg="CreateContainer within sandbox \"53d875b5d4c1419ca5cd643dcf2b8a913dd08faec3d8e1bfc9352c1227b794a8\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Apr 24 23:37:30.303611 containerd[1519]: time="2026-04-24T23:37:30.303574202Z" level=info msg="CreateContainer within sandbox \"53d875b5d4c1419ca5cd643dcf2b8a913dd08faec3d8e1bfc9352c1227b794a8\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"a3d8f377d11eb7062ec0f521abcb3ab13731a17b6669ec666779e5ffd9e61b85\"" Apr 24 23:37:30.304482 containerd[1519]: time="2026-04-24T23:37:30.304345118Z" level=info msg="StartContainer for \"a3d8f377d11eb7062ec0f521abcb3ab13731a17b6669ec666779e5ffd9e61b85\"" Apr 24 23:37:30.342569 systemd[1]: Started cri-containerd-a3d8f377d11eb7062ec0f521abcb3ab13731a17b6669ec666779e5ffd9e61b85.scope - libcontainer container a3d8f377d11eb7062ec0f521abcb3ab13731a17b6669ec666779e5ffd9e61b85. Apr 24 23:37:30.383469 containerd[1519]: time="2026-04-24T23:37:30.382856001Z" level=info msg="StartContainer for \"a3d8f377d11eb7062ec0f521abcb3ab13731a17b6669ec666779e5ffd9e61b85\" returns successfully" Apr 24 23:37:31.861406 kubelet[2565]: I0424 23:37:31.861307 2565 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="calico-system/goldmane-9f7667bb8-j6dz5" podStartSLOduration=29.576815531 podStartE2EDuration="33.861292994s" podCreationTimestamp="2026-04-24 23:36:58 +0000 UTC" firstStartedPulling="2026-04-24 23:37:26.00135288 +0000 UTC m=+44.613411684" lastFinishedPulling="2026-04-24 23:37:30.285830353 +0000 UTC m=+48.897889147" observedRunningTime="2026-04-24 23:37:30.78603247 +0000 UTC m=+49.398091304" watchObservedRunningTime="2026-04-24 23:37:31.861292994 +0000 UTC m=+50.473351788" Apr 24 23:37:34.577189 containerd[1519]: time="2026-04-24T23:37:34.577142232Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 24 23:37:34.578333 containerd[1519]: 
time="2026-04-24T23:37:34.578176218Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.31.4: active requests=0, bytes read=52406348" Apr 24 23:37:34.579386 containerd[1519]: time="2026-04-24T23:37:34.579313176Z" level=info msg="ImageCreate event name:\"sha256:ff033cc89dab51090bfa1b04e155a5ce1e3b1f324f74b7b2be0dd6f0b6b10e89\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 24 23:37:34.581272 containerd[1519]: time="2026-04-24T23:37:34.581240087Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:99b8bb50141ca55b4b6ddfcf2f2fbde838265508ab2ac96ed08e72cd39800713\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 24 23:37:34.582050 containerd[1519]: time="2026-04-24T23:37:34.581636260Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\" with image id \"sha256:ff033cc89dab51090bfa1b04e155a5ce1e3b1f324f74b7b2be0dd6f0b6b10e89\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:99b8bb50141ca55b4b6ddfcf2f2fbde838265508ab2ac96ed08e72cd39800713\", size \"53962361\" in 4.29527709s" Apr 24 23:37:34.582050 containerd[1519]: time="2026-04-24T23:37:34.581660374Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\" returns image reference \"sha256:ff033cc89dab51090bfa1b04e155a5ce1e3b1f324f74b7b2be0dd6f0b6b10e89\"" Apr 24 23:37:34.586421 containerd[1519]: time="2026-04-24T23:37:34.584970247Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\"" Apr 24 23:37:34.596667 containerd[1519]: time="2026-04-24T23:37:34.596623880Z" level=info msg="CreateContainer within sandbox \"723800165301cd2f6e76d6f6bd32bf5db518de1461fb343c46809649a645c586\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Apr 24 23:37:34.608111 containerd[1519]: time="2026-04-24T23:37:34.608035720Z" level=info msg="CreateContainer within 
sandbox \"723800165301cd2f6e76d6f6bd32bf5db518de1461fb343c46809649a645c586\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"288b854e5aba84cc71d85d96252105d669ecb68d4d2f151b6ac021b9e95eee4e\"" Apr 24 23:37:34.608782 containerd[1519]: time="2026-04-24T23:37:34.608662599Z" level=info msg="StartContainer for \"288b854e5aba84cc71d85d96252105d669ecb68d4d2f151b6ac021b9e95eee4e\"" Apr 24 23:37:34.641536 systemd[1]: Started cri-containerd-288b854e5aba84cc71d85d96252105d669ecb68d4d2f151b6ac021b9e95eee4e.scope - libcontainer container 288b854e5aba84cc71d85d96252105d669ecb68d4d2f151b6ac021b9e95eee4e. Apr 24 23:37:34.684395 containerd[1519]: time="2026-04-24T23:37:34.684346617Z" level=info msg="StartContainer for \"288b854e5aba84cc71d85d96252105d669ecb68d4d2f151b6ac021b9e95eee4e\" returns successfully" Apr 24 23:37:34.785250 kubelet[2565]: I0424 23:37:34.784998 2565 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-5c6df7864b-wqqbk" podStartSLOduration=30.305761687 podStartE2EDuration="36.784988859s" podCreationTimestamp="2026-04-24 23:36:58 +0000 UTC" firstStartedPulling="2026-04-24 23:37:28.103123127 +0000 UTC m=+46.715181921" lastFinishedPulling="2026-04-24 23:37:34.582350289 +0000 UTC m=+53.194409093" observedRunningTime="2026-04-24 23:37:34.783845888 +0000 UTC m=+53.395904682" watchObservedRunningTime="2026-04-24 23:37:34.784988859 +0000 UTC m=+53.397047653" Apr 24 23:37:36.341900 containerd[1519]: time="2026-04-24T23:37:36.341844784Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 24 23:37:36.342935 containerd[1519]: time="2026-04-24T23:37:36.342798977Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4: active requests=0, bytes read=14704317" Apr 24 23:37:36.343810 containerd[1519]: time="2026-04-24T23:37:36.343775740Z" 
level=info msg="ImageCreate event name:\"sha256:d7aeb99114cbb6499e9048f43d3faa5f199d1a05ed44165e5974d0368ac32771\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 24 23:37:36.345654 containerd[1519]: time="2026-04-24T23:37:36.345624445Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:e41c0d73bcd33ff28ae2f2983cf781a4509d212e102d53883dbbf436ab3cd97d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 24 23:37:36.346065 containerd[1519]: time="2026-04-24T23:37:36.345963303Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\" with image id \"sha256:d7aeb99114cbb6499e9048f43d3faa5f199d1a05ed44165e5974d0368ac32771\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:e41c0d73bcd33ff28ae2f2983cf781a4509d212e102d53883dbbf436ab3cd97d\", size \"16260314\" in 1.760970573s" Apr 24 23:37:36.346065 containerd[1519]: time="2026-04-24T23:37:36.345985815Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\" returns image reference \"sha256:d7aeb99114cbb6499e9048f43d3faa5f199d1a05ed44165e5974d0368ac32771\"" Apr 24 23:37:36.347660 containerd[1519]: time="2026-04-24T23:37:36.347643208Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\"" Apr 24 23:37:36.349492 containerd[1519]: time="2026-04-24T23:37:36.349465967Z" level=info msg="CreateContainer within sandbox \"4240fcaef762194107e0245631c641a0ff453eba7640d45f73bbcc2947358682\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Apr 24 23:37:36.363402 containerd[1519]: time="2026-04-24T23:37:36.363377044Z" level=info msg="CreateContainer within sandbox \"4240fcaef762194107e0245631c641a0ff453eba7640d45f73bbcc2947358682\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id 
\"71c694ee6aded1b34eb5dcf703ab1a99ea9955581e26e0bceb06c0fbec9659d4\"" Apr 24 23:37:36.363782 containerd[1519]: time="2026-04-24T23:37:36.363717493Z" level=info msg="StartContainer for \"71c694ee6aded1b34eb5dcf703ab1a99ea9955581e26e0bceb06c0fbec9659d4\"" Apr 24 23:37:36.389210 systemd[1]: run-containerd-runc-k8s.io-71c694ee6aded1b34eb5dcf703ab1a99ea9955581e26e0bceb06c0fbec9659d4-runc.nFoAmm.mount: Deactivated successfully. Apr 24 23:37:36.400537 systemd[1]: Started cri-containerd-71c694ee6aded1b34eb5dcf703ab1a99ea9955581e26e0bceb06c0fbec9659d4.scope - libcontainer container 71c694ee6aded1b34eb5dcf703ab1a99ea9955581e26e0bceb06c0fbec9659d4. Apr 24 23:37:36.423698 containerd[1519]: time="2026-04-24T23:37:36.423609734Z" level=info msg="StartContainer for \"71c694ee6aded1b34eb5dcf703ab1a99ea9955581e26e0bceb06c0fbec9659d4\" returns successfully" Apr 24 23:37:36.568958 kubelet[2565]: I0424 23:37:36.568879 2565 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Apr 24 23:37:36.568958 kubelet[2565]: I0424 23:37:36.568933 2565 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Apr 24 23:37:36.799866 kubelet[2565]: I0424 23:37:36.799132 2565 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="calico-system/csi-node-driver-kfkvn" podStartSLOduration=28.297378198 podStartE2EDuration="38.799110257s" podCreationTimestamp="2026-04-24 23:36:58 +0000 UTC" firstStartedPulling="2026-04-24 23:37:25.845050145 +0000 UTC m=+44.457108939" lastFinishedPulling="2026-04-24 23:37:36.346782194 +0000 UTC m=+54.958840998" observedRunningTime="2026-04-24 23:37:36.797214208 +0000 UTC m=+55.409273052" watchObservedRunningTime="2026-04-24 23:37:36.799110257 +0000 UTC m=+55.411169101" Apr 24 23:37:38.559060 containerd[1519]: time="2026-04-24T23:37:38.559025401Z" 
level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 24 23:37:38.559896 containerd[1519]: time="2026-04-24T23:37:38.559810373Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.31.4: active requests=0, bytes read=48415780" Apr 24 23:37:38.560580 containerd[1519]: time="2026-04-24T23:37:38.560547452Z" level=info msg="ImageCreate event name:\"sha256:f7ff80340b9b4973ceda29859065985831588b2898f2b4009f742b5789010898\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 24 23:37:38.562525 containerd[1519]: time="2026-04-24T23:37:38.562501567Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:d212af1da3dd52a633bc9e36653a7d901d95a570f8d51d1968a837dcf6879730\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 24 23:37:38.563259 containerd[1519]: time="2026-04-24T23:37:38.563151204Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" with image id \"sha256:f7ff80340b9b4973ceda29859065985831588b2898f2b4009f742b5789010898\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:d212af1da3dd52a633bc9e36653a7d901d95a570f8d51d1968a837dcf6879730\", size \"49971841\" in 2.215484965s" Apr 24 23:37:38.563259 containerd[1519]: time="2026-04-24T23:37:38.563175376Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" returns image reference \"sha256:f7ff80340b9b4973ceda29859065985831588b2898f2b4009f742b5789010898\"" Apr 24 23:37:38.564620 containerd[1519]: time="2026-04-24T23:37:38.564590136Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\"" Apr 24 23:37:38.566478 containerd[1519]: time="2026-04-24T23:37:38.566398572Z" level=info msg="CreateContainer within sandbox \"d47185e88446e6d54550302d8610a3b3b106b6ac5353fad3cc18276ea9c746d4\" for container 
&ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Apr 24 23:37:38.576959 containerd[1519]: time="2026-04-24T23:37:38.576933150Z" level=info msg="CreateContainer within sandbox \"d47185e88446e6d54550302d8610a3b3b106b6ac5353fad3cc18276ea9c746d4\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"33634252f0bec54c5af7871f9fa2f1714a2dcc26257215d22c0fdb822df04d06\"" Apr 24 23:37:38.577670 containerd[1519]: time="2026-04-24T23:37:38.577637803Z" level=info msg="StartContainer for \"33634252f0bec54c5af7871f9fa2f1714a2dcc26257215d22c0fdb822df04d06\"" Apr 24 23:37:38.608545 systemd[1]: Started cri-containerd-33634252f0bec54c5af7871f9fa2f1714a2dcc26257215d22c0fdb822df04d06.scope - libcontainer container 33634252f0bec54c5af7871f9fa2f1714a2dcc26257215d22c0fdb822df04d06. Apr 24 23:37:38.643653 containerd[1519]: time="2026-04-24T23:37:38.643596811Z" level=info msg="StartContainer for \"33634252f0bec54c5af7871f9fa2f1714a2dcc26257215d22c0fdb822df04d06\" returns successfully" Apr 24 23:37:39.022568 containerd[1519]: time="2026-04-24T23:37:39.021980517Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 24 23:37:39.023629 containerd[1519]: time="2026-04-24T23:37:39.023599965Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.31.4: active requests=0, bytes read=77" Apr 24 23:37:39.024842 containerd[1519]: time="2026-04-24T23:37:39.024803606Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" with image id \"sha256:f7ff80340b9b4973ceda29859065985831588b2898f2b4009f742b5789010898\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:d212af1da3dd52a633bc9e36653a7d901d95a570f8d51d1968a837dcf6879730\", size \"49971841\" in 460.196072ms" Apr 24 23:37:39.024939 containerd[1519]: time="2026-04-24T23:37:39.024927182Z" level=info msg="PullImage 
\"ghcr.io/flatcar/calico/apiserver:v3.31.4\" returns image reference \"sha256:f7ff80340b9b4973ceda29859065985831588b2898f2b4009f742b5789010898\"" Apr 24 23:37:39.029805 containerd[1519]: time="2026-04-24T23:37:39.029789586Z" level=info msg="CreateContainer within sandbox \"239aaa0ebe297dae632dcfa1e9fc6044076d42ea0b7eb1b966e95e775dea887e\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Apr 24 23:37:39.043072 containerd[1519]: time="2026-04-24T23:37:39.043050945Z" level=info msg="CreateContainer within sandbox \"239aaa0ebe297dae632dcfa1e9fc6044076d42ea0b7eb1b966e95e775dea887e\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"24b455d464d17ef7e977c2030e09368e6b783432e4a68f770570841576731944\"" Apr 24 23:37:39.044604 containerd[1519]: time="2026-04-24T23:37:39.044585144Z" level=info msg="StartContainer for \"24b455d464d17ef7e977c2030e09368e6b783432e4a68f770570841576731944\"" Apr 24 23:37:39.077541 systemd[1]: Started cri-containerd-24b455d464d17ef7e977c2030e09368e6b783432e4a68f770570841576731944.scope - libcontainer container 24b455d464d17ef7e977c2030e09368e6b783432e4a68f770570841576731944. 
Apr 24 23:37:39.117025 containerd[1519]: time="2026-04-24T23:37:39.116993838Z" level=info msg="StartContainer for \"24b455d464d17ef7e977c2030e09368e6b783432e4a68f770570841576731944\" returns successfully" Apr 24 23:37:39.625328 kubelet[2565]: I0424 23:37:39.625239 2565 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="calico-system/calico-apiserver-b8b79f554-9kfjb" podStartSLOduration=32.282617852 podStartE2EDuration="42.625224551s" podCreationTimestamp="2026-04-24 23:36:57 +0000 UTC" firstStartedPulling="2026-04-24 23:37:28.22139923 +0000 UTC m=+46.833458034" lastFinishedPulling="2026-04-24 23:37:38.564005929 +0000 UTC m=+57.176064733" observedRunningTime="2026-04-24 23:37:38.801106433 +0000 UTC m=+57.413165237" watchObservedRunningTime="2026-04-24 23:37:39.625224551 +0000 UTC m=+58.237283355" Apr 24 23:37:39.802215 kubelet[2565]: I0424 23:37:39.801771 2565 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="calico-system/calico-apiserver-b8b79f554-vhwl6" podStartSLOduration=32.588984614 podStartE2EDuration="42.801761281s" podCreationTimestamp="2026-04-24 23:36:57 +0000 UTC" firstStartedPulling="2026-04-24 23:37:28.812846837 +0000 UTC m=+47.424905631" lastFinishedPulling="2026-04-24 23:37:39.025623494 +0000 UTC m=+57.637682298" observedRunningTime="2026-04-24 23:37:39.801597557 +0000 UTC m=+58.413656351" watchObservedRunningTime="2026-04-24 23:37:39.801761281 +0000 UTC m=+58.413820075" Apr 24 23:37:41.471610 containerd[1519]: time="2026-04-24T23:37:41.471490731Z" level=info msg="StopPodSandbox for \"580c99d506b6ae3bc232bba4e40ac5acbae980fdeae6181c18e333c2ea3ff51e\"" Apr 24 23:37:41.572350 containerd[1519]: 2026-04-24 23:37:41.536 [WARNING][5576] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="580c99d506b6ae3bc232bba4e40ac5acbae980fdeae6181c18e333c2ea3ff51e" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--6f01bfed3c-k8s-calico--kube--controllers--5c6df7864b--wqqbk-eth0", GenerateName:"calico-kube-controllers-5c6df7864b-", Namespace:"calico-system", SelfLink:"", UID:"7a6dcb56-0ab0-4a8e-b283-aa8220fb3da6", ResourceVersion:"1033", Generation:0, CreationTimestamp:time.Date(2026, time.April, 24, 23, 36, 58, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"5c6df7864b", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-6f01bfed3c", ContainerID:"723800165301cd2f6e76d6f6bd32bf5db518de1461fb343c46809649a645c586", Pod:"calico-kube-controllers-5c6df7864b-wqqbk", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.35.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali3d2bab2982d", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 24 23:37:41.572350 containerd[1519]: 2026-04-24 23:37:41.537 [INFO][5576] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="580c99d506b6ae3bc232bba4e40ac5acbae980fdeae6181c18e333c2ea3ff51e" Apr 24 23:37:41.572350 containerd[1519]: 2026-04-24 23:37:41.537 [INFO][5576] 
cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="580c99d506b6ae3bc232bba4e40ac5acbae980fdeae6181c18e333c2ea3ff51e" iface="eth0" netns="" Apr 24 23:37:41.572350 containerd[1519]: 2026-04-24 23:37:41.537 [INFO][5576] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="580c99d506b6ae3bc232bba4e40ac5acbae980fdeae6181c18e333c2ea3ff51e" Apr 24 23:37:41.572350 containerd[1519]: 2026-04-24 23:37:41.537 [INFO][5576] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="580c99d506b6ae3bc232bba4e40ac5acbae980fdeae6181c18e333c2ea3ff51e" Apr 24 23:37:41.572350 containerd[1519]: 2026-04-24 23:37:41.560 [INFO][5585] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="580c99d506b6ae3bc232bba4e40ac5acbae980fdeae6181c18e333c2ea3ff51e" HandleID="k8s-pod-network.580c99d506b6ae3bc232bba4e40ac5acbae980fdeae6181c18e333c2ea3ff51e" Workload="ci--4081--3--6--n--6f01bfed3c-k8s-calico--kube--controllers--5c6df7864b--wqqbk-eth0" Apr 24 23:37:41.572350 containerd[1519]: 2026-04-24 23:37:41.560 [INFO][5585] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 24 23:37:41.572350 containerd[1519]: 2026-04-24 23:37:41.560 [INFO][5585] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 24 23:37:41.572350 containerd[1519]: 2026-04-24 23:37:41.565 [WARNING][5585] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="580c99d506b6ae3bc232bba4e40ac5acbae980fdeae6181c18e333c2ea3ff51e" HandleID="k8s-pod-network.580c99d506b6ae3bc232bba4e40ac5acbae980fdeae6181c18e333c2ea3ff51e" Workload="ci--4081--3--6--n--6f01bfed3c-k8s-calico--kube--controllers--5c6df7864b--wqqbk-eth0" Apr 24 23:37:41.572350 containerd[1519]: 2026-04-24 23:37:41.566 [INFO][5585] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="580c99d506b6ae3bc232bba4e40ac5acbae980fdeae6181c18e333c2ea3ff51e" HandleID="k8s-pod-network.580c99d506b6ae3bc232bba4e40ac5acbae980fdeae6181c18e333c2ea3ff51e" Workload="ci--4081--3--6--n--6f01bfed3c-k8s-calico--kube--controllers--5c6df7864b--wqqbk-eth0" Apr 24 23:37:41.572350 containerd[1519]: 2026-04-24 23:37:41.567 [INFO][5585] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 24 23:37:41.572350 containerd[1519]: 2026-04-24 23:37:41.570 [INFO][5576] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="580c99d506b6ae3bc232bba4e40ac5acbae980fdeae6181c18e333c2ea3ff51e" Apr 24 23:37:41.572819 containerd[1519]: time="2026-04-24T23:37:41.572312542Z" level=info msg="TearDown network for sandbox \"580c99d506b6ae3bc232bba4e40ac5acbae980fdeae6181c18e333c2ea3ff51e\" successfully" Apr 24 23:37:41.572819 containerd[1519]: time="2026-04-24T23:37:41.572446316Z" level=info msg="StopPodSandbox for \"580c99d506b6ae3bc232bba4e40ac5acbae980fdeae6181c18e333c2ea3ff51e\" returns successfully" Apr 24 23:37:41.573170 containerd[1519]: time="2026-04-24T23:37:41.573144899Z" level=info msg="RemovePodSandbox for \"580c99d506b6ae3bc232bba4e40ac5acbae980fdeae6181c18e333c2ea3ff51e\"" Apr 24 23:37:41.573202 containerd[1519]: time="2026-04-24T23:37:41.573181804Z" level=info msg="Forcibly stopping sandbox \"580c99d506b6ae3bc232bba4e40ac5acbae980fdeae6181c18e333c2ea3ff51e\"" Apr 24 23:37:41.632294 containerd[1519]: 2026-04-24 23:37:41.604 [WARNING][5600] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="580c99d506b6ae3bc232bba4e40ac5acbae980fdeae6181c18e333c2ea3ff51e" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--6f01bfed3c-k8s-calico--kube--controllers--5c6df7864b--wqqbk-eth0", GenerateName:"calico-kube-controllers-5c6df7864b-", Namespace:"calico-system", SelfLink:"", UID:"7a6dcb56-0ab0-4a8e-b283-aa8220fb3da6", ResourceVersion:"1033", Generation:0, CreationTimestamp:time.Date(2026, time.April, 24, 23, 36, 58, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"5c6df7864b", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-6f01bfed3c", ContainerID:"723800165301cd2f6e76d6f6bd32bf5db518de1461fb343c46809649a645c586", Pod:"calico-kube-controllers-5c6df7864b-wqqbk", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.35.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali3d2bab2982d", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 24 23:37:41.632294 containerd[1519]: 2026-04-24 23:37:41.604 [INFO][5600] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="580c99d506b6ae3bc232bba4e40ac5acbae980fdeae6181c18e333c2ea3ff51e" Apr 24 23:37:41.632294 containerd[1519]: 2026-04-24 23:37:41.604 [INFO][5600] 
cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="580c99d506b6ae3bc232bba4e40ac5acbae980fdeae6181c18e333c2ea3ff51e" iface="eth0" netns="" Apr 24 23:37:41.632294 containerd[1519]: 2026-04-24 23:37:41.604 [INFO][5600] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="580c99d506b6ae3bc232bba4e40ac5acbae980fdeae6181c18e333c2ea3ff51e" Apr 24 23:37:41.632294 containerd[1519]: 2026-04-24 23:37:41.604 [INFO][5600] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="580c99d506b6ae3bc232bba4e40ac5acbae980fdeae6181c18e333c2ea3ff51e" Apr 24 23:37:41.632294 containerd[1519]: 2026-04-24 23:37:41.622 [INFO][5608] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="580c99d506b6ae3bc232bba4e40ac5acbae980fdeae6181c18e333c2ea3ff51e" HandleID="k8s-pod-network.580c99d506b6ae3bc232bba4e40ac5acbae980fdeae6181c18e333c2ea3ff51e" Workload="ci--4081--3--6--n--6f01bfed3c-k8s-calico--kube--controllers--5c6df7864b--wqqbk-eth0" Apr 24 23:37:41.632294 containerd[1519]: 2026-04-24 23:37:41.622 [INFO][5608] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 24 23:37:41.632294 containerd[1519]: 2026-04-24 23:37:41.622 [INFO][5608] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 24 23:37:41.632294 containerd[1519]: 2026-04-24 23:37:41.627 [WARNING][5608] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="580c99d506b6ae3bc232bba4e40ac5acbae980fdeae6181c18e333c2ea3ff51e" HandleID="k8s-pod-network.580c99d506b6ae3bc232bba4e40ac5acbae980fdeae6181c18e333c2ea3ff51e" Workload="ci--4081--3--6--n--6f01bfed3c-k8s-calico--kube--controllers--5c6df7864b--wqqbk-eth0" Apr 24 23:37:41.632294 containerd[1519]: 2026-04-24 23:37:41.627 [INFO][5608] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="580c99d506b6ae3bc232bba4e40ac5acbae980fdeae6181c18e333c2ea3ff51e" HandleID="k8s-pod-network.580c99d506b6ae3bc232bba4e40ac5acbae980fdeae6181c18e333c2ea3ff51e" Workload="ci--4081--3--6--n--6f01bfed3c-k8s-calico--kube--controllers--5c6df7864b--wqqbk-eth0" Apr 24 23:37:41.632294 containerd[1519]: 2026-04-24 23:37:41.629 [INFO][5608] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 24 23:37:41.632294 containerd[1519]: 2026-04-24 23:37:41.630 [INFO][5600] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="580c99d506b6ae3bc232bba4e40ac5acbae980fdeae6181c18e333c2ea3ff51e" Apr 24 23:37:41.632782 containerd[1519]: time="2026-04-24T23:37:41.632325645Z" level=info msg="TearDown network for sandbox \"580c99d506b6ae3bc232bba4e40ac5acbae980fdeae6181c18e333c2ea3ff51e\" successfully" Apr 24 23:37:41.636595 containerd[1519]: time="2026-04-24T23:37:41.636555931Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"580c99d506b6ae3bc232bba4e40ac5acbae980fdeae6181c18e333c2ea3ff51e\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Apr 24 23:37:41.636657 containerd[1519]: time="2026-04-24T23:37:41.636610673Z" level=info msg="RemovePodSandbox \"580c99d506b6ae3bc232bba4e40ac5acbae980fdeae6181c18e333c2ea3ff51e\" returns successfully" Apr 24 23:37:41.637225 containerd[1519]: time="2026-04-24T23:37:41.637021509Z" level=info msg="StopPodSandbox for \"436f63d90010a4b07bfa47b4249043414bf5ab13c65f0dbb63f7a35e4c91a1a7\"" Apr 24 23:37:41.685214 containerd[1519]: 2026-04-24 23:37:41.660 [WARNING][5622] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="436f63d90010a4b07bfa47b4249043414bf5ab13c65f0dbb63f7a35e4c91a1a7" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--6f01bfed3c-k8s-calico--apiserver--b8b79f554--vhwl6-eth0", GenerateName:"calico-apiserver-b8b79f554-", Namespace:"calico-system", SelfLink:"", UID:"7bc41b28-ad74-4dc3-b94f-b24620d3a9ff", ResourceVersion:"1079", Generation:0, CreationTimestamp:time.Date(2026, time.April, 24, 23, 36, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"b8b79f554", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-6f01bfed3c", ContainerID:"239aaa0ebe297dae632dcfa1e9fc6044076d42ea0b7eb1b966e95e775dea887e", Pod:"calico-apiserver-b8b79f554-vhwl6", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.35.72/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", 
Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali6740876887e", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 24 23:37:41.685214 containerd[1519]: 2026-04-24 23:37:41.661 [INFO][5622] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="436f63d90010a4b07bfa47b4249043414bf5ab13c65f0dbb63f7a35e4c91a1a7" Apr 24 23:37:41.685214 containerd[1519]: 2026-04-24 23:37:41.661 [INFO][5622] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="436f63d90010a4b07bfa47b4249043414bf5ab13c65f0dbb63f7a35e4c91a1a7" iface="eth0" netns="" Apr 24 23:37:41.685214 containerd[1519]: 2026-04-24 23:37:41.661 [INFO][5622] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="436f63d90010a4b07bfa47b4249043414bf5ab13c65f0dbb63f7a35e4c91a1a7" Apr 24 23:37:41.685214 containerd[1519]: 2026-04-24 23:37:41.661 [INFO][5622] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="436f63d90010a4b07bfa47b4249043414bf5ab13c65f0dbb63f7a35e4c91a1a7" Apr 24 23:37:41.685214 containerd[1519]: 2026-04-24 23:37:41.675 [INFO][5629] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="436f63d90010a4b07bfa47b4249043414bf5ab13c65f0dbb63f7a35e4c91a1a7" HandleID="k8s-pod-network.436f63d90010a4b07bfa47b4249043414bf5ab13c65f0dbb63f7a35e4c91a1a7" Workload="ci--4081--3--6--n--6f01bfed3c-k8s-calico--apiserver--b8b79f554--vhwl6-eth0" Apr 24 23:37:41.685214 containerd[1519]: 2026-04-24 23:37:41.675 [INFO][5629] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 24 23:37:41.685214 containerd[1519]: 2026-04-24 23:37:41.675 [INFO][5629] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 24 23:37:41.685214 containerd[1519]: 2026-04-24 23:37:41.680 [WARNING][5629] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="436f63d90010a4b07bfa47b4249043414bf5ab13c65f0dbb63f7a35e4c91a1a7" HandleID="k8s-pod-network.436f63d90010a4b07bfa47b4249043414bf5ab13c65f0dbb63f7a35e4c91a1a7" Workload="ci--4081--3--6--n--6f01bfed3c-k8s-calico--apiserver--b8b79f554--vhwl6-eth0" Apr 24 23:37:41.685214 containerd[1519]: 2026-04-24 23:37:41.680 [INFO][5629] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="436f63d90010a4b07bfa47b4249043414bf5ab13c65f0dbb63f7a35e4c91a1a7" HandleID="k8s-pod-network.436f63d90010a4b07bfa47b4249043414bf5ab13c65f0dbb63f7a35e4c91a1a7" Workload="ci--4081--3--6--n--6f01bfed3c-k8s-calico--apiserver--b8b79f554--vhwl6-eth0" Apr 24 23:37:41.685214 containerd[1519]: 2026-04-24 23:37:41.681 [INFO][5629] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 24 23:37:41.685214 containerd[1519]: 2026-04-24 23:37:41.683 [INFO][5622] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="436f63d90010a4b07bfa47b4249043414bf5ab13c65f0dbb63f7a35e4c91a1a7" Apr 24 23:37:41.685641 containerd[1519]: time="2026-04-24T23:37:41.685248014Z" level=info msg="TearDown network for sandbox \"436f63d90010a4b07bfa47b4249043414bf5ab13c65f0dbb63f7a35e4c91a1a7\" successfully" Apr 24 23:37:41.685641 containerd[1519]: time="2026-04-24T23:37:41.685274194Z" level=info msg="StopPodSandbox for \"436f63d90010a4b07bfa47b4249043414bf5ab13c65f0dbb63f7a35e4c91a1a7\" returns successfully" Apr 24 23:37:41.685724 containerd[1519]: time="2026-04-24T23:37:41.685701598Z" level=info msg="RemovePodSandbox for \"436f63d90010a4b07bfa47b4249043414bf5ab13c65f0dbb63f7a35e4c91a1a7\"" Apr 24 23:37:41.685747 containerd[1519]: time="2026-04-24T23:37:41.685727169Z" level=info msg="Forcibly stopping sandbox \"436f63d90010a4b07bfa47b4249043414bf5ab13c65f0dbb63f7a35e4c91a1a7\"" Apr 24 23:37:41.752382 containerd[1519]: 2026-04-24 23:37:41.723 [WARNING][5643] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="436f63d90010a4b07bfa47b4249043414bf5ab13c65f0dbb63f7a35e4c91a1a7" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--6f01bfed3c-k8s-calico--apiserver--b8b79f554--vhwl6-eth0", GenerateName:"calico-apiserver-b8b79f554-", Namespace:"calico-system", SelfLink:"", UID:"7bc41b28-ad74-4dc3-b94f-b24620d3a9ff", ResourceVersion:"1079", Generation:0, CreationTimestamp:time.Date(2026, time.April, 24, 23, 36, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"b8b79f554", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-6f01bfed3c", ContainerID:"239aaa0ebe297dae632dcfa1e9fc6044076d42ea0b7eb1b966e95e775dea887e", Pod:"calico-apiserver-b8b79f554-vhwl6", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.35.72/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali6740876887e", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 24 23:37:41.752382 containerd[1519]: 2026-04-24 23:37:41.723 [INFO][5643] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="436f63d90010a4b07bfa47b4249043414bf5ab13c65f0dbb63f7a35e4c91a1a7" Apr 24 23:37:41.752382 containerd[1519]: 2026-04-24 23:37:41.723 [INFO][5643] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called 
with no netns name, ignoring. ContainerID="436f63d90010a4b07bfa47b4249043414bf5ab13c65f0dbb63f7a35e4c91a1a7" iface="eth0" netns="" Apr 24 23:37:41.752382 containerd[1519]: 2026-04-24 23:37:41.724 [INFO][5643] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="436f63d90010a4b07bfa47b4249043414bf5ab13c65f0dbb63f7a35e4c91a1a7" Apr 24 23:37:41.752382 containerd[1519]: 2026-04-24 23:37:41.724 [INFO][5643] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="436f63d90010a4b07bfa47b4249043414bf5ab13c65f0dbb63f7a35e4c91a1a7" Apr 24 23:37:41.752382 containerd[1519]: 2026-04-24 23:37:41.743 [INFO][5651] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="436f63d90010a4b07bfa47b4249043414bf5ab13c65f0dbb63f7a35e4c91a1a7" HandleID="k8s-pod-network.436f63d90010a4b07bfa47b4249043414bf5ab13c65f0dbb63f7a35e4c91a1a7" Workload="ci--4081--3--6--n--6f01bfed3c-k8s-calico--apiserver--b8b79f554--vhwl6-eth0" Apr 24 23:37:41.752382 containerd[1519]: 2026-04-24 23:37:41.743 [INFO][5651] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 24 23:37:41.752382 containerd[1519]: 2026-04-24 23:37:41.743 [INFO][5651] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 24 23:37:41.752382 containerd[1519]: 2026-04-24 23:37:41.747 [WARNING][5651] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="436f63d90010a4b07bfa47b4249043414bf5ab13c65f0dbb63f7a35e4c91a1a7" HandleID="k8s-pod-network.436f63d90010a4b07bfa47b4249043414bf5ab13c65f0dbb63f7a35e4c91a1a7" Workload="ci--4081--3--6--n--6f01bfed3c-k8s-calico--apiserver--b8b79f554--vhwl6-eth0" Apr 24 23:37:41.752382 containerd[1519]: 2026-04-24 23:37:41.747 [INFO][5651] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="436f63d90010a4b07bfa47b4249043414bf5ab13c65f0dbb63f7a35e4c91a1a7" HandleID="k8s-pod-network.436f63d90010a4b07bfa47b4249043414bf5ab13c65f0dbb63f7a35e4c91a1a7" Workload="ci--4081--3--6--n--6f01bfed3c-k8s-calico--apiserver--b8b79f554--vhwl6-eth0" Apr 24 23:37:41.752382 containerd[1519]: 2026-04-24 23:37:41.748 [INFO][5651] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 24 23:37:41.752382 containerd[1519]: 2026-04-24 23:37:41.750 [INFO][5643] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="436f63d90010a4b07bfa47b4249043414bf5ab13c65f0dbb63f7a35e4c91a1a7" Apr 24 23:37:41.752382 containerd[1519]: time="2026-04-24T23:37:41.752355504Z" level=info msg="TearDown network for sandbox \"436f63d90010a4b07bfa47b4249043414bf5ab13c65f0dbb63f7a35e4c91a1a7\" successfully" Apr 24 23:37:41.755916 containerd[1519]: time="2026-04-24T23:37:41.755882604Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"436f63d90010a4b07bfa47b4249043414bf5ab13c65f0dbb63f7a35e4c91a1a7\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Apr 24 23:37:41.755991 containerd[1519]: time="2026-04-24T23:37:41.755930594Z" level=info msg="RemovePodSandbox \"436f63d90010a4b07bfa47b4249043414bf5ab13c65f0dbb63f7a35e4c91a1a7\" returns successfully" Apr 24 23:37:41.756366 containerd[1519]: time="2026-04-24T23:37:41.756347412Z" level=info msg="StopPodSandbox for \"c22a8a4244023b68d3743ed8e5f6920d50390e91544dcef20cfa3f9f4f630120\"" Apr 24 23:37:41.807918 containerd[1519]: 2026-04-24 23:37:41.781 [WARNING][5666] cni-plugin/k8s.go 610: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="c22a8a4244023b68d3743ed8e5f6920d50390e91544dcef20cfa3f9f4f630120" WorkloadEndpoint="ci--4081--3--6--n--6f01bfed3c-k8s-whisker--747cf497b9--tkkg6-eth0" Apr 24 23:37:41.807918 containerd[1519]: 2026-04-24 23:37:41.781 [INFO][5666] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="c22a8a4244023b68d3743ed8e5f6920d50390e91544dcef20cfa3f9f4f630120" Apr 24 23:37:41.807918 containerd[1519]: 2026-04-24 23:37:41.781 [INFO][5666] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="c22a8a4244023b68d3743ed8e5f6920d50390e91544dcef20cfa3f9f4f630120" iface="eth0" netns="" Apr 24 23:37:41.807918 containerd[1519]: 2026-04-24 23:37:41.781 [INFO][5666] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="c22a8a4244023b68d3743ed8e5f6920d50390e91544dcef20cfa3f9f4f630120" Apr 24 23:37:41.807918 containerd[1519]: 2026-04-24 23:37:41.781 [INFO][5666] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="c22a8a4244023b68d3743ed8e5f6920d50390e91544dcef20cfa3f9f4f630120" Apr 24 23:37:41.807918 containerd[1519]: 2026-04-24 23:37:41.798 [INFO][5673] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="c22a8a4244023b68d3743ed8e5f6920d50390e91544dcef20cfa3f9f4f630120" HandleID="k8s-pod-network.c22a8a4244023b68d3743ed8e5f6920d50390e91544dcef20cfa3f9f4f630120" Workload="ci--4081--3--6--n--6f01bfed3c-k8s-whisker--747cf497b9--tkkg6-eth0" Apr 24 23:37:41.807918 containerd[1519]: 2026-04-24 23:37:41.798 [INFO][5673] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 24 23:37:41.807918 containerd[1519]: 2026-04-24 23:37:41.799 [INFO][5673] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 24 23:37:41.807918 containerd[1519]: 2026-04-24 23:37:41.803 [WARNING][5673] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="c22a8a4244023b68d3743ed8e5f6920d50390e91544dcef20cfa3f9f4f630120" HandleID="k8s-pod-network.c22a8a4244023b68d3743ed8e5f6920d50390e91544dcef20cfa3f9f4f630120" Workload="ci--4081--3--6--n--6f01bfed3c-k8s-whisker--747cf497b9--tkkg6-eth0" Apr 24 23:37:41.807918 containerd[1519]: 2026-04-24 23:37:41.803 [INFO][5673] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="c22a8a4244023b68d3743ed8e5f6920d50390e91544dcef20cfa3f9f4f630120" HandleID="k8s-pod-network.c22a8a4244023b68d3743ed8e5f6920d50390e91544dcef20cfa3f9f4f630120" Workload="ci--4081--3--6--n--6f01bfed3c-k8s-whisker--747cf497b9--tkkg6-eth0" Apr 24 23:37:41.807918 containerd[1519]: 2026-04-24 23:37:41.804 [INFO][5673] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 24 23:37:41.807918 containerd[1519]: 2026-04-24 23:37:41.806 [INFO][5666] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="c22a8a4244023b68d3743ed8e5f6920d50390e91544dcef20cfa3f9f4f630120" Apr 24 23:37:41.808355 containerd[1519]: time="2026-04-24T23:37:41.807997225Z" level=info msg="TearDown network for sandbox \"c22a8a4244023b68d3743ed8e5f6920d50390e91544dcef20cfa3f9f4f630120\" successfully" Apr 24 23:37:41.808355 containerd[1519]: time="2026-04-24T23:37:41.808015292Z" level=info msg="StopPodSandbox for \"c22a8a4244023b68d3743ed8e5f6920d50390e91544dcef20cfa3f9f4f630120\" returns successfully" Apr 24 23:37:41.808521 containerd[1519]: time="2026-04-24T23:37:41.808499149Z" level=info msg="RemovePodSandbox for \"c22a8a4244023b68d3743ed8e5f6920d50390e91544dcef20cfa3f9f4f630120\"" Apr 24 23:37:41.808521 containerd[1519]: time="2026-04-24T23:37:41.808519387Z" level=info msg="Forcibly stopping sandbox \"c22a8a4244023b68d3743ed8e5f6920d50390e91544dcef20cfa3f9f4f630120\"" Apr 24 23:37:41.854631 containerd[1519]: 2026-04-24 23:37:41.831 [WARNING][5687] cni-plugin/k8s.go 610: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up 
ContainerID="c22a8a4244023b68d3743ed8e5f6920d50390e91544dcef20cfa3f9f4f630120" WorkloadEndpoint="ci--4081--3--6--n--6f01bfed3c-k8s-whisker--747cf497b9--tkkg6-eth0" Apr 24 23:37:41.854631 containerd[1519]: 2026-04-24 23:37:41.831 [INFO][5687] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="c22a8a4244023b68d3743ed8e5f6920d50390e91544dcef20cfa3f9f4f630120" Apr 24 23:37:41.854631 containerd[1519]: 2026-04-24 23:37:41.831 [INFO][5687] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="c22a8a4244023b68d3743ed8e5f6920d50390e91544dcef20cfa3f9f4f630120" iface="eth0" netns="" Apr 24 23:37:41.854631 containerd[1519]: 2026-04-24 23:37:41.831 [INFO][5687] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="c22a8a4244023b68d3743ed8e5f6920d50390e91544dcef20cfa3f9f4f630120" Apr 24 23:37:41.854631 containerd[1519]: 2026-04-24 23:37:41.831 [INFO][5687] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="c22a8a4244023b68d3743ed8e5f6920d50390e91544dcef20cfa3f9f4f630120" Apr 24 23:37:41.854631 containerd[1519]: 2026-04-24 23:37:41.845 [INFO][5695] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="c22a8a4244023b68d3743ed8e5f6920d50390e91544dcef20cfa3f9f4f630120" HandleID="k8s-pod-network.c22a8a4244023b68d3743ed8e5f6920d50390e91544dcef20cfa3f9f4f630120" Workload="ci--4081--3--6--n--6f01bfed3c-k8s-whisker--747cf497b9--tkkg6-eth0" Apr 24 23:37:41.854631 containerd[1519]: 2026-04-24 23:37:41.846 [INFO][5695] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 24 23:37:41.854631 containerd[1519]: 2026-04-24 23:37:41.846 [INFO][5695] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 24 23:37:41.854631 containerd[1519]: 2026-04-24 23:37:41.850 [WARNING][5695] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="c22a8a4244023b68d3743ed8e5f6920d50390e91544dcef20cfa3f9f4f630120" HandleID="k8s-pod-network.c22a8a4244023b68d3743ed8e5f6920d50390e91544dcef20cfa3f9f4f630120" Workload="ci--4081--3--6--n--6f01bfed3c-k8s-whisker--747cf497b9--tkkg6-eth0" Apr 24 23:37:41.854631 containerd[1519]: 2026-04-24 23:37:41.850 [INFO][5695] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="c22a8a4244023b68d3743ed8e5f6920d50390e91544dcef20cfa3f9f4f630120" HandleID="k8s-pod-network.c22a8a4244023b68d3743ed8e5f6920d50390e91544dcef20cfa3f9f4f630120" Workload="ci--4081--3--6--n--6f01bfed3c-k8s-whisker--747cf497b9--tkkg6-eth0" Apr 24 23:37:41.854631 containerd[1519]: 2026-04-24 23:37:41.851 [INFO][5695] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 24 23:37:41.854631 containerd[1519]: 2026-04-24 23:37:41.852 [INFO][5687] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="c22a8a4244023b68d3743ed8e5f6920d50390e91544dcef20cfa3f9f4f630120" Apr 24 23:37:41.854631 containerd[1519]: time="2026-04-24T23:37:41.854599441Z" level=info msg="TearDown network for sandbox \"c22a8a4244023b68d3743ed8e5f6920d50390e91544dcef20cfa3f9f4f630120\" successfully" Apr 24 23:37:41.858290 containerd[1519]: time="2026-04-24T23:37:41.858264127Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"c22a8a4244023b68d3743ed8e5f6920d50390e91544dcef20cfa3f9f4f630120\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Apr 24 23:37:41.858359 containerd[1519]: time="2026-04-24T23:37:41.858308765Z" level=info msg="RemovePodSandbox \"c22a8a4244023b68d3743ed8e5f6920d50390e91544dcef20cfa3f9f4f630120\" returns successfully" Apr 24 23:37:41.858946 containerd[1519]: time="2026-04-24T23:37:41.858743512Z" level=info msg="StopPodSandbox for \"ca9a9a8d8075ef0186a0d1ee248aa46080085846c9740d194dc8b6d6023cfeeb\"" Apr 24 23:37:41.912918 containerd[1519]: 2026-04-24 23:37:41.885 [WARNING][5709] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="ca9a9a8d8075ef0186a0d1ee248aa46080085846c9740d194dc8b6d6023cfeeb" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--6f01bfed3c-k8s-goldmane--9f7667bb8--j6dz5-eth0", GenerateName:"goldmane-9f7667bb8-", Namespace:"calico-system", SelfLink:"", UID:"133141ef-2c84-4659-9f62-e85180cc6f03", ResourceVersion:"1017", Generation:0, CreationTimestamp:time.Date(2026, time.April, 24, 23, 36, 58, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"9f7667bb8", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-6f01bfed3c", ContainerID:"53d875b5d4c1419ca5cd643dcf2b8a913dd08faec3d8e1bfc9352c1227b794a8", Pod:"goldmane-9f7667bb8-j6dz5", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.35.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, 
InterfaceName:"cali90cc7c488ec", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 24 23:37:41.912918 containerd[1519]: 2026-04-24 23:37:41.885 [INFO][5709] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="ca9a9a8d8075ef0186a0d1ee248aa46080085846c9740d194dc8b6d6023cfeeb" Apr 24 23:37:41.912918 containerd[1519]: 2026-04-24 23:37:41.885 [INFO][5709] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="ca9a9a8d8075ef0186a0d1ee248aa46080085846c9740d194dc8b6d6023cfeeb" iface="eth0" netns="" Apr 24 23:37:41.912918 containerd[1519]: 2026-04-24 23:37:41.885 [INFO][5709] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="ca9a9a8d8075ef0186a0d1ee248aa46080085846c9740d194dc8b6d6023cfeeb" Apr 24 23:37:41.912918 containerd[1519]: 2026-04-24 23:37:41.886 [INFO][5709] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="ca9a9a8d8075ef0186a0d1ee248aa46080085846c9740d194dc8b6d6023cfeeb" Apr 24 23:37:41.912918 containerd[1519]: 2026-04-24 23:37:41.903 [INFO][5716] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="ca9a9a8d8075ef0186a0d1ee248aa46080085846c9740d194dc8b6d6023cfeeb" HandleID="k8s-pod-network.ca9a9a8d8075ef0186a0d1ee248aa46080085846c9740d194dc8b6d6023cfeeb" Workload="ci--4081--3--6--n--6f01bfed3c-k8s-goldmane--9f7667bb8--j6dz5-eth0" Apr 24 23:37:41.912918 containerd[1519]: 2026-04-24 23:37:41.903 [INFO][5716] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 24 23:37:41.912918 containerd[1519]: 2026-04-24 23:37:41.903 [INFO][5716] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 24 23:37:41.912918 containerd[1519]: 2026-04-24 23:37:41.907 [WARNING][5716] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="ca9a9a8d8075ef0186a0d1ee248aa46080085846c9740d194dc8b6d6023cfeeb" HandleID="k8s-pod-network.ca9a9a8d8075ef0186a0d1ee248aa46080085846c9740d194dc8b6d6023cfeeb" Workload="ci--4081--3--6--n--6f01bfed3c-k8s-goldmane--9f7667bb8--j6dz5-eth0" Apr 24 23:37:41.912918 containerd[1519]: 2026-04-24 23:37:41.907 [INFO][5716] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="ca9a9a8d8075ef0186a0d1ee248aa46080085846c9740d194dc8b6d6023cfeeb" HandleID="k8s-pod-network.ca9a9a8d8075ef0186a0d1ee248aa46080085846c9740d194dc8b6d6023cfeeb" Workload="ci--4081--3--6--n--6f01bfed3c-k8s-goldmane--9f7667bb8--j6dz5-eth0" Apr 24 23:37:41.912918 containerd[1519]: 2026-04-24 23:37:41.909 [INFO][5716] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 24 23:37:41.912918 containerd[1519]: 2026-04-24 23:37:41.911 [INFO][5709] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="ca9a9a8d8075ef0186a0d1ee248aa46080085846c9740d194dc8b6d6023cfeeb" Apr 24 23:37:41.913269 containerd[1519]: time="2026-04-24T23:37:41.912952432Z" level=info msg="TearDown network for sandbox \"ca9a9a8d8075ef0186a0d1ee248aa46080085846c9740d194dc8b6d6023cfeeb\" successfully" Apr 24 23:37:41.913269 containerd[1519]: time="2026-04-24T23:37:41.912973240Z" level=info msg="StopPodSandbox for \"ca9a9a8d8075ef0186a0d1ee248aa46080085846c9740d194dc8b6d6023cfeeb\" returns successfully" Apr 24 23:37:41.913731 containerd[1519]: time="2026-04-24T23:37:41.913488459Z" level=info msg="RemovePodSandbox for \"ca9a9a8d8075ef0186a0d1ee248aa46080085846c9740d194dc8b6d6023cfeeb\"" Apr 24 23:37:41.913731 containerd[1519]: time="2026-04-24T23:37:41.913513689Z" level=info msg="Forcibly stopping sandbox \"ca9a9a8d8075ef0186a0d1ee248aa46080085846c9740d194dc8b6d6023cfeeb\"" Apr 24 23:37:41.965433 containerd[1519]: 2026-04-24 23:37:41.940 [WARNING][5731] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="ca9a9a8d8075ef0186a0d1ee248aa46080085846c9740d194dc8b6d6023cfeeb" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--6f01bfed3c-k8s-goldmane--9f7667bb8--j6dz5-eth0", GenerateName:"goldmane-9f7667bb8-", Namespace:"calico-system", SelfLink:"", UID:"133141ef-2c84-4659-9f62-e85180cc6f03", ResourceVersion:"1017", Generation:0, CreationTimestamp:time.Date(2026, time.April, 24, 23, 36, 58, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"9f7667bb8", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-6f01bfed3c", ContainerID:"53d875b5d4c1419ca5cd643dcf2b8a913dd08faec3d8e1bfc9352c1227b794a8", Pod:"goldmane-9f7667bb8-j6dz5", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.35.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali90cc7c488ec", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 24 23:37:41.965433 containerd[1519]: 2026-04-24 23:37:41.940 [INFO][5731] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="ca9a9a8d8075ef0186a0d1ee248aa46080085846c9740d194dc8b6d6023cfeeb" Apr 24 23:37:41.965433 containerd[1519]: 2026-04-24 23:37:41.940 [INFO][5731] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="ca9a9a8d8075ef0186a0d1ee248aa46080085846c9740d194dc8b6d6023cfeeb" iface="eth0" netns="" Apr 24 23:37:41.965433 containerd[1519]: 2026-04-24 23:37:41.940 [INFO][5731] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="ca9a9a8d8075ef0186a0d1ee248aa46080085846c9740d194dc8b6d6023cfeeb" Apr 24 23:37:41.965433 containerd[1519]: 2026-04-24 23:37:41.940 [INFO][5731] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="ca9a9a8d8075ef0186a0d1ee248aa46080085846c9740d194dc8b6d6023cfeeb" Apr 24 23:37:41.965433 containerd[1519]: 2026-04-24 23:37:41.955 [INFO][5738] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="ca9a9a8d8075ef0186a0d1ee248aa46080085846c9740d194dc8b6d6023cfeeb" HandleID="k8s-pod-network.ca9a9a8d8075ef0186a0d1ee248aa46080085846c9740d194dc8b6d6023cfeeb" Workload="ci--4081--3--6--n--6f01bfed3c-k8s-goldmane--9f7667bb8--j6dz5-eth0" Apr 24 23:37:41.965433 containerd[1519]: 2026-04-24 23:37:41.955 [INFO][5738] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 24 23:37:41.965433 containerd[1519]: 2026-04-24 23:37:41.955 [INFO][5738] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 24 23:37:41.965433 containerd[1519]: 2026-04-24 23:37:41.960 [WARNING][5738] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="ca9a9a8d8075ef0186a0d1ee248aa46080085846c9740d194dc8b6d6023cfeeb" HandleID="k8s-pod-network.ca9a9a8d8075ef0186a0d1ee248aa46080085846c9740d194dc8b6d6023cfeeb" Workload="ci--4081--3--6--n--6f01bfed3c-k8s-goldmane--9f7667bb8--j6dz5-eth0" Apr 24 23:37:41.965433 containerd[1519]: 2026-04-24 23:37:41.960 [INFO][5738] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="ca9a9a8d8075ef0186a0d1ee248aa46080085846c9740d194dc8b6d6023cfeeb" HandleID="k8s-pod-network.ca9a9a8d8075ef0186a0d1ee248aa46080085846c9740d194dc8b6d6023cfeeb" Workload="ci--4081--3--6--n--6f01bfed3c-k8s-goldmane--9f7667bb8--j6dz5-eth0" Apr 24 23:37:41.965433 containerd[1519]: 2026-04-24 23:37:41.962 [INFO][5738] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 24 23:37:41.965433 containerd[1519]: 2026-04-24 23:37:41.963 [INFO][5731] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="ca9a9a8d8075ef0186a0d1ee248aa46080085846c9740d194dc8b6d6023cfeeb" Apr 24 23:37:41.965780 containerd[1519]: time="2026-04-24T23:37:41.965462994Z" level=info msg="TearDown network for sandbox \"ca9a9a8d8075ef0186a0d1ee248aa46080085846c9740d194dc8b6d6023cfeeb\" successfully" Apr 24 23:37:41.973111 containerd[1519]: time="2026-04-24T23:37:41.973002831Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"ca9a9a8d8075ef0186a0d1ee248aa46080085846c9740d194dc8b6d6023cfeeb\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Apr 24 23:37:41.973111 containerd[1519]: time="2026-04-24T23:37:41.973106323Z" level=info msg="RemovePodSandbox \"ca9a9a8d8075ef0186a0d1ee248aa46080085846c9740d194dc8b6d6023cfeeb\" returns successfully" Apr 24 23:37:41.973621 containerd[1519]: time="2026-04-24T23:37:41.973597021Z" level=info msg="StopPodSandbox for \"b5a84b94e2911505d7c2e27ffa2fcc1aad33a354d0eb8f5e6d248f50bed41bc5\"" Apr 24 23:37:42.029444 containerd[1519]: 2026-04-24 23:37:42.001 [WARNING][5753] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="b5a84b94e2911505d7c2e27ffa2fcc1aad33a354d0eb8f5e6d248f50bed41bc5" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--6f01bfed3c-k8s-csi--node--driver--kfkvn-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"b8636bb0-afe6-4f7c-8c11-499ada35dc8f", ResourceVersion:"1047", Generation:0, CreationTimestamp:time.Date(2026, time.April, 24, 23, 36, 58, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"589b8b8d94", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-6f01bfed3c", ContainerID:"4240fcaef762194107e0245631c641a0ff453eba7640d45f73bbcc2947358682", Pod:"csi-node-driver-kfkvn", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.35.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", 
IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali4a65f30faf4", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 24 23:37:42.029444 containerd[1519]: 2026-04-24 23:37:42.002 [INFO][5753] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="b5a84b94e2911505d7c2e27ffa2fcc1aad33a354d0eb8f5e6d248f50bed41bc5" Apr 24 23:37:42.029444 containerd[1519]: 2026-04-24 23:37:42.002 [INFO][5753] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="b5a84b94e2911505d7c2e27ffa2fcc1aad33a354d0eb8f5e6d248f50bed41bc5" iface="eth0" netns="" Apr 24 23:37:42.029444 containerd[1519]: 2026-04-24 23:37:42.002 [INFO][5753] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="b5a84b94e2911505d7c2e27ffa2fcc1aad33a354d0eb8f5e6d248f50bed41bc5" Apr 24 23:37:42.029444 containerd[1519]: 2026-04-24 23:37:42.002 [INFO][5753] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="b5a84b94e2911505d7c2e27ffa2fcc1aad33a354d0eb8f5e6d248f50bed41bc5" Apr 24 23:37:42.029444 containerd[1519]: 2026-04-24 23:37:42.017 [INFO][5761] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="b5a84b94e2911505d7c2e27ffa2fcc1aad33a354d0eb8f5e6d248f50bed41bc5" HandleID="k8s-pod-network.b5a84b94e2911505d7c2e27ffa2fcc1aad33a354d0eb8f5e6d248f50bed41bc5" Workload="ci--4081--3--6--n--6f01bfed3c-k8s-csi--node--driver--kfkvn-eth0" Apr 24 23:37:42.029444 containerd[1519]: 2026-04-24 23:37:42.018 [INFO][5761] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 24 23:37:42.029444 containerd[1519]: 2026-04-24 23:37:42.018 [INFO][5761] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 24 23:37:42.029444 containerd[1519]: 2026-04-24 23:37:42.023 [WARNING][5761] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="b5a84b94e2911505d7c2e27ffa2fcc1aad33a354d0eb8f5e6d248f50bed41bc5" HandleID="k8s-pod-network.b5a84b94e2911505d7c2e27ffa2fcc1aad33a354d0eb8f5e6d248f50bed41bc5" Workload="ci--4081--3--6--n--6f01bfed3c-k8s-csi--node--driver--kfkvn-eth0" Apr 24 23:37:42.029444 containerd[1519]: 2026-04-24 23:37:42.023 [INFO][5761] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="b5a84b94e2911505d7c2e27ffa2fcc1aad33a354d0eb8f5e6d248f50bed41bc5" HandleID="k8s-pod-network.b5a84b94e2911505d7c2e27ffa2fcc1aad33a354d0eb8f5e6d248f50bed41bc5" Workload="ci--4081--3--6--n--6f01bfed3c-k8s-csi--node--driver--kfkvn-eth0" Apr 24 23:37:42.029444 containerd[1519]: 2026-04-24 23:37:42.024 [INFO][5761] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 24 23:37:42.029444 containerd[1519]: 2026-04-24 23:37:42.026 [INFO][5753] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="b5a84b94e2911505d7c2e27ffa2fcc1aad33a354d0eb8f5e6d248f50bed41bc5" Apr 24 23:37:42.029444 containerd[1519]: time="2026-04-24T23:37:42.028045780Z" level=info msg="TearDown network for sandbox \"b5a84b94e2911505d7c2e27ffa2fcc1aad33a354d0eb8f5e6d248f50bed41bc5\" successfully" Apr 24 23:37:42.029444 containerd[1519]: time="2026-04-24T23:37:42.028067068Z" level=info msg="StopPodSandbox for \"b5a84b94e2911505d7c2e27ffa2fcc1aad33a354d0eb8f5e6d248f50bed41bc5\" returns successfully" Apr 24 23:37:42.029444 containerd[1519]: time="2026-04-24T23:37:42.028633026Z" level=info msg="RemovePodSandbox for \"b5a84b94e2911505d7c2e27ffa2fcc1aad33a354d0eb8f5e6d248f50bed41bc5\"" Apr 24 23:37:42.029444 containerd[1519]: time="2026-04-24T23:37:42.028653624Z" level=info msg="Forcibly stopping sandbox \"b5a84b94e2911505d7c2e27ffa2fcc1aad33a354d0eb8f5e6d248f50bed41bc5\"" Apr 24 23:37:42.083828 containerd[1519]: 2026-04-24 23:37:42.059 [WARNING][5775] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="b5a84b94e2911505d7c2e27ffa2fcc1aad33a354d0eb8f5e6d248f50bed41bc5" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--6f01bfed3c-k8s-csi--node--driver--kfkvn-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"b8636bb0-afe6-4f7c-8c11-499ada35dc8f", ResourceVersion:"1047", Generation:0, CreationTimestamp:time.Date(2026, time.April, 24, 23, 36, 58, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"589b8b8d94", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-6f01bfed3c", ContainerID:"4240fcaef762194107e0245631c641a0ff453eba7640d45f73bbcc2947358682", Pod:"csi-node-driver-kfkvn", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.35.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali4a65f30faf4", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 24 23:37:42.083828 containerd[1519]: 2026-04-24 23:37:42.059 [INFO][5775] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="b5a84b94e2911505d7c2e27ffa2fcc1aad33a354d0eb8f5e6d248f50bed41bc5" Apr 24 23:37:42.083828 containerd[1519]: 2026-04-24 23:37:42.059 [INFO][5775] cni-plugin/dataplane_linux.go 555: CleanUpNamespace 
called with no netns name, ignoring. ContainerID="b5a84b94e2911505d7c2e27ffa2fcc1aad33a354d0eb8f5e6d248f50bed41bc5" iface="eth0" netns="" Apr 24 23:37:42.083828 containerd[1519]: 2026-04-24 23:37:42.059 [INFO][5775] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="b5a84b94e2911505d7c2e27ffa2fcc1aad33a354d0eb8f5e6d248f50bed41bc5" Apr 24 23:37:42.083828 containerd[1519]: 2026-04-24 23:37:42.059 [INFO][5775] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="b5a84b94e2911505d7c2e27ffa2fcc1aad33a354d0eb8f5e6d248f50bed41bc5" Apr 24 23:37:42.083828 containerd[1519]: 2026-04-24 23:37:42.074 [INFO][5782] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="b5a84b94e2911505d7c2e27ffa2fcc1aad33a354d0eb8f5e6d248f50bed41bc5" HandleID="k8s-pod-network.b5a84b94e2911505d7c2e27ffa2fcc1aad33a354d0eb8f5e6d248f50bed41bc5" Workload="ci--4081--3--6--n--6f01bfed3c-k8s-csi--node--driver--kfkvn-eth0" Apr 24 23:37:42.083828 containerd[1519]: 2026-04-24 23:37:42.075 [INFO][5782] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 24 23:37:42.083828 containerd[1519]: 2026-04-24 23:37:42.075 [INFO][5782] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 24 23:37:42.083828 containerd[1519]: 2026-04-24 23:37:42.079 [WARNING][5782] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="b5a84b94e2911505d7c2e27ffa2fcc1aad33a354d0eb8f5e6d248f50bed41bc5" HandleID="k8s-pod-network.b5a84b94e2911505d7c2e27ffa2fcc1aad33a354d0eb8f5e6d248f50bed41bc5" Workload="ci--4081--3--6--n--6f01bfed3c-k8s-csi--node--driver--kfkvn-eth0" Apr 24 23:37:42.083828 containerd[1519]: 2026-04-24 23:37:42.079 [INFO][5782] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="b5a84b94e2911505d7c2e27ffa2fcc1aad33a354d0eb8f5e6d248f50bed41bc5" HandleID="k8s-pod-network.b5a84b94e2911505d7c2e27ffa2fcc1aad33a354d0eb8f5e6d248f50bed41bc5" Workload="ci--4081--3--6--n--6f01bfed3c-k8s-csi--node--driver--kfkvn-eth0" Apr 24 23:37:42.083828 containerd[1519]: 2026-04-24 23:37:42.080 [INFO][5782] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 24 23:37:42.083828 containerd[1519]: 2026-04-24 23:37:42.082 [INFO][5775] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="b5a84b94e2911505d7c2e27ffa2fcc1aad33a354d0eb8f5e6d248f50bed41bc5" Apr 24 23:37:42.084156 containerd[1519]: time="2026-04-24T23:37:42.083847215Z" level=info msg="TearDown network for sandbox \"b5a84b94e2911505d7c2e27ffa2fcc1aad33a354d0eb8f5e6d248f50bed41bc5\" successfully" Apr 24 23:37:42.087548 containerd[1519]: time="2026-04-24T23:37:42.087526874Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"b5a84b94e2911505d7c2e27ffa2fcc1aad33a354d0eb8f5e6d248f50bed41bc5\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Apr 24 23:37:42.087605 containerd[1519]: time="2026-04-24T23:37:42.087571011Z" level=info msg="RemovePodSandbox \"b5a84b94e2911505d7c2e27ffa2fcc1aad33a354d0eb8f5e6d248f50bed41bc5\" returns successfully" Apr 24 23:37:42.088176 containerd[1519]: time="2026-04-24T23:37:42.087960572Z" level=info msg="StopPodSandbox for \"fa684ee016e7b8f4346eadb33b70fe00c046fd186c92942b1887a6a0d224c38c\"" Apr 24 23:37:42.138786 containerd[1519]: 2026-04-24 23:37:42.111 [WARNING][5797] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="fa684ee016e7b8f4346eadb33b70fe00c046fd186c92942b1887a6a0d224c38c" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--6f01bfed3c-k8s-coredns--7d764666f9--rvpdv-eth0", GenerateName:"coredns-7d764666f9-", Namespace:"kube-system", SelfLink:"", UID:"df938cae-ab41-409b-8afe-e6b3147b7b45", ResourceVersion:"997", Generation:0, CreationTimestamp:time.Date(2026, time.April, 24, 23, 36, 48, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7d764666f9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-6f01bfed3c", ContainerID:"68112725d8ccc659c2a807c2feae3b6eb012640c8ef52b9dbd5a998af67ddb01", Pod:"coredns-7d764666f9-rvpdv", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.35.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali84144e67349", MAC:"", 
Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 24 23:37:42.138786 containerd[1519]: 2026-04-24 23:37:42.111 [INFO][5797] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="fa684ee016e7b8f4346eadb33b70fe00c046fd186c92942b1887a6a0d224c38c" Apr 24 23:37:42.138786 containerd[1519]: 2026-04-24 23:37:42.111 [INFO][5797] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="fa684ee016e7b8f4346eadb33b70fe00c046fd186c92942b1887a6a0d224c38c" iface="eth0" netns="" Apr 24 23:37:42.138786 containerd[1519]: 2026-04-24 23:37:42.111 [INFO][5797] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="fa684ee016e7b8f4346eadb33b70fe00c046fd186c92942b1887a6a0d224c38c" Apr 24 23:37:42.138786 containerd[1519]: 2026-04-24 23:37:42.111 [INFO][5797] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="fa684ee016e7b8f4346eadb33b70fe00c046fd186c92942b1887a6a0d224c38c" Apr 24 23:37:42.138786 containerd[1519]: 2026-04-24 23:37:42.129 [INFO][5805] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="fa684ee016e7b8f4346eadb33b70fe00c046fd186c92942b1887a6a0d224c38c" HandleID="k8s-pod-network.fa684ee016e7b8f4346eadb33b70fe00c046fd186c92942b1887a6a0d224c38c" Workload="ci--4081--3--6--n--6f01bfed3c-k8s-coredns--7d764666f9--rvpdv-eth0" Apr 24 23:37:42.138786 containerd[1519]: 2026-04-24 23:37:42.129 [INFO][5805] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 24 23:37:42.138786 containerd[1519]: 2026-04-24 23:37:42.129 [INFO][5805] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 24 23:37:42.138786 containerd[1519]: 2026-04-24 23:37:42.134 [WARNING][5805] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="fa684ee016e7b8f4346eadb33b70fe00c046fd186c92942b1887a6a0d224c38c" HandleID="k8s-pod-network.fa684ee016e7b8f4346eadb33b70fe00c046fd186c92942b1887a6a0d224c38c" Workload="ci--4081--3--6--n--6f01bfed3c-k8s-coredns--7d764666f9--rvpdv-eth0" Apr 24 23:37:42.138786 containerd[1519]: 2026-04-24 23:37:42.134 [INFO][5805] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="fa684ee016e7b8f4346eadb33b70fe00c046fd186c92942b1887a6a0d224c38c" HandleID="k8s-pod-network.fa684ee016e7b8f4346eadb33b70fe00c046fd186c92942b1887a6a0d224c38c" Workload="ci--4081--3--6--n--6f01bfed3c-k8s-coredns--7d764666f9--rvpdv-eth0" Apr 24 23:37:42.138786 containerd[1519]: 2026-04-24 23:37:42.135 [INFO][5805] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 24 23:37:42.138786 containerd[1519]: 2026-04-24 23:37:42.137 [INFO][5797] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="fa684ee016e7b8f4346eadb33b70fe00c046fd186c92942b1887a6a0d224c38c" Apr 24 23:37:42.139258 containerd[1519]: time="2026-04-24T23:37:42.139225148Z" level=info msg="TearDown network for sandbox \"fa684ee016e7b8f4346eadb33b70fe00c046fd186c92942b1887a6a0d224c38c\" successfully" Apr 24 23:37:42.139258 containerd[1519]: time="2026-04-24T23:37:42.139252629Z" level=info msg="StopPodSandbox for \"fa684ee016e7b8f4346eadb33b70fe00c046fd186c92942b1887a6a0d224c38c\" returns successfully" Apr 24 23:37:42.139798 containerd[1519]: time="2026-04-24T23:37:42.139705794Z" level=info msg="RemovePodSandbox for \"fa684ee016e7b8f4346eadb33b70fe00c046fd186c92942b1887a6a0d224c38c\"" Apr 24 23:37:42.139798 containerd[1519]: time="2026-04-24T23:37:42.139727232Z" level=info msg="Forcibly stopping sandbox \"fa684ee016e7b8f4346eadb33b70fe00c046fd186c92942b1887a6a0d224c38c\"" Apr 24 23:37:42.192971 containerd[1519]: 2026-04-24 23:37:42.166 [WARNING][5820] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="fa684ee016e7b8f4346eadb33b70fe00c046fd186c92942b1887a6a0d224c38c" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--6f01bfed3c-k8s-coredns--7d764666f9--rvpdv-eth0", GenerateName:"coredns-7d764666f9-", Namespace:"kube-system", SelfLink:"", UID:"df938cae-ab41-409b-8afe-e6b3147b7b45", ResourceVersion:"997", Generation:0, CreationTimestamp:time.Date(2026, time.April, 24, 23, 36, 48, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7d764666f9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-6f01bfed3c", ContainerID:"68112725d8ccc659c2a807c2feae3b6eb012640c8ef52b9dbd5a998af67ddb01", Pod:"coredns-7d764666f9-rvpdv", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.35.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali84144e67349", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, 
HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 24 23:37:42.192971 containerd[1519]: 2026-04-24 23:37:42.166 [INFO][5820] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="fa684ee016e7b8f4346eadb33b70fe00c046fd186c92942b1887a6a0d224c38c" Apr 24 23:37:42.192971 containerd[1519]: 2026-04-24 23:37:42.166 [INFO][5820] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="fa684ee016e7b8f4346eadb33b70fe00c046fd186c92942b1887a6a0d224c38c" iface="eth0" netns="" Apr 24 23:37:42.192971 containerd[1519]: 2026-04-24 23:37:42.166 [INFO][5820] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="fa684ee016e7b8f4346eadb33b70fe00c046fd186c92942b1887a6a0d224c38c" Apr 24 23:37:42.192971 containerd[1519]: 2026-04-24 23:37:42.166 [INFO][5820] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="fa684ee016e7b8f4346eadb33b70fe00c046fd186c92942b1887a6a0d224c38c" Apr 24 23:37:42.192971 containerd[1519]: 2026-04-24 23:37:42.182 [INFO][5828] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="fa684ee016e7b8f4346eadb33b70fe00c046fd186c92942b1887a6a0d224c38c" HandleID="k8s-pod-network.fa684ee016e7b8f4346eadb33b70fe00c046fd186c92942b1887a6a0d224c38c" Workload="ci--4081--3--6--n--6f01bfed3c-k8s-coredns--7d764666f9--rvpdv-eth0" Apr 24 23:37:42.192971 containerd[1519]: 2026-04-24 23:37:42.183 [INFO][5828] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 24 23:37:42.192971 containerd[1519]: 2026-04-24 23:37:42.183 [INFO][5828] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 24 23:37:42.192971 containerd[1519]: 2026-04-24 23:37:42.188 [WARNING][5828] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="fa684ee016e7b8f4346eadb33b70fe00c046fd186c92942b1887a6a0d224c38c" HandleID="k8s-pod-network.fa684ee016e7b8f4346eadb33b70fe00c046fd186c92942b1887a6a0d224c38c" Workload="ci--4081--3--6--n--6f01bfed3c-k8s-coredns--7d764666f9--rvpdv-eth0" Apr 24 23:37:42.192971 containerd[1519]: 2026-04-24 23:37:42.188 [INFO][5828] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="fa684ee016e7b8f4346eadb33b70fe00c046fd186c92942b1887a6a0d224c38c" HandleID="k8s-pod-network.fa684ee016e7b8f4346eadb33b70fe00c046fd186c92942b1887a6a0d224c38c" Workload="ci--4081--3--6--n--6f01bfed3c-k8s-coredns--7d764666f9--rvpdv-eth0" Apr 24 23:37:42.192971 containerd[1519]: 2026-04-24 23:37:42.189 [INFO][5828] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 24 23:37:42.192971 containerd[1519]: 2026-04-24 23:37:42.191 [INFO][5820] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="fa684ee016e7b8f4346eadb33b70fe00c046fd186c92942b1887a6a0d224c38c" Apr 24 23:37:42.193387 containerd[1519]: time="2026-04-24T23:37:42.193355300Z" level=info msg="TearDown network for sandbox \"fa684ee016e7b8f4346eadb33b70fe00c046fd186c92942b1887a6a0d224c38c\" successfully" Apr 24 23:37:42.196887 containerd[1519]: time="2026-04-24T23:37:42.196854779Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"fa684ee016e7b8f4346eadb33b70fe00c046fd186c92942b1887a6a0d224c38c\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Apr 24 23:37:42.196922 containerd[1519]: time="2026-04-24T23:37:42.196900757Z" level=info msg="RemovePodSandbox \"fa684ee016e7b8f4346eadb33b70fe00c046fd186c92942b1887a6a0d224c38c\" returns successfully" Apr 24 23:37:42.197401 containerd[1519]: time="2026-04-24T23:37:42.197371229Z" level=info msg="StopPodSandbox for \"5f8e54c4b6682d32363b4f32066817a94de3ced48e2849c1757984fba84e1469\"" Apr 24 23:37:42.251219 containerd[1519]: 2026-04-24 23:37:42.222 [WARNING][5843] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="5f8e54c4b6682d32363b4f32066817a94de3ced48e2849c1757984fba84e1469" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--6f01bfed3c-k8s-coredns--7d764666f9--dr55n-eth0", GenerateName:"coredns-7d764666f9-", Namespace:"kube-system", SelfLink:"", UID:"9e14ced5-8d3f-4116-8156-faf06229fa0d", ResourceVersion:"957", Generation:0, CreationTimestamp:time.Date(2026, time.April, 24, 23, 36, 48, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7d764666f9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-6f01bfed3c", ContainerID:"bb8279ef9f632eaf74b26a7aa1f295f21002bb30f9e96dec6b0418e27edaf079", Pod:"coredns-7d764666f9-dr55n", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.35.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calif0e53319469", MAC:"", 
Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 24 23:37:42.251219 containerd[1519]: 2026-04-24 23:37:42.222 [INFO][5843] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="5f8e54c4b6682d32363b4f32066817a94de3ced48e2849c1757984fba84e1469" Apr 24 23:37:42.251219 containerd[1519]: 2026-04-24 23:37:42.222 [INFO][5843] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="5f8e54c4b6682d32363b4f32066817a94de3ced48e2849c1757984fba84e1469" iface="eth0" netns="" Apr 24 23:37:42.251219 containerd[1519]: 2026-04-24 23:37:42.222 [INFO][5843] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="5f8e54c4b6682d32363b4f32066817a94de3ced48e2849c1757984fba84e1469" Apr 24 23:37:42.251219 containerd[1519]: 2026-04-24 23:37:42.222 [INFO][5843] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="5f8e54c4b6682d32363b4f32066817a94de3ced48e2849c1757984fba84e1469" Apr 24 23:37:42.251219 containerd[1519]: 2026-04-24 23:37:42.241 [INFO][5851] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="5f8e54c4b6682d32363b4f32066817a94de3ced48e2849c1757984fba84e1469" HandleID="k8s-pod-network.5f8e54c4b6682d32363b4f32066817a94de3ced48e2849c1757984fba84e1469" Workload="ci--4081--3--6--n--6f01bfed3c-k8s-coredns--7d764666f9--dr55n-eth0" Apr 24 23:37:42.251219 containerd[1519]: 2026-04-24 23:37:42.241 [INFO][5851] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 24 23:37:42.251219 containerd[1519]: 2026-04-24 23:37:42.241 [INFO][5851] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 24 23:37:42.251219 containerd[1519]: 2026-04-24 23:37:42.246 [WARNING][5851] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="5f8e54c4b6682d32363b4f32066817a94de3ced48e2849c1757984fba84e1469" HandleID="k8s-pod-network.5f8e54c4b6682d32363b4f32066817a94de3ced48e2849c1757984fba84e1469" Workload="ci--4081--3--6--n--6f01bfed3c-k8s-coredns--7d764666f9--dr55n-eth0" Apr 24 23:37:42.251219 containerd[1519]: 2026-04-24 23:37:42.246 [INFO][5851] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="5f8e54c4b6682d32363b4f32066817a94de3ced48e2849c1757984fba84e1469" HandleID="k8s-pod-network.5f8e54c4b6682d32363b4f32066817a94de3ced48e2849c1757984fba84e1469" Workload="ci--4081--3--6--n--6f01bfed3c-k8s-coredns--7d764666f9--dr55n-eth0" Apr 24 23:37:42.251219 containerd[1519]: 2026-04-24 23:37:42.247 [INFO][5851] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 24 23:37:42.251219 containerd[1519]: 2026-04-24 23:37:42.249 [INFO][5843] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="5f8e54c4b6682d32363b4f32066817a94de3ced48e2849c1757984fba84e1469" Apr 24 23:37:42.251600 containerd[1519]: time="2026-04-24T23:37:42.251263919Z" level=info msg="TearDown network for sandbox \"5f8e54c4b6682d32363b4f32066817a94de3ced48e2849c1757984fba84e1469\" successfully" Apr 24 23:37:42.251600 containerd[1519]: time="2026-04-24T23:37:42.251285308Z" level=info msg="StopPodSandbox for \"5f8e54c4b6682d32363b4f32066817a94de3ced48e2849c1757984fba84e1469\" returns successfully" Apr 24 23:37:42.251696 containerd[1519]: time="2026-04-24T23:37:42.251674907Z" level=info msg="RemovePodSandbox for \"5f8e54c4b6682d32363b4f32066817a94de3ced48e2849c1757984fba84e1469\"" Apr 24 23:37:42.251715 containerd[1519]: time="2026-04-24T23:37:42.251697106Z" level=info msg="Forcibly stopping sandbox \"5f8e54c4b6682d32363b4f32066817a94de3ced48e2849c1757984fba84e1469\"" Apr 24 23:37:42.303628 containerd[1519]: 2026-04-24 23:37:42.278 [WARNING][5865] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="5f8e54c4b6682d32363b4f32066817a94de3ced48e2849c1757984fba84e1469" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--6f01bfed3c-k8s-coredns--7d764666f9--dr55n-eth0", GenerateName:"coredns-7d764666f9-", Namespace:"kube-system", SelfLink:"", UID:"9e14ced5-8d3f-4116-8156-faf06229fa0d", ResourceVersion:"957", Generation:0, CreationTimestamp:time.Date(2026, time.April, 24, 23, 36, 48, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7d764666f9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-6f01bfed3c", ContainerID:"bb8279ef9f632eaf74b26a7aa1f295f21002bb30f9e96dec6b0418e27edaf079", Pod:"coredns-7d764666f9-dr55n", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.35.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calif0e53319469", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, 
HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 24 23:37:42.303628 containerd[1519]: 2026-04-24 23:37:42.278 [INFO][5865] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="5f8e54c4b6682d32363b4f32066817a94de3ced48e2849c1757984fba84e1469" Apr 24 23:37:42.303628 containerd[1519]: 2026-04-24 23:37:42.278 [INFO][5865] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="5f8e54c4b6682d32363b4f32066817a94de3ced48e2849c1757984fba84e1469" iface="eth0" netns="" Apr 24 23:37:42.303628 containerd[1519]: 2026-04-24 23:37:42.278 [INFO][5865] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="5f8e54c4b6682d32363b4f32066817a94de3ced48e2849c1757984fba84e1469" Apr 24 23:37:42.303628 containerd[1519]: 2026-04-24 23:37:42.278 [INFO][5865] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="5f8e54c4b6682d32363b4f32066817a94de3ced48e2849c1757984fba84e1469" Apr 24 23:37:42.303628 containerd[1519]: 2026-04-24 23:37:42.292 [INFO][5873] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="5f8e54c4b6682d32363b4f32066817a94de3ced48e2849c1757984fba84e1469" HandleID="k8s-pod-network.5f8e54c4b6682d32363b4f32066817a94de3ced48e2849c1757984fba84e1469" Workload="ci--4081--3--6--n--6f01bfed3c-k8s-coredns--7d764666f9--dr55n-eth0" Apr 24 23:37:42.303628 containerd[1519]: 2026-04-24 23:37:42.293 [INFO][5873] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 24 23:37:42.303628 containerd[1519]: 2026-04-24 23:37:42.293 [INFO][5873] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 24 23:37:42.303628 containerd[1519]: 2026-04-24 23:37:42.298 [WARNING][5873] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="5f8e54c4b6682d32363b4f32066817a94de3ced48e2849c1757984fba84e1469" HandleID="k8s-pod-network.5f8e54c4b6682d32363b4f32066817a94de3ced48e2849c1757984fba84e1469" Workload="ci--4081--3--6--n--6f01bfed3c-k8s-coredns--7d764666f9--dr55n-eth0" Apr 24 23:37:42.303628 containerd[1519]: 2026-04-24 23:37:42.298 [INFO][5873] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="5f8e54c4b6682d32363b4f32066817a94de3ced48e2849c1757984fba84e1469" HandleID="k8s-pod-network.5f8e54c4b6682d32363b4f32066817a94de3ced48e2849c1757984fba84e1469" Workload="ci--4081--3--6--n--6f01bfed3c-k8s-coredns--7d764666f9--dr55n-eth0" Apr 24 23:37:42.303628 containerd[1519]: 2026-04-24 23:37:42.300 [INFO][5873] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 24 23:37:42.303628 containerd[1519]: 2026-04-24 23:37:42.301 [INFO][5865] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="5f8e54c4b6682d32363b4f32066817a94de3ced48e2849c1757984fba84e1469" Apr 24 23:37:42.304610 containerd[1519]: time="2026-04-24T23:37:42.303698637Z" level=info msg="TearDown network for sandbox \"5f8e54c4b6682d32363b4f32066817a94de3ced48e2849c1757984fba84e1469\" successfully" Apr 24 23:37:42.307353 containerd[1519]: time="2026-04-24T23:37:42.307318172Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"5f8e54c4b6682d32363b4f32066817a94de3ced48e2849c1757984fba84e1469\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Apr 24 23:37:42.307394 containerd[1519]: time="2026-04-24T23:37:42.307363560Z" level=info msg="RemovePodSandbox \"5f8e54c4b6682d32363b4f32066817a94de3ced48e2849c1757984fba84e1469\" returns successfully" Apr 24 23:37:42.307766 containerd[1519]: time="2026-04-24T23:37:42.307729242Z" level=info msg="StopPodSandbox for \"e04ce8dbea7e66e205736c48b37a09b9550778b585d23e4a8f841cc6c602744b\"" Apr 24 23:37:42.361736 containerd[1519]: 2026-04-24 23:37:42.337 [WARNING][5888] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="e04ce8dbea7e66e205736c48b37a09b9550778b585d23e4a8f841cc6c602744b" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--6f01bfed3c-k8s-calico--apiserver--b8b79f554--9kfjb-eth0", GenerateName:"calico-apiserver-b8b79f554-", Namespace:"calico-system", SelfLink:"", UID:"31b59c84-3007-40dc-82d8-2444bb185840", ResourceVersion:"1062", Generation:0, CreationTimestamp:time.Date(2026, time.April, 24, 23, 36, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"b8b79f554", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-6f01bfed3c", ContainerID:"d47185e88446e6d54550302d8610a3b3b106b6ac5353fad3cc18276ea9c746d4", Pod:"calico-apiserver-b8b79f554-9kfjb", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.35.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", 
Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"calib3bbacbc562", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 24 23:37:42.361736 containerd[1519]: 2026-04-24 23:37:42.337 [INFO][5888] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="e04ce8dbea7e66e205736c48b37a09b9550778b585d23e4a8f841cc6c602744b" Apr 24 23:37:42.361736 containerd[1519]: 2026-04-24 23:37:42.337 [INFO][5888] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="e04ce8dbea7e66e205736c48b37a09b9550778b585d23e4a8f841cc6c602744b" iface="eth0" netns="" Apr 24 23:37:42.361736 containerd[1519]: 2026-04-24 23:37:42.337 [INFO][5888] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="e04ce8dbea7e66e205736c48b37a09b9550778b585d23e4a8f841cc6c602744b" Apr 24 23:37:42.361736 containerd[1519]: 2026-04-24 23:37:42.337 [INFO][5888] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="e04ce8dbea7e66e205736c48b37a09b9550778b585d23e4a8f841cc6c602744b" Apr 24 23:37:42.361736 containerd[1519]: 2026-04-24 23:37:42.352 [INFO][5896] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="e04ce8dbea7e66e205736c48b37a09b9550778b585d23e4a8f841cc6c602744b" HandleID="k8s-pod-network.e04ce8dbea7e66e205736c48b37a09b9550778b585d23e4a8f841cc6c602744b" Workload="ci--4081--3--6--n--6f01bfed3c-k8s-calico--apiserver--b8b79f554--9kfjb-eth0" Apr 24 23:37:42.361736 containerd[1519]: 2026-04-24 23:37:42.352 [INFO][5896] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 24 23:37:42.361736 containerd[1519]: 2026-04-24 23:37:42.352 [INFO][5896] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 24 23:37:42.361736 containerd[1519]: 2026-04-24 23:37:42.356 [WARNING][5896] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="e04ce8dbea7e66e205736c48b37a09b9550778b585d23e4a8f841cc6c602744b" HandleID="k8s-pod-network.e04ce8dbea7e66e205736c48b37a09b9550778b585d23e4a8f841cc6c602744b" Workload="ci--4081--3--6--n--6f01bfed3c-k8s-calico--apiserver--b8b79f554--9kfjb-eth0" Apr 24 23:37:42.361736 containerd[1519]: 2026-04-24 23:37:42.356 [INFO][5896] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="e04ce8dbea7e66e205736c48b37a09b9550778b585d23e4a8f841cc6c602744b" HandleID="k8s-pod-network.e04ce8dbea7e66e205736c48b37a09b9550778b585d23e4a8f841cc6c602744b" Workload="ci--4081--3--6--n--6f01bfed3c-k8s-calico--apiserver--b8b79f554--9kfjb-eth0" Apr 24 23:37:42.361736 containerd[1519]: 2026-04-24 23:37:42.357 [INFO][5896] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 24 23:37:42.361736 containerd[1519]: 2026-04-24 23:37:42.359 [INFO][5888] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="e04ce8dbea7e66e205736c48b37a09b9550778b585d23e4a8f841cc6c602744b" Apr 24 23:37:42.362183 containerd[1519]: time="2026-04-24T23:37:42.361787245Z" level=info msg="TearDown network for sandbox \"e04ce8dbea7e66e205736c48b37a09b9550778b585d23e4a8f841cc6c602744b\" successfully" Apr 24 23:37:42.362183 containerd[1519]: time="2026-04-24T23:37:42.361814276Z" level=info msg="StopPodSandbox for \"e04ce8dbea7e66e205736c48b37a09b9550778b585d23e4a8f841cc6c602744b\" returns successfully" Apr 24 23:37:42.362347 containerd[1519]: time="2026-04-24T23:37:42.362305095Z" level=info msg="RemovePodSandbox for \"e04ce8dbea7e66e205736c48b37a09b9550778b585d23e4a8f841cc6c602744b\"" Apr 24 23:37:42.362347 containerd[1519]: time="2026-04-24T23:37:42.362336867Z" level=info msg="Forcibly stopping sandbox \"e04ce8dbea7e66e205736c48b37a09b9550778b585d23e4a8f841cc6c602744b\"" Apr 24 23:37:42.413353 containerd[1519]: 2026-04-24 23:37:42.387 [WARNING][5910] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="e04ce8dbea7e66e205736c48b37a09b9550778b585d23e4a8f841cc6c602744b" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--6f01bfed3c-k8s-calico--apiserver--b8b79f554--9kfjb-eth0", GenerateName:"calico-apiserver-b8b79f554-", Namespace:"calico-system", SelfLink:"", UID:"31b59c84-3007-40dc-82d8-2444bb185840", ResourceVersion:"1062", Generation:0, CreationTimestamp:time.Date(2026, time.April, 24, 23, 36, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"b8b79f554", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-6f01bfed3c", ContainerID:"d47185e88446e6d54550302d8610a3b3b106b6ac5353fad3cc18276ea9c746d4", Pod:"calico-apiserver-b8b79f554-9kfjb", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.35.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"calib3bbacbc562", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 24 23:37:42.413353 containerd[1519]: 2026-04-24 23:37:42.387 [INFO][5910] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="e04ce8dbea7e66e205736c48b37a09b9550778b585d23e4a8f841cc6c602744b" Apr 24 23:37:42.413353 containerd[1519]: 2026-04-24 23:37:42.387 [INFO][5910] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called 
with no netns name, ignoring. ContainerID="e04ce8dbea7e66e205736c48b37a09b9550778b585d23e4a8f841cc6c602744b" iface="eth0" netns="" Apr 24 23:37:42.413353 containerd[1519]: 2026-04-24 23:37:42.387 [INFO][5910] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="e04ce8dbea7e66e205736c48b37a09b9550778b585d23e4a8f841cc6c602744b" Apr 24 23:37:42.413353 containerd[1519]: 2026-04-24 23:37:42.387 [INFO][5910] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="e04ce8dbea7e66e205736c48b37a09b9550778b585d23e4a8f841cc6c602744b" Apr 24 23:37:42.413353 containerd[1519]: 2026-04-24 23:37:42.404 [INFO][5917] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="e04ce8dbea7e66e205736c48b37a09b9550778b585d23e4a8f841cc6c602744b" HandleID="k8s-pod-network.e04ce8dbea7e66e205736c48b37a09b9550778b585d23e4a8f841cc6c602744b" Workload="ci--4081--3--6--n--6f01bfed3c-k8s-calico--apiserver--b8b79f554--9kfjb-eth0" Apr 24 23:37:42.413353 containerd[1519]: 2026-04-24 23:37:42.404 [INFO][5917] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 24 23:37:42.413353 containerd[1519]: 2026-04-24 23:37:42.404 [INFO][5917] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 24 23:37:42.413353 containerd[1519]: 2026-04-24 23:37:42.408 [WARNING][5917] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="e04ce8dbea7e66e205736c48b37a09b9550778b585d23e4a8f841cc6c602744b" HandleID="k8s-pod-network.e04ce8dbea7e66e205736c48b37a09b9550778b585d23e4a8f841cc6c602744b" Workload="ci--4081--3--6--n--6f01bfed3c-k8s-calico--apiserver--b8b79f554--9kfjb-eth0" Apr 24 23:37:42.413353 containerd[1519]: 2026-04-24 23:37:42.408 [INFO][5917] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="e04ce8dbea7e66e205736c48b37a09b9550778b585d23e4a8f841cc6c602744b" HandleID="k8s-pod-network.e04ce8dbea7e66e205736c48b37a09b9550778b585d23e4a8f841cc6c602744b" Workload="ci--4081--3--6--n--6f01bfed3c-k8s-calico--apiserver--b8b79f554--9kfjb-eth0" Apr 24 23:37:42.413353 containerd[1519]: 2026-04-24 23:37:42.409 [INFO][5917] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 24 23:37:42.413353 containerd[1519]: 2026-04-24 23:37:42.411 [INFO][5910] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="e04ce8dbea7e66e205736c48b37a09b9550778b585d23e4a8f841cc6c602744b" Apr 24 23:37:42.413688 containerd[1519]: time="2026-04-24T23:37:42.413384381Z" level=info msg="TearDown network for sandbox \"e04ce8dbea7e66e205736c48b37a09b9550778b585d23e4a8f841cc6c602744b\" successfully" Apr 24 23:37:42.416758 containerd[1519]: time="2026-04-24T23:37:42.416730310Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"e04ce8dbea7e66e205736c48b37a09b9550778b585d23e4a8f841cc6c602744b\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Apr 24 23:37:42.416833 containerd[1519]: time="2026-04-24T23:37:42.416778158Z" level=info msg="RemovePodSandbox \"e04ce8dbea7e66e205736c48b37a09b9550778b585d23e4a8f841cc6c602744b\" returns successfully" Apr 24 23:38:18.734881 systemd[1]: Started sshd@7-65.21.181.31:22-4.175.71.9:53922.service - OpenSSH per-connection server daemon (4.175.71.9:53922). 
Apr 24 23:38:18.946475 sshd[6085]: Accepted publickey for core from 4.175.71.9 port 53922 ssh2: RSA SHA256:/LB5UM8JE+Gm8PLCmanmk+IzzQFWk//dmRsy5hU4ZbM Apr 24 23:38:18.948267 sshd[6085]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 24 23:38:18.952339 systemd-logind[1501]: New session 8 of user core. Apr 24 23:38:18.957141 systemd[1]: Started session-8.scope - Session 8 of User core. Apr 24 23:38:19.205070 sshd[6085]: pam_unix(sshd:session): session closed for user core Apr 24 23:38:19.208852 systemd-logind[1501]: Session 8 logged out. Waiting for processes to exit. Apr 24 23:38:19.212846 systemd[1]: sshd@7-65.21.181.31:22-4.175.71.9:53922.service: Deactivated successfully. Apr 24 23:38:19.214230 systemd[1]: session-8.scope: Deactivated successfully. Apr 24 23:38:19.216159 systemd-logind[1501]: Removed session 8. Apr 24 23:38:24.254877 systemd[1]: Started sshd@8-65.21.181.31:22-4.175.71.9:53926.service - OpenSSH per-connection server daemon (4.175.71.9:53926). Apr 24 23:38:24.481967 sshd[6102]: Accepted publickey for core from 4.175.71.9 port 53926 ssh2: RSA SHA256:/LB5UM8JE+Gm8PLCmanmk+IzzQFWk//dmRsy5hU4ZbM Apr 24 23:38:24.485032 sshd[6102]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 24 23:38:24.493447 systemd-logind[1501]: New session 9 of user core. Apr 24 23:38:24.503661 systemd[1]: Started session-9.scope - Session 9 of User core. Apr 24 23:38:24.753920 sshd[6102]: pam_unix(sshd:session): session closed for user core Apr 24 23:38:24.758342 systemd-logind[1501]: Session 9 logged out. Waiting for processes to exit. Apr 24 23:38:24.759171 systemd[1]: sshd@8-65.21.181.31:22-4.175.71.9:53926.service: Deactivated successfully. Apr 24 23:38:24.761559 systemd[1]: session-9.scope: Deactivated successfully. Apr 24 23:38:24.763505 systemd-logind[1501]: Removed session 9. 
Apr 24 23:38:29.798905 systemd[1]: Started sshd@9-65.21.181.31:22-4.175.71.9:36716.service - OpenSSH per-connection server daemon (4.175.71.9:36716). Apr 24 23:38:30.005458 sshd[6118]: Accepted publickey for core from 4.175.71.9 port 36716 ssh2: RSA SHA256:/LB5UM8JE+Gm8PLCmanmk+IzzQFWk//dmRsy5hU4ZbM Apr 24 23:38:30.007765 sshd[6118]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 24 23:38:30.014985 systemd-logind[1501]: New session 10 of user core. Apr 24 23:38:30.018743 systemd[1]: Started session-10.scope - Session 10 of User core. Apr 24 23:38:30.274340 sshd[6118]: pam_unix(sshd:session): session closed for user core Apr 24 23:38:30.279054 systemd[1]: sshd@9-65.21.181.31:22-4.175.71.9:36716.service: Deactivated successfully. Apr 24 23:38:30.282975 systemd[1]: session-10.scope: Deactivated successfully. Apr 24 23:38:30.285036 systemd-logind[1501]: Session 10 logged out. Waiting for processes to exit. Apr 24 23:38:30.286629 systemd-logind[1501]: Removed session 10. Apr 24 23:38:35.323790 systemd[1]: Started sshd@10-65.21.181.31:22-4.175.71.9:53002.service - OpenSSH per-connection server daemon (4.175.71.9:53002). Apr 24 23:38:35.542884 sshd[6201]: Accepted publickey for core from 4.175.71.9 port 53002 ssh2: RSA SHA256:/LB5UM8JE+Gm8PLCmanmk+IzzQFWk//dmRsy5hU4ZbM Apr 24 23:38:35.545511 sshd[6201]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 24 23:38:35.550667 systemd-logind[1501]: New session 11 of user core. Apr 24 23:38:35.557635 systemd[1]: Started session-11.scope - Session 11 of User core. Apr 24 23:38:35.759023 sshd[6201]: pam_unix(sshd:session): session closed for user core Apr 24 23:38:35.763300 systemd[1]: sshd@10-65.21.181.31:22-4.175.71.9:53002.service: Deactivated successfully. Apr 24 23:38:35.765171 systemd[1]: session-11.scope: Deactivated successfully. Apr 24 23:38:35.768142 systemd-logind[1501]: Session 11 logged out. Waiting for processes to exit. 
Apr 24 23:38:35.769560 systemd-logind[1501]: Removed session 11. Apr 24 23:38:35.799535 systemd[1]: Started sshd@11-65.21.181.31:22-4.175.71.9:53016.service - OpenSSH per-connection server daemon (4.175.71.9:53016). Apr 24 23:38:36.014095 sshd[6215]: Accepted publickey for core from 4.175.71.9 port 53016 ssh2: RSA SHA256:/LB5UM8JE+Gm8PLCmanmk+IzzQFWk//dmRsy5hU4ZbM Apr 24 23:38:36.015270 sshd[6215]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 24 23:38:36.019348 systemd-logind[1501]: New session 12 of user core. Apr 24 23:38:36.024540 systemd[1]: Started session-12.scope - Session 12 of User core. Apr 24 23:38:36.279508 sshd[6215]: pam_unix(sshd:session): session closed for user core Apr 24 23:38:36.283916 systemd[1]: sshd@11-65.21.181.31:22-4.175.71.9:53016.service: Deactivated successfully. Apr 24 23:38:36.286725 systemd[1]: session-12.scope: Deactivated successfully. Apr 24 23:38:36.288956 systemd-logind[1501]: Session 12 logged out. Waiting for processes to exit. Apr 24 23:38:36.290535 systemd-logind[1501]: Removed session 12. Apr 24 23:38:36.317840 systemd[1]: Started sshd@12-65.21.181.31:22-4.175.71.9:53022.service - OpenSSH per-connection server daemon (4.175.71.9:53022). Apr 24 23:38:36.533489 sshd[6226]: Accepted publickey for core from 4.175.71.9 port 53022 ssh2: RSA SHA256:/LB5UM8JE+Gm8PLCmanmk+IzzQFWk//dmRsy5hU4ZbM Apr 24 23:38:36.536633 sshd[6226]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 24 23:38:36.544401 systemd-logind[1501]: New session 13 of user core. Apr 24 23:38:36.554576 systemd[1]: Started session-13.scope - Session 13 of User core. Apr 24 23:38:36.786043 sshd[6226]: pam_unix(sshd:session): session closed for user core Apr 24 23:38:36.790125 systemd-logind[1501]: Session 13 logged out. Waiting for processes to exit. Apr 24 23:38:36.790653 systemd[1]: sshd@12-65.21.181.31:22-4.175.71.9:53022.service: Deactivated successfully. 
Apr 24 23:38:36.793877 systemd[1]: session-13.scope: Deactivated successfully. Apr 24 23:38:36.795017 systemd-logind[1501]: Removed session 13. Apr 24 23:38:41.840969 systemd[1]: Started sshd@13-65.21.181.31:22-4.175.71.9:53026.service - OpenSSH per-connection server daemon (4.175.71.9:53026). Apr 24 23:38:42.071913 sshd[6241]: Accepted publickey for core from 4.175.71.9 port 53026 ssh2: RSA SHA256:/LB5UM8JE+Gm8PLCmanmk+IzzQFWk//dmRsy5hU4ZbM Apr 24 23:38:42.074090 sshd[6241]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 24 23:38:42.080009 systemd-logind[1501]: New session 14 of user core. Apr 24 23:38:42.087819 systemd[1]: Started session-14.scope - Session 14 of User core. Apr 24 23:38:42.327097 sshd[6241]: pam_unix(sshd:session): session closed for user core Apr 24 23:38:42.331172 systemd-logind[1501]: Session 14 logged out. Waiting for processes to exit. Apr 24 23:38:42.331586 systemd[1]: sshd@13-65.21.181.31:22-4.175.71.9:53026.service: Deactivated successfully. Apr 24 23:38:42.334319 systemd[1]: session-14.scope: Deactivated successfully. Apr 24 23:38:42.335180 systemd-logind[1501]: Removed session 14. Apr 24 23:38:47.368068 systemd[1]: Started sshd@14-65.21.181.31:22-4.175.71.9:34232.service - OpenSSH per-connection server daemon (4.175.71.9:34232). Apr 24 23:38:47.594311 sshd[6280]: Accepted publickey for core from 4.175.71.9 port 34232 ssh2: RSA SHA256:/LB5UM8JE+Gm8PLCmanmk+IzzQFWk//dmRsy5hU4ZbM Apr 24 23:38:47.596630 sshd[6280]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 24 23:38:47.602567 systemd-logind[1501]: New session 15 of user core. Apr 24 23:38:47.609668 systemd[1]: Started session-15.scope - Session 15 of User core. Apr 24 23:38:47.806849 sshd[6280]: pam_unix(sshd:session): session closed for user core Apr 24 23:38:47.810861 systemd[1]: sshd@14-65.21.181.31:22-4.175.71.9:34232.service: Deactivated successfully. 
Apr 24 23:38:47.812888 systemd[1]: session-15.scope: Deactivated successfully. Apr 24 23:38:47.814147 systemd-logind[1501]: Session 15 logged out. Waiting for processes to exit. Apr 24 23:38:47.815261 systemd-logind[1501]: Removed session 15. Apr 24 23:38:47.857831 systemd[1]: Started sshd@15-65.21.181.31:22-4.175.71.9:34240.service - OpenSSH per-connection server daemon (4.175.71.9:34240). Apr 24 23:38:48.064088 sshd[6293]: Accepted publickey for core from 4.175.71.9 port 34240 ssh2: RSA SHA256:/LB5UM8JE+Gm8PLCmanmk+IzzQFWk//dmRsy5hU4ZbM Apr 24 23:38:48.066083 sshd[6293]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 24 23:38:48.071517 systemd-logind[1501]: New session 16 of user core. Apr 24 23:38:48.081558 systemd[1]: Started session-16.scope - Session 16 of User core. Apr 24 23:38:48.518536 sshd[6293]: pam_unix(sshd:session): session closed for user core Apr 24 23:38:48.521161 systemd[1]: sshd@15-65.21.181.31:22-4.175.71.9:34240.service: Deactivated successfully. Apr 24 23:38:48.524007 systemd[1]: session-16.scope: Deactivated successfully. Apr 24 23:38:48.525337 systemd-logind[1501]: Session 16 logged out. Waiting for processes to exit. Apr 24 23:38:48.526398 systemd-logind[1501]: Removed session 16. Apr 24 23:38:48.574771 systemd[1]: Started sshd@16-65.21.181.31:22-4.175.71.9:34252.service - OpenSSH per-connection server daemon (4.175.71.9:34252). Apr 24 23:38:48.796073 sshd[6335]: Accepted publickey for core from 4.175.71.9 port 34252 ssh2: RSA SHA256:/LB5UM8JE+Gm8PLCmanmk+IzzQFWk//dmRsy5hU4ZbM Apr 24 23:38:48.799214 sshd[6335]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 24 23:38:48.807886 systemd-logind[1501]: New session 17 of user core. Apr 24 23:38:48.813714 systemd[1]: Started session-17.scope - Session 17 of User core. 
Apr 24 23:38:49.420026 sshd[6335]: pam_unix(sshd:session): session closed for user core Apr 24 23:38:49.426974 systemd[1]: sshd@16-65.21.181.31:22-4.175.71.9:34252.service: Deactivated successfully. Apr 24 23:38:49.431092 systemd[1]: session-17.scope: Deactivated successfully. Apr 24 23:38:49.432859 systemd-logind[1501]: Session 17 logged out. Waiting for processes to exit. Apr 24 23:38:49.433715 systemd-logind[1501]: Removed session 17. Apr 24 23:38:49.457145 systemd[1]: Started sshd@17-65.21.181.31:22-4.175.71.9:34266.service - OpenSSH per-connection server daemon (4.175.71.9:34266). Apr 24 23:38:49.664450 sshd[6368]: Accepted publickey for core from 4.175.71.9 port 34266 ssh2: RSA SHA256:/LB5UM8JE+Gm8PLCmanmk+IzzQFWk//dmRsy5hU4ZbM Apr 24 23:38:49.667468 sshd[6368]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 24 23:38:49.675728 systemd-logind[1501]: New session 18 of user core. Apr 24 23:38:49.682712 systemd[1]: Started session-18.scope - Session 18 of User core. Apr 24 23:38:49.999316 sshd[6368]: pam_unix(sshd:session): session closed for user core Apr 24 23:38:50.002527 systemd[1]: sshd@17-65.21.181.31:22-4.175.71.9:34266.service: Deactivated successfully. Apr 24 23:38:50.004393 systemd[1]: session-18.scope: Deactivated successfully. Apr 24 23:38:50.005946 systemd-logind[1501]: Session 18 logged out. Waiting for processes to exit. Apr 24 23:38:50.006956 systemd-logind[1501]: Removed session 18. Apr 24 23:38:50.047618 systemd[1]: Started sshd@18-65.21.181.31:22-4.175.71.9:34280.service - OpenSSH per-connection server daemon (4.175.71.9:34280). Apr 24 23:38:50.250643 sshd[6379]: Accepted publickey for core from 4.175.71.9 port 34280 ssh2: RSA SHA256:/LB5UM8JE+Gm8PLCmanmk+IzzQFWk//dmRsy5hU4ZbM Apr 24 23:38:50.253968 sshd[6379]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 24 23:38:50.264909 systemd-logind[1501]: New session 19 of user core. 
Apr 24 23:38:50.272664 systemd[1]: Started session-19.scope - Session 19 of User core. Apr 24 23:38:50.487657 sshd[6379]: pam_unix(sshd:session): session closed for user core Apr 24 23:38:50.493779 systemd-logind[1501]: Session 19 logged out. Waiting for processes to exit. Apr 24 23:38:50.494318 systemd[1]: sshd@18-65.21.181.31:22-4.175.71.9:34280.service: Deactivated successfully. Apr 24 23:38:50.497285 systemd[1]: session-19.scope: Deactivated successfully. Apr 24 23:38:50.502642 systemd-logind[1501]: Removed session 19. Apr 24 23:38:55.540525 systemd[1]: Started sshd@19-65.21.181.31:22-4.175.71.9:52146.service - OpenSSH per-connection server daemon (4.175.71.9:52146). Apr 24 23:38:55.749210 sshd[6397]: Accepted publickey for core from 4.175.71.9 port 52146 ssh2: RSA SHA256:/LB5UM8JE+Gm8PLCmanmk+IzzQFWk//dmRsy5hU4ZbM Apr 24 23:38:55.750833 sshd[6397]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 24 23:38:55.759003 systemd-logind[1501]: New session 20 of user core. Apr 24 23:38:55.762680 systemd[1]: Started session-20.scope - Session 20 of User core. Apr 24 23:38:55.984366 sshd[6397]: pam_unix(sshd:session): session closed for user core Apr 24 23:38:55.988628 systemd-logind[1501]: Session 20 logged out. Waiting for processes to exit. Apr 24 23:38:55.989654 systemd[1]: sshd@19-65.21.181.31:22-4.175.71.9:52146.service: Deactivated successfully. Apr 24 23:38:55.991986 systemd[1]: session-20.scope: Deactivated successfully. Apr 24 23:38:55.992892 systemd-logind[1501]: Removed session 20. Apr 24 23:39:01.035894 systemd[1]: Started sshd@20-65.21.181.31:22-4.175.71.9:52158.service - OpenSSH per-connection server daemon (4.175.71.9:52158). 
Apr 24 23:39:01.261069 sshd[6410]: Accepted publickey for core from 4.175.71.9 port 52158 ssh2: RSA SHA256:/LB5UM8JE+Gm8PLCmanmk+IzzQFWk//dmRsy5hU4ZbM Apr 24 23:39:01.264803 sshd[6410]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 24 23:39:01.278117 systemd-logind[1501]: New session 21 of user core. Apr 24 23:39:01.283669 systemd[1]: Started session-21.scope - Session 21 of User core. Apr 24 23:39:01.490853 sshd[6410]: pam_unix(sshd:session): session closed for user core Apr 24 23:39:01.495759 systemd[1]: sshd@20-65.21.181.31:22-4.175.71.9:52158.service: Deactivated successfully. Apr 24 23:39:01.497775 systemd[1]: session-21.scope: Deactivated successfully. Apr 24 23:39:01.498747 systemd-logind[1501]: Session 21 logged out. Waiting for processes to exit. Apr 24 23:39:01.500257 systemd-logind[1501]: Removed session 21. Apr 24 23:39:09.154622 update_engine[1503]: I20260424 23:39:09.154523 1503 prefs.cc:52] certificate-report-to-send-update not present in /var/lib/update_engine/prefs Apr 24 23:39:09.154622 update_engine[1503]: I20260424 23:39:09.154610 1503 prefs.cc:52] certificate-report-to-send-download not present in /var/lib/update_engine/prefs Apr 24 23:39:09.155382 update_engine[1503]: I20260424 23:39:09.154992 1503 prefs.cc:52] aleph-version not present in /var/lib/update_engine/prefs Apr 24 23:39:09.156119 update_engine[1503]: I20260424 23:39:09.156062 1503 omaha_request_params.cc:62] Current group set to lts Apr 24 23:39:09.159826 update_engine[1503]: I20260424 23:39:09.159767 1503 update_attempter.cc:499] Already updated boot flags. Skipping. Apr 24 23:39:09.159826 update_engine[1503]: I20260424 23:39:09.159813 1503 update_attempter.cc:643] Scheduling an action processor start. 
Apr 24 23:39:09.159981 update_engine[1503]: I20260424 23:39:09.159847 1503 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction Apr 24 23:39:09.159981 update_engine[1503]: I20260424 23:39:09.159904 1503 prefs.cc:52] previous-version not present in /var/lib/update_engine/prefs Apr 24 23:39:09.160071 update_engine[1503]: I20260424 23:39:09.160053 1503 omaha_request_action.cc:271] Posting an Omaha request to disabled Apr 24 23:39:09.160126 update_engine[1503]: I20260424 23:39:09.160069 1503 omaha_request_action.cc:272] Request: Apr 24 23:39:09.160126 update_engine[1503]: Apr 24 23:39:09.160126 update_engine[1503]: Apr 24 23:39:09.160126 update_engine[1503]: Apr 24 23:39:09.160126 update_engine[1503]: Apr 24 23:39:09.160126 update_engine[1503]: Apr 24 23:39:09.160126 update_engine[1503]: Apr 24 23:39:09.160126 update_engine[1503]: Apr 24 23:39:09.160126 update_engine[1503]: Apr 24 23:39:09.160126 update_engine[1503]: I20260424 23:39:09.160085 1503 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Apr 24 23:39:09.161223 locksmithd[1533]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_CHECKING_FOR_UPDATE" NewVersion=0.0.0 NewSize=0 Apr 24 23:39:09.167298 update_engine[1503]: I20260424 23:39:09.167219 1503 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Apr 24 23:39:09.167908 update_engine[1503]: I20260424 23:39:09.167855 1503 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
Apr 24 23:39:09.169079 update_engine[1503]: E20260424 23:39:09.168910 1503 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Apr 24 23:39:09.169079 update_engine[1503]: I20260424 23:39:09.169016 1503 libcurl_http_fetcher.cc:283] No HTTP response, retry 1 Apr 24 23:39:19.156335 update_engine[1503]: I20260424 23:39:19.156142 1503 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Apr 24 23:39:19.157087 update_engine[1503]: I20260424 23:39:19.156605 1503 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Apr 24 23:39:19.157087 update_engine[1503]: I20260424 23:39:19.156968 1503 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Apr 24 23:39:19.157780 update_engine[1503]: E20260424 23:39:19.157722 1503 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Apr 24 23:39:19.157864 update_engine[1503]: I20260424 23:39:19.157811 1503 libcurl_http_fetcher.cc:283] No HTTP response, retry 2 Apr 24 23:39:29.156598 update_engine[1503]: I20260424 23:39:29.156486 1503 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Apr 24 23:39:29.157334 update_engine[1503]: I20260424 23:39:29.156898 1503 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Apr 24 23:39:29.157334 update_engine[1503]: I20260424 23:39:29.157262 1503 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
Apr 24 23:39:29.157993 update_engine[1503]: E20260424 23:39:29.157933 1503 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Apr 24 23:39:29.158108 update_engine[1503]: I20260424 23:39:29.158046 1503 libcurl_http_fetcher.cc:283] No HTTP response, retry 3 Apr 24 23:39:39.156260 update_engine[1503]: I20260424 23:39:39.156148 1503 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Apr 24 23:39:39.156880 update_engine[1503]: I20260424 23:39:39.156652 1503 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Apr 24 23:39:39.157077 update_engine[1503]: I20260424 23:39:39.157011 1503 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Apr 24 23:39:39.157786 update_engine[1503]: E20260424 23:39:39.157733 1503 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Apr 24 23:39:39.157845 update_engine[1503]: I20260424 23:39:39.157825 1503 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded Apr 24 23:39:39.157875 update_engine[1503]: I20260424 23:39:39.157845 1503 omaha_request_action.cc:617] Omaha request response: Apr 24 23:39:39.158018 update_engine[1503]: E20260424 23:39:39.157982 1503 omaha_request_action.cc:636] Omaha request network transfer failed. Apr 24 23:39:39.158049 update_engine[1503]: I20260424 23:39:39.158022 1503 action_processor.cc:68] ActionProcessor::ActionComplete: OmahaRequestAction action failed. Aborting processing. Apr 24 23:39:39.158049 update_engine[1503]: I20260424 23:39:39.158038 1503 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Apr 24 23:39:39.158101 update_engine[1503]: I20260424 23:39:39.158053 1503 update_attempter.cc:306] Processing Done. Apr 24 23:39:39.158101 update_engine[1503]: E20260424 23:39:39.158079 1503 update_attempter.cc:619] Update failed. 
Apr 24 23:39:39.158150 update_engine[1503]: I20260424 23:39:39.158094 1503 utils.cc:600] Converting error code 2000 to kActionCodeOmahaErrorInHTTPResponse Apr 24 23:39:39.158150 update_engine[1503]: I20260424 23:39:39.158109 1503 payload_state.cc:97] Updating payload state for error code: 37 (kActionCodeOmahaErrorInHTTPResponse) Apr 24 23:39:39.158150 update_engine[1503]: I20260424 23:39:39.158124 1503 payload_state.cc:103] Ignoring failures until we get a valid Omaha response. Apr 24 23:39:39.158266 update_engine[1503]: I20260424 23:39:39.158231 1503 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction Apr 24 23:39:39.158297 update_engine[1503]: I20260424 23:39:39.158273 1503 omaha_request_action.cc:271] Posting an Omaha request to disabled Apr 24 23:39:39.158324 update_engine[1503]: I20260424 23:39:39.158288 1503 omaha_request_action.cc:272] Request: Apr 24 23:39:39.158324 update_engine[1503]: Apr 24 23:39:39.158324 update_engine[1503]: Apr 24 23:39:39.158324 update_engine[1503]: Apr 24 23:39:39.158324 update_engine[1503]: Apr 24 23:39:39.158324 update_engine[1503]: Apr 24 23:39:39.158324 update_engine[1503]: Apr 24 23:39:39.158324 update_engine[1503]: I20260424 23:39:39.158304 1503 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Apr 24 23:39:39.158699 update_engine[1503]: I20260424 23:39:39.158655 1503 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Apr 24 23:39:39.158985 update_engine[1503]: I20260424 23:39:39.158943 1503 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
Apr 24 23:39:39.159490 locksmithd[1533]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_REPORTING_ERROR_EVENT" NewVersion=0.0.0 NewSize=0 Apr 24 23:39:39.159931 update_engine[1503]: E20260424 23:39:39.159858 1503 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Apr 24 23:39:39.159931 update_engine[1503]: I20260424 23:39:39.159945 1503 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded Apr 24 23:39:39.159931 update_engine[1503]: I20260424 23:39:39.159958 1503 omaha_request_action.cc:617] Omaha request response: Apr 24 23:39:39.160193 update_engine[1503]: I20260424 23:39:39.159969 1503 action_processor.cc:65] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Apr 24 23:39:39.160193 update_engine[1503]: I20260424 23:39:39.159978 1503 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Apr 24 23:39:39.160193 update_engine[1503]: I20260424 23:39:39.159988 1503 update_attempter.cc:306] Processing Done. Apr 24 23:39:39.160193 update_engine[1503]: I20260424 23:39:39.159998 1503 update_attempter.cc:310] Error event sent. Apr 24 23:39:39.160193 update_engine[1503]: I20260424 23:39:39.160012 1503 update_check_scheduler.cc:74] Next update check in 46m12s Apr 24 23:39:39.160903 locksmithd[1533]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_IDLE" NewVersion=0.0.0 NewSize=0 Apr 24 23:39:49.998245 kubelet[2565]: E0424 23:39:49.998164 2565 controller.go:251] "Failed to update lease" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:40718->10.0.0.2:2379: read: connection timed out" Apr 24 23:39:50.071516 systemd[1]: cri-containerd-182efefe1d594187d82e88a3b2a8af6b9d13f97480ae0696c2569614f57738e7.scope: Deactivated successfully. 
Apr 24 23:39:50.072791 systemd[1]: cri-containerd-182efefe1d594187d82e88a3b2a8af6b9d13f97480ae0696c2569614f57738e7.scope: Consumed 8.040s CPU time.
Apr 24 23:39:50.092709 containerd[1519]: time="2026-04-24T23:39:50.092647001Z" level=info msg="shim disconnected" id=182efefe1d594187d82e88a3b2a8af6b9d13f97480ae0696c2569614f57738e7 namespace=k8s.io
Apr 24 23:39:50.092709 containerd[1519]: time="2026-04-24T23:39:50.092702544Z" level=warning msg="cleaning up after shim disconnected" id=182efefe1d594187d82e88a3b2a8af6b9d13f97480ae0696c2569614f57738e7 namespace=k8s.io
Apr 24 23:39:50.092709 containerd[1519]: time="2026-04-24T23:39:50.092710374Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Apr 24 23:39:50.095641 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-182efefe1d594187d82e88a3b2a8af6b9d13f97480ae0696c2569614f57738e7-rootfs.mount: Deactivated successfully.
Apr 24 23:39:50.120380 kubelet[2565]: I0424 23:39:50.120356    2565 scope.go:122] "RemoveContainer" containerID="182efefe1d594187d82e88a3b2a8af6b9d13f97480ae0696c2569614f57738e7"
Apr 24 23:39:50.121885 containerd[1519]: time="2026-04-24T23:39:50.121737080Z" level=info msg="CreateContainer within sandbox \"0c87031f46f54f9786038c0b1e9492bf7e2ef5adf2e0d0fc0b80cea16e916248\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}"
Apr 24 23:39:50.132078 containerd[1519]: time="2026-04-24T23:39:50.132047728Z" level=info msg="CreateContainer within sandbox \"0c87031f46f54f9786038c0b1e9492bf7e2ef5adf2e0d0fc0b80cea16e916248\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"6f08273bf5c9b2955298e70a57830bb6626babf12976de775ddda3730e86aea1\""
Apr 24 23:39:50.132889 containerd[1519]: time="2026-04-24T23:39:50.132356764Z" level=info msg="StartContainer for \"6f08273bf5c9b2955298e70a57830bb6626babf12976de775ddda3730e86aea1\""
Apr 24 23:39:50.161001 systemd[1]: Started cri-containerd-6f08273bf5c9b2955298e70a57830bb6626babf12976de775ddda3730e86aea1.scope - libcontainer container 6f08273bf5c9b2955298e70a57830bb6626babf12976de775ddda3730e86aea1.
Apr 24 23:39:50.181284 containerd[1519]: time="2026-04-24T23:39:50.181252217Z" level=info msg="StartContainer for \"6f08273bf5c9b2955298e70a57830bb6626babf12976de775ddda3730e86aea1\" returns successfully"
Apr 24 23:39:51.091692 systemd[1]: cri-containerd-64436aef0982207cfe22331c2503cd39b4888494dee70e51a4d86e51e767dbc6.scope: Deactivated successfully.
Apr 24 23:39:51.091974 systemd[1]: cri-containerd-64436aef0982207cfe22331c2503cd39b4888494dee70e51a4d86e51e767dbc6.scope: Consumed 3.622s CPU time, 17.7M memory peak, 0B memory swap peak.
Apr 24 23:39:51.130875 containerd[1519]: time="2026-04-24T23:39:51.130761828Z" level=info msg="shim disconnected" id=64436aef0982207cfe22331c2503cd39b4888494dee70e51a4d86e51e767dbc6 namespace=k8s.io
Apr 24 23:39:51.130875 containerd[1519]: time="2026-04-24T23:39:51.130844322Z" level=warning msg="cleaning up after shim disconnected" id=64436aef0982207cfe22331c2503cd39b4888494dee70e51a4d86e51e767dbc6 namespace=k8s.io
Apr 24 23:39:51.130875 containerd[1519]: time="2026-04-24T23:39:51.130854363Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Apr 24 23:39:51.135523 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-64436aef0982207cfe22331c2503cd39b4888494dee70e51a4d86e51e767dbc6-rootfs.mount: Deactivated successfully.
Apr 24 23:39:52.128423 kubelet[2565]: I0424 23:39:52.128377    2565 scope.go:122] "RemoveContainer" containerID="64436aef0982207cfe22331c2503cd39b4888494dee70e51a4d86e51e767dbc6"
Apr 24 23:39:52.130390 containerd[1519]: time="2026-04-24T23:39:52.130113847Z" level=info msg="CreateContainer within sandbox \"12972b834c644ffc4dfeccdd68eb94f265be63636e5267742a80f25544890ee2\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:1,}"
Apr 24 23:39:52.144057 containerd[1519]: time="2026-04-24T23:39:52.144026216Z" level=info msg="CreateContainer within sandbox \"12972b834c644ffc4dfeccdd68eb94f265be63636e5267742a80f25544890ee2\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:1,} returns container id \"522b4ec669a3be3438b74d932601d2d12c2efb66abe291e6ec5a901b62dca56e\""
Apr 24 23:39:52.144755 containerd[1519]: time="2026-04-24T23:39:52.144534353Z" level=info msg="StartContainer for \"522b4ec669a3be3438b74d932601d2d12c2efb66abe291e6ec5a901b62dca56e\""
Apr 24 23:39:52.171948 systemd[1]: run-containerd-runc-k8s.io-522b4ec669a3be3438b74d932601d2d12c2efb66abe291e6ec5a901b62dca56e-runc.zuCp4Q.mount: Deactivated successfully.
Apr 24 23:39:52.182524 systemd[1]: Started cri-containerd-522b4ec669a3be3438b74d932601d2d12c2efb66abe291e6ec5a901b62dca56e.scope - libcontainer container 522b4ec669a3be3438b74d932601d2d12c2efb66abe291e6ec5a901b62dca56e.
Apr 24 23:39:52.216355 containerd[1519]: time="2026-04-24T23:39:52.215804626Z" level=info msg="StartContainer for \"522b4ec669a3be3438b74d932601d2d12c2efb66abe291e6ec5a901b62dca56e\" returns successfully"
Apr 24 23:39:54.579791 systemd[1]: cri-containerd-3e9b418cd350a1dcaa3972d87165dceb602b839139f4f5a650fd1a76d54fce4e.scope: Deactivated successfully.
Apr 24 23:39:54.580659 systemd[1]: cri-containerd-3e9b418cd350a1dcaa3972d87165dceb602b839139f4f5a650fd1a76d54fce4e.scope: Consumed 1.832s CPU time, 14.9M memory peak, 0B memory swap peak.
Apr 24 23:39:54.611734 containerd[1519]: time="2026-04-24T23:39:54.611625687Z" level=info msg="shim disconnected" id=3e9b418cd350a1dcaa3972d87165dceb602b839139f4f5a650fd1a76d54fce4e namespace=k8s.io
Apr 24 23:39:54.611734 containerd[1519]: time="2026-04-24T23:39:54.611699321Z" level=warning msg="cleaning up after shim disconnected" id=3e9b418cd350a1dcaa3972d87165dceb602b839139f4f5a650fd1a76d54fce4e namespace=k8s.io
Apr 24 23:39:54.611734 containerd[1519]: time="2026-04-24T23:39:54.611707021Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Apr 24 23:39:54.613924 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-3e9b418cd350a1dcaa3972d87165dceb602b839139f4f5a650fd1a76d54fce4e-rootfs.mount: Deactivated successfully.
Apr 24 23:39:55.137070 kubelet[2565]: I0424 23:39:55.137015    2565 scope.go:122] "RemoveContainer" containerID="3e9b418cd350a1dcaa3972d87165dceb602b839139f4f5a650fd1a76d54fce4e"
Apr 24 23:39:55.139692 containerd[1519]: time="2026-04-24T23:39:55.139565656Z" level=info msg="CreateContainer within sandbox \"29ee7268bed005196cc033ed56dec09f63c55d8e1ce8a38599084e6f85750a7a\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:1,}"
Apr 24 23:39:55.155214 containerd[1519]: time="2026-04-24T23:39:55.155170313Z" level=info msg="CreateContainer within sandbox \"29ee7268bed005196cc033ed56dec09f63c55d8e1ce8a38599084e6f85750a7a\" for &ContainerMetadata{Name:kube-scheduler,Attempt:1,} returns container id \"2e504a4906d40dad7b160c71aa1fd1409acf5f2e1336ad38fc67409df92a26a2\""
Apr 24 23:39:55.155826 containerd[1519]: time="2026-04-24T23:39:55.155753005Z" level=info msg="StartContainer for \"2e504a4906d40dad7b160c71aa1fd1409acf5f2e1336ad38fc67409df92a26a2\""
Apr 24 23:39:55.189539 systemd[1]: Started cri-containerd-2e504a4906d40dad7b160c71aa1fd1409acf5f2e1336ad38fc67409df92a26a2.scope - libcontainer container 2e504a4906d40dad7b160c71aa1fd1409acf5f2e1336ad38fc67409df92a26a2.
Apr 24 23:39:55.219640 containerd[1519]: time="2026-04-24T23:39:55.219430843Z" level=info msg="StartContainer for \"2e504a4906d40dad7b160c71aa1fd1409acf5f2e1336ad38fc67409df92a26a2\" returns successfully"
Apr 24 23:39:58.421039 systemd[1]: cri-containerd-6f08273bf5c9b2955298e70a57830bb6626babf12976de775ddda3730e86aea1.scope: Deactivated successfully.
Apr 24 23:39:58.443852 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-6f08273bf5c9b2955298e70a57830bb6626babf12976de775ddda3730e86aea1-rootfs.mount: Deactivated successfully.
Apr 24 23:39:58.447040 containerd[1519]: time="2026-04-24T23:39:58.446964856Z" level=info msg="shim disconnected" id=6f08273bf5c9b2955298e70a57830bb6626babf12976de775ddda3730e86aea1 namespace=k8s.io
Apr 24 23:39:58.447040 containerd[1519]: time="2026-04-24T23:39:58.447018519Z" level=warning msg="cleaning up after shim disconnected" id=6f08273bf5c9b2955298e70a57830bb6626babf12976de775ddda3730e86aea1 namespace=k8s.io
Apr 24 23:39:58.447040 containerd[1519]: time="2026-04-24T23:39:58.447027630Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Apr 24 23:39:58.459951 containerd[1519]: time="2026-04-24T23:39:58.459907094Z" level=warning msg="cleanup warnings time=\"2026-04-24T23:39:58Z\" level=warning msg=\"failed to remove runc container\" error=\"runc did not terminate successfully: exit status 255: \" runtime=io.containerd.runc.v2\n" namespace=k8s.io
Apr 24 23:39:59.154389 kubelet[2565]: I0424 23:39:59.153617    2565 scope.go:122] "RemoveContainer" containerID="182efefe1d594187d82e88a3b2a8af6b9d13f97480ae0696c2569614f57738e7"
Apr 24 23:39:59.154389 kubelet[2565]: I0424 23:39:59.154095    2565 scope.go:122] "RemoveContainer" containerID="6f08273bf5c9b2955298e70a57830bb6626babf12976de775ddda3730e86aea1"
Apr 24 23:39:59.156927 containerd[1519]: time="2026-04-24T23:39:59.156526246Z" level=info msg="RemoveContainer for \"182efefe1d594187d82e88a3b2a8af6b9d13f97480ae0696c2569614f57738e7\""
Apr 24 23:39:59.157043 kubelet[2565]: E0424 23:39:59.156920    2565 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tigera-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=tigera-operator pod=tigera-operator-6cf4cccc57-2gkdq_tigera-operator(614611a9-22cd-424e-999a-1cb3e3f2aa5f)\"" pod="tigera-operator/tigera-operator-6cf4cccc57-2gkdq" podUID="614611a9-22cd-424e-999a-1cb3e3f2aa5f"
Apr 24 23:39:59.162276 containerd[1519]: time="2026-04-24T23:39:59.162157230Z" level=info msg="RemoveContainer for \"182efefe1d594187d82e88a3b2a8af6b9d13f97480ae0696c2569614f57738e7\" returns successfully"
Apr 24 23:39:59.999636 kubelet[2565]: E0424 23:39:59.999596    2565 controller.go:251] "Failed to update lease" err="the server was unable to return a response in the time allotted, but may still be processing the request (put leases.coordination.k8s.io ci-4081-3-6-n-6f01bfed3c)"
Apr 24 23:40:04.815043 systemd[1]: run-containerd-runc-k8s.io-288b854e5aba84cc71d85d96252105d669ecb68d4d2f151b6ac021b9e95eee4e-runc.jFAy4e.mount: Deactivated successfully.