Apr 24 23:34:12.900672 kernel: Booting Linux on physical CPU 0x0000000000 [0x413fd0c1]
Apr 24 23:34:12.900702 kernel: Linux version 6.6.127-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 13.3.1_p20240614 p17) 13.3.1 20240614, GNU ld (Gentoo 2.42 p3) 2.42.0) #1 SMP PREEMPT Fri Apr 24 22:19:35 -00 2026
Apr 24 23:34:12.900712 kernel: KASLR enabled
Apr 24 23:34:12.900718 kernel: efi: EFI v2.7 by Ubuntu distribution of EDK II
Apr 24 23:34:12.900724 kernel: efi: SMBIOS 3.0=0x139ed0000 MEMATTR=0x138595418 ACPI 2.0=0x136760018 RNG=0x13676e918 MEMRESERVE=0x136b43d18
Apr 24 23:34:12.900730 kernel: random: crng init done
Apr 24 23:34:12.900737 kernel: ACPI: Early table checksum verification disabled
Apr 24 23:34:12.900742 kernel: ACPI: RSDP 0x0000000136760018 000024 (v02 BOCHS )
Apr 24 23:34:12.900749 kernel: ACPI: XSDT 0x000000013676FE98 00006C (v01 BOCHS BXPC 00000001 01000013)
Apr 24 23:34:12.900756 kernel: ACPI: FACP 0x000000013676FA98 000114 (v06 BOCHS BXPC 00000001 BXPC 00000001)
Apr 24 23:34:12.900763 kernel: ACPI: DSDT 0x0000000136767518 001468 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Apr 24 23:34:12.900769 kernel: ACPI: APIC 0x000000013676FC18 000108 (v04 BOCHS BXPC 00000001 BXPC 00000001)
Apr 24 23:34:12.900774 kernel: ACPI: PPTT 0x000000013676FD98 000060 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Apr 24 23:34:12.900780 kernel: ACPI: GTDT 0x000000013676D898 000060 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Apr 24 23:34:12.900788 kernel: ACPI: MCFG 0x000000013676FF98 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Apr 24 23:34:12.900795 kernel: ACPI: SPCR 0x000000013676E818 000050 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Apr 24 23:34:12.900802 kernel: ACPI: DBG2 0x000000013676E898 000057 (v00 BOCHS BXPC 00000001 BXPC 00000001)
Apr 24 23:34:12.900808 kernel: ACPI: IORT 0x000000013676E418 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Apr 24 23:34:12.900815 kernel: ACPI: BGRT 0x000000013676E798 000038 (v01 INTEL EDK2 00000002 01000013)
Apr 24 23:34:12.900821 kernel: ACPI: SPCR: console: pl011,mmio32,0x9000000,9600
Apr 24 23:34:12.900827 kernel: NUMA: Failed to initialise from firmware
Apr 24 23:34:12.900834 kernel: NUMA: Faking a node at [mem 0x0000000040000000-0x0000000139ffffff]
Apr 24 23:34:12.900840 kernel: NUMA: NODE_DATA [mem 0x139671800-0x139676fff]
Apr 24 23:34:12.900846 kernel: Zone ranges:
Apr 24 23:34:12.900852 kernel:   DMA      [mem 0x0000000040000000-0x00000000ffffffff]
Apr 24 23:34:12.900860 kernel:   DMA32    empty
Apr 24 23:34:12.900866 kernel:   Normal   [mem 0x0000000100000000-0x0000000139ffffff]
Apr 24 23:34:12.900873 kernel: Movable zone start for each node
Apr 24 23:34:12.900879 kernel: Early memory node ranges
Apr 24 23:34:12.900886 kernel:   node   0: [mem 0x0000000040000000-0x000000013676ffff]
Apr 24 23:34:12.900892 kernel:   node   0: [mem 0x0000000136770000-0x0000000136b3ffff]
Apr 24 23:34:12.900899 kernel:   node   0: [mem 0x0000000136b40000-0x0000000139e1ffff]
Apr 24 23:34:12.900917 kernel:   node   0: [mem 0x0000000139e20000-0x0000000139eaffff]
Apr 24 23:34:12.900924 kernel:   node   0: [mem 0x0000000139eb0000-0x0000000139ebffff]
Apr 24 23:34:12.900930 kernel:   node   0: [mem 0x0000000139ec0000-0x0000000139fdffff]
Apr 24 23:34:12.900937 kernel:   node   0: [mem 0x0000000139fe0000-0x0000000139ffffff]
Apr 24 23:34:12.900943 kernel: Initmem setup node 0 [mem 0x0000000040000000-0x0000000139ffffff]
Apr 24 23:34:12.900952 kernel: On node 0, zone Normal: 24576 pages in unavailable ranges
Apr 24 23:34:12.900958 kernel: psci: probing for conduit method from ACPI.
Apr 24 23:34:12.900965 kernel: psci: PSCIv1.1 detected in firmware.
Apr 24 23:34:12.900974 kernel: psci: Using standard PSCI v0.2 function IDs
Apr 24 23:34:12.900980 kernel: psci: Trusted OS migration not required
Apr 24 23:34:12.900987 kernel: psci: SMC Calling Convention v1.1
Apr 24 23:34:12.900995 kernel: smccc: KVM: hypervisor services detected (0x00000000 0x00000000 0x00000000 0x00000003)
Apr 24 23:34:12.901002 kernel: percpu: Embedded 30 pages/cpu s85736 r8192 d28952 u122880
Apr 24 23:34:12.901008 kernel: pcpu-alloc: s85736 r8192 d28952 u122880 alloc=30*4096
Apr 24 23:34:12.901015 kernel: pcpu-alloc: [0] 0 [0] 1
Apr 24 23:34:12.901022 kernel: Detected PIPT I-cache on CPU0
Apr 24 23:34:12.901029 kernel: CPU features: detected: GIC system register CPU interface
Apr 24 23:34:12.901036 kernel: CPU features: detected: Hardware dirty bit management
Apr 24 23:34:12.901042 kernel: CPU features: detected: Spectre-v4
Apr 24 23:34:12.901049 kernel: CPU features: detected: Spectre-BHB
Apr 24 23:34:12.901055 kernel: CPU features: kernel page table isolation forced ON by KASLR
Apr 24 23:34:12.901064 kernel: CPU features: detected: Kernel page table isolation (KPTI)
Apr 24 23:34:12.901071 kernel: CPU features: detected: ARM erratum 1418040
Apr 24 23:34:12.901077 kernel: CPU features: detected: SSBS not fully self-synchronizing
Apr 24 23:34:12.901084 kernel: alternatives: applying boot alternatives
Apr 24 23:34:12.901091 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=hetzner verity.usrhash=63304dd98a277d4592d17e0085ae3f91ca70cc8ec6dedfdd357a1e9755f9a8b3
Apr 24 23:34:12.901099 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Apr 24 23:34:12.901105 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Apr 24 23:34:12.901112 kernel: Fallback order for Node 0: 0
Apr 24 23:34:12.901133 kernel: Built 1 zonelists, mobility grouping on.  Total pages: 1008000
Apr 24 23:34:12.901140 kernel: Policy zone: Normal
Apr 24 23:34:12.901147 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Apr 24 23:34:12.901156 kernel: software IO TLB: area num 2.
Apr 24 23:34:12.901163 kernel: software IO TLB: mapped [mem 0x00000000fbfff000-0x00000000fffff000] (64MB)
Apr 24 23:34:12.901170 kernel: Memory: 3882824K/4096000K available (10304K kernel code, 2180K rwdata, 8116K rodata, 39424K init, 897K bss, 213176K reserved, 0K cma-reserved)
Apr 24 23:34:12.901177 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1
Apr 24 23:34:12.901183 kernel: rcu: Preemptible hierarchical RCU implementation.
Apr 24 23:34:12.901191 kernel: rcu: RCU event tracing is enabled.
Apr 24 23:34:12.901198 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2.
Apr 24 23:34:12.901205 kernel: 	Trampoline variant of Tasks RCU enabled.
Apr 24 23:34:12.901212 kernel: 	Tracing variant of Tasks RCU enabled.
Apr 24 23:34:12.901219 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Apr 24 23:34:12.901226 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2
Apr 24 23:34:12.901233 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0
Apr 24 23:34:12.901242 kernel: GICv3: 256 SPIs implemented
Apr 24 23:34:12.901248 kernel: GICv3: 0 Extended SPIs implemented
Apr 24 23:34:12.901255 kernel: Root IRQ handler: gic_handle_irq
Apr 24 23:34:12.901261 kernel: GICv3: GICv3 features: 16 PPIs, DirectLPI
Apr 24 23:34:12.901268 kernel: GICv3: CPU0: found redistributor 0 region 0:0x00000000080a0000
Apr 24 23:34:12.901275 kernel: ITS [mem 0x08080000-0x0809ffff]
Apr 24 23:34:12.901282 kernel: ITS@0x0000000008080000: allocated 8192 Devices @1000c0000 (indirect, esz 8, psz 64K, shr 1)
Apr 24 23:34:12.901289 kernel: ITS@0x0000000008080000: allocated 8192 Interrupt Collections @1000d0000 (flat, esz 8, psz 64K, shr 1)
Apr 24 23:34:12.901296 kernel: GICv3: using LPI property table @0x00000001000e0000
Apr 24 23:34:12.901303 kernel: GICv3: CPU0: using allocated LPI pending table @0x00000001000f0000
Apr 24 23:34:12.901309 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Apr 24 23:34:12.901317 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Apr 24 23:34:12.901324 kernel: arch_timer: cp15 timer(s) running at 25.00MHz (virt).
Apr 24 23:34:12.901331 kernel: clocksource: arch_sys_counter: mask: 0xffffffffffffff max_cycles: 0x5c40939b5, max_idle_ns: 440795202646 ns
Apr 24 23:34:12.901338 kernel: sched_clock: 56 bits at 25MHz, resolution 40ns, wraps every 4398046511100ns
Apr 24 23:34:12.901345 kernel: Console: colour dummy device 80x25
Apr 24 23:34:12.901352 kernel: ACPI: Core revision 20230628
Apr 24 23:34:12.901360 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 50.00 BogoMIPS (lpj=25000)
Apr 24 23:34:12.901367 kernel: pid_max: default: 32768 minimum: 301
Apr 24 23:34:12.901374 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,integrity
Apr 24 23:34:12.901381 kernel: landlock: Up and running.
Apr 24 23:34:12.901389 kernel: SELinux:  Initializing.
Apr 24 23:34:12.901397 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Apr 24 23:34:12.901404 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Apr 24 23:34:12.901411 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Apr 24 23:34:12.901418 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Apr 24 23:34:12.901425 kernel: rcu: Hierarchical SRCU implementation.
Apr 24 23:34:12.901432 kernel: rcu: 	Max phase no-delay instances is 400.
Apr 24 23:34:12.901439 kernel: Platform MSI: ITS@0x8080000 domain created
Apr 24 23:34:12.901446 kernel: PCI/MSI: ITS@0x8080000 domain created
Apr 24 23:34:12.901454 kernel: Remapping and enabling EFI services.
Apr 24 23:34:12.901461 kernel: smp: Bringing up secondary CPUs ...
Apr 24 23:34:12.901468 kernel: Detected PIPT I-cache on CPU1
Apr 24 23:34:12.901475 kernel: GICv3: CPU1: found redistributor 1 region 0:0x00000000080c0000
Apr 24 23:34:12.901482 kernel: GICv3: CPU1: using allocated LPI pending table @0x0000000100100000
Apr 24 23:34:12.901489 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Apr 24 23:34:12.901496 kernel: CPU1: Booted secondary processor 0x0000000001 [0x413fd0c1]
Apr 24 23:34:12.901503 kernel: smp: Brought up 1 node, 2 CPUs
Apr 24 23:34:12.901510 kernel: SMP: Total of 2 processors activated.
Apr 24 23:34:12.901519 kernel: CPU features: detected: 32-bit EL0 Support
Apr 24 23:34:12.901526 kernel: CPU features: detected: Data cache clean to the PoU not required for I/D coherence
Apr 24 23:34:12.901533 kernel: CPU features: detected: Common not Private translations
Apr 24 23:34:12.901546 kernel: CPU features: detected: CRC32 instructions
Apr 24 23:34:12.901554 kernel: CPU features: detected: Enhanced Virtualization Traps
Apr 24 23:34:12.901562 kernel: CPU features: detected: RCpc load-acquire (LDAPR)
Apr 24 23:34:12.901569 kernel: CPU features: detected: LSE atomic instructions
Apr 24 23:34:12.901577 kernel: CPU features: detected: Privileged Access Never
Apr 24 23:34:12.901585 kernel: CPU features: detected: RAS Extension Support
Apr 24 23:34:12.901594 kernel: CPU features: detected: Speculative Store Bypassing Safe (SSBS)
Apr 24 23:34:12.901601 kernel: CPU: All CPU(s) started at EL1
Apr 24 23:34:12.901608 kernel: alternatives: applying system-wide alternatives
Apr 24 23:34:12.901615 kernel: devtmpfs: initialized
Apr 24 23:34:12.901623 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Apr 24 23:34:12.901631 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear)
Apr 24 23:34:12.901638 kernel: pinctrl core: initialized pinctrl subsystem
Apr 24 23:34:12.901645 kernel: SMBIOS 3.0.0 present.
Apr 24 23:34:12.901654 kernel: DMI: Hetzner vServer/KVM Virtual Machine, BIOS 20171111 11/11/2017
Apr 24 23:34:12.901662 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Apr 24 23:34:12.901670 kernel: DMA: preallocated 512 KiB GFP_KERNEL pool for atomic allocations
Apr 24 23:34:12.901677 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Apr 24 23:34:12.901685 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Apr 24 23:34:12.901692 kernel: audit: initializing netlink subsys (disabled)
Apr 24 23:34:12.901699 kernel: audit: type=2000 audit(0.012:1): state=initialized audit_enabled=0 res=1
Apr 24 23:34:12.901707 kernel: thermal_sys: Registered thermal governor 'step_wise'
Apr 24 23:34:12.901714 kernel: cpuidle: using governor menu
Apr 24 23:34:12.901723 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers.
Apr 24 23:34:12.901730 kernel: ASID allocator initialised with 32768 entries
Apr 24 23:34:12.901738 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Apr 24 23:34:12.901745 kernel: Serial: AMBA PL011 UART driver
Apr 24 23:34:12.901753 kernel: Modules: 2G module region forced by RANDOMIZE_MODULE_REGION_FULL
Apr 24 23:34:12.901760 kernel: Modules: 0 pages in range for non-PLT usage
Apr 24 23:34:12.901767 kernel: Modules: 509008 pages in range for PLT usage
Apr 24 23:34:12.901775 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Apr 24 23:34:12.901782 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page
Apr 24 23:34:12.901792 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages
Apr 24 23:34:12.901799 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page
Apr 24 23:34:12.901806 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Apr 24 23:34:12.901813 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page
Apr 24 23:34:12.901821 kernel: HugeTLB: registered 64.0 KiB page size, pre-allocated 0 pages
Apr 24 23:34:12.901828 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page
Apr 24 23:34:12.901836 kernel: ACPI: Added _OSI(Module Device)
Apr 24 23:34:12.901843 kernel: ACPI: Added _OSI(Processor Device)
Apr 24 23:34:12.901850 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Apr 24 23:34:12.901859 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Apr 24 23:34:12.901867 kernel: ACPI: Interpreter enabled
Apr 24 23:34:12.901874 kernel: ACPI: Using GIC for interrupt routing
Apr 24 23:34:12.901881 kernel: ACPI: MCFG table detected, 1 entries
Apr 24 23:34:12.901889 kernel: ARMH0011:00: ttyAMA0 at MMIO 0x9000000 (irq = 12, base_baud = 0) is a SBSA
Apr 24 23:34:12.901896 kernel: printk: console [ttyAMA0] enabled
Apr 24 23:34:12.901932 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Apr 24 23:34:12.902157 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Apr 24 23:34:12.902254 kernel: acpi PNP0A08:00: _OSC: platform does not support [LTR]
Apr 24 23:34:12.902323 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME AER PCIeCapability]
Apr 24 23:34:12.902391 kernel: acpi PNP0A08:00: ECAM area [mem 0x4010000000-0x401fffffff] reserved by PNP0C02:00
Apr 24 23:34:12.902458 kernel: acpi PNP0A08:00: ECAM at [mem 0x4010000000-0x401fffffff] for [bus 00-ff]
Apr 24 23:34:12.902471 kernel: ACPI: Remapped I/O 0x000000003eff0000 to [io 0x0000-0xffff window]
Apr 24 23:34:12.902479 kernel: PCI host bridge to bus 0000:00
Apr 24 23:34:12.902562 kernel: pci_bus 0000:00: root bus resource [mem 0x10000000-0x3efeffff window]
Apr 24 23:34:12.902629 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0xffff window]
Apr 24 23:34:12.904293 kernel: pci_bus 0000:00: root bus resource [mem 0x8000000000-0xffffffffff window]
Apr 24 23:34:12.904394 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Apr 24 23:34:12.904494 kernel: pci 0000:00:00.0: [1b36:0008] type 00 class 0x060000
Apr 24 23:34:12.904587 kernel: pci 0000:00:01.0: [1af4:1050] type 00 class 0x038000
Apr 24 23:34:12.904660 kernel: pci 0000:00:01.0: reg 0x14: [mem 0x11289000-0x11289fff]
Apr 24 23:34:12.904741 kernel: pci 0000:00:01.0: reg 0x20: [mem 0x8000600000-0x8000603fff 64bit pref]
Apr 24 23:34:12.904825 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400
Apr 24 23:34:12.904929 kernel: pci 0000:00:02.0: reg 0x10: [mem 0x11288000-0x11288fff]
Apr 24 23:34:12.905026 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400
Apr 24 23:34:12.905104 kernel: pci 0000:00:02.1: reg 0x10: [mem 0x11287000-0x11287fff]
Apr 24 23:34:12.905372 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400
Apr 24 23:34:12.905453 kernel: pci 0000:00:02.2: reg 0x10: [mem 0x11286000-0x11286fff]
Apr 24 23:34:12.905547 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400
Apr 24 23:34:12.905623 kernel: pci 0000:00:02.3: reg 0x10: [mem 0x11285000-0x11285fff]
Apr 24 23:34:12.906288 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400
Apr 24 23:34:12.906374 kernel: pci 0000:00:02.4: reg 0x10: [mem 0x11284000-0x11284fff]
Apr 24 23:34:12.906448 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400
Apr 24 23:34:12.906514 kernel: pci 0000:00:02.5: reg 0x10: [mem 0x11283000-0x11283fff]
Apr 24 23:34:12.906596 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400
Apr 24 23:34:12.906662 kernel: pci 0000:00:02.6: reg 0x10: [mem 0x11282000-0x11282fff]
Apr 24 23:34:12.906734 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400
Apr 24 23:34:12.906805 kernel: pci 0000:00:02.7: reg 0x10: [mem 0x11281000-0x11281fff]
Apr 24 23:34:12.906878 kernel: pci 0000:00:03.0: [1b36:000c] type 01 class 0x060400
Apr 24 23:34:12.906993 kernel: pci 0000:00:03.0: reg 0x10: [mem 0x11280000-0x11280fff]
Apr 24 23:34:12.907090 kernel: pci 0000:00:04.0: [1b36:0002] type 00 class 0x070002
Apr 24 23:34:12.907185 kernel: pci 0000:00:04.0: reg 0x10: [io 0x0000-0x0007]
Apr 24 23:34:12.907278 kernel: pci 0000:01:00.0: [1af4:1041] type 00 class 0x020000
Apr 24 23:34:12.907362 kernel: pci 0000:01:00.0: reg 0x14: [mem 0x11000000-0x11000fff]
Apr 24 23:34:12.907445 kernel: pci 0000:01:00.0: reg 0x20: [mem 0x8000000000-0x8000003fff 64bit pref]
Apr 24 23:34:12.907518 kernel: pci 0000:01:00.0: reg 0x30: [mem 0xfff80000-0xffffffff pref]
Apr 24 23:34:12.907600 kernel: pci 0000:02:00.0: [1b36:000d] type 00 class 0x0c0330
Apr 24 23:34:12.907690 kernel: pci 0000:02:00.0: reg 0x10: [mem 0x10e00000-0x10e03fff 64bit]
Apr 24 23:34:12.907774 kernel: pci 0000:03:00.0: [1af4:1043] type 00 class 0x078000
Apr 24 23:34:12.907852 kernel: pci 0000:03:00.0: reg 0x14: [mem 0x10c00000-0x10c00fff]
Apr 24 23:34:12.907937 kernel: pci 0000:03:00.0: reg 0x20: [mem 0x8000100000-0x8000103fff 64bit pref]
Apr 24 23:34:12.908023 kernel: pci 0000:04:00.0: [1af4:1045] type 00 class 0x00ff00
Apr 24 23:34:12.908094 kernel: pci 0000:04:00.0: reg 0x20: [mem 0x8000200000-0x8000203fff 64bit pref]
Apr 24 23:34:12.910762 kernel: pci 0000:05:00.0: [1af4:1044] type 00 class 0x00ff00
Apr 24 23:34:12.910859 kernel: pci 0000:05:00.0: reg 0x14: [mem 0x10800000-0x10800fff]
Apr 24 23:34:12.910952 kernel: pci 0000:05:00.0: reg 0x20: [mem 0x8000300000-0x8000303fff 64bit pref]
Apr 24 23:34:12.911043 kernel: pci 0000:06:00.0: [1af4:1048] type 00 class 0x010000
Apr 24 23:34:12.911137 kernel: pci 0000:06:00.0: reg 0x14: [mem 0x10600000-0x10600fff]
Apr 24 23:34:12.911214 kernel: pci 0000:06:00.0: reg 0x20: [mem 0x8000400000-0x8000403fff 64bit pref]
Apr 24 23:34:12.911307 kernel: pci 0000:07:00.0: [1af4:1041] type 00 class 0x020000
Apr 24 23:34:12.911378 kernel: pci 0000:07:00.0: reg 0x14: [mem 0x10400000-0x10400fff]
Apr 24 23:34:12.911446 kernel: pci 0000:07:00.0: reg 0x20: [mem 0x8000500000-0x8000503fff 64bit pref]
Apr 24 23:34:12.911513 kernel: pci 0000:07:00.0: reg 0x30: [mem 0xfff80000-0xffffffff pref]
Apr 24 23:34:12.911585 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x0fff] to [bus 01] add_size 1000
Apr 24 23:34:12.911651 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 01] add_size 100000 add_align 100000
Apr 24 23:34:12.911719 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x001fffff] to [bus 01] add_size 100000 add_align 100000
Apr 24 23:34:12.911795 kernel: pci 0000:00:02.1: bridge window [io 0x1000-0x0fff] to [bus 02] add_size 1000
Apr 24 23:34:12.911862 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 02] add_size 200000 add_align 100000
Apr 24 23:34:12.911970 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x001fffff] to [bus 02] add_size 100000 add_align 100000
Apr 24 23:34:12.912049 kernel: pci 0000:00:02.2: bridge window [io 0x1000-0x0fff] to [bus 03] add_size 1000
Apr 24 23:34:12.912136 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 03] add_size 100000 add_align 100000
Apr 24 23:34:12.912209 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x001fffff] to [bus 03] add_size 100000 add_align 100000
Apr 24 23:34:12.912282 kernel: pci 0000:00:02.3: bridge window [io 0x1000-0x0fff] to [bus 04] add_size 1000
Apr 24 23:34:12.912350 kernel: pci 0000:00:02.3: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 04] add_size 100000 add_align 100000
Apr 24 23:34:12.912422 kernel: pci 0000:00:02.3: bridge window [mem 0x00100000-0x000fffff] to [bus 04] add_size 200000 add_align 100000
Apr 24 23:34:12.912495 kernel: pci 0000:00:02.4: bridge window [io 0x1000-0x0fff] to [bus 05] add_size 1000
Apr 24 23:34:12.912561 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 05] add_size 100000 add_align 100000
Apr 24 23:34:12.912625 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x001fffff] to [bus 05] add_size 100000 add_align 100000
Apr 24 23:34:12.912695 kernel: pci 0000:00:02.5: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000
Apr 24 23:34:12.912760 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 06] add_size 100000 add_align 100000
Apr 24 23:34:12.912826 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x001fffff] to [bus 06] add_size 100000 add_align 100000
Apr 24 23:34:12.912899 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000
Apr 24 23:34:12.912986 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 07] add_size 100000 add_align 100000
Apr 24 23:34:12.913054 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x001fffff] to [bus 07] add_size 100000 add_align 100000
Apr 24 23:34:12.913136 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000
Apr 24 23:34:12.913207 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 08] add_size 200000 add_align 100000
Apr 24 23:34:12.913273 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff] to [bus 08] add_size 200000 add_align 100000
Apr 24 23:34:12.913345 kernel: pci 0000:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000
Apr 24 23:34:12.913417 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 09] add_size 200000 add_align 100000
Apr 24 23:34:12.913482 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff] to [bus 09] add_size 200000 add_align 100000
Apr 24 23:34:12.913550 kernel: pci 0000:00:02.0: BAR 14: assigned [mem 0x10000000-0x101fffff]
Apr 24 23:34:12.913616 kernel: pci 0000:00:02.0: BAR 15: assigned [mem 0x8000000000-0x80001fffff 64bit pref]
Apr 24 23:34:12.913685 kernel: pci 0000:00:02.1: BAR 14: assigned [mem 0x10200000-0x103fffff]
Apr 24 23:34:12.913751 kernel: pci 0000:00:02.1: BAR 15: assigned [mem 0x8000200000-0x80003fffff 64bit pref]
Apr 24 23:34:12.913818 kernel: pci 0000:00:02.2: BAR 14: assigned [mem 0x10400000-0x105fffff]
Apr 24 23:34:12.913887 kernel: pci 0000:00:02.2: BAR 15: assigned [mem 0x8000400000-0x80005fffff 64bit pref]
Apr 24 23:34:12.913999 kernel: pci 0000:00:02.3: BAR 14: assigned [mem 0x10600000-0x107fffff]
Apr 24 23:34:12.914071 kernel: pci 0000:00:02.3: BAR 15: assigned [mem 0x8000600000-0x80007fffff 64bit pref]
Apr 24 23:34:12.914151 kernel: pci 0000:00:02.4: BAR 14: assigned [mem 0x10800000-0x109fffff]
Apr 24 23:34:12.914220 kernel: pci 0000:00:02.4: BAR 15: assigned [mem 0x8000800000-0x80009fffff 64bit pref]
Apr 24 23:34:12.914287 kernel: pci 0000:00:02.5: BAR 14: assigned [mem 0x10a00000-0x10bfffff]
Apr 24 23:34:12.914354 kernel: pci 0000:00:02.5: BAR 15: assigned [mem 0x8000a00000-0x8000bfffff 64bit pref]
Apr 24 23:34:12.914426 kernel: pci 0000:00:02.6: BAR 14: assigned [mem 0x10c00000-0x10dfffff]
Apr 24 23:34:12.914492 kernel: pci 0000:00:02.6: BAR 15: assigned [mem 0x8000c00000-0x8000dfffff 64bit pref]
Apr 24 23:34:12.914558 kernel: pci 0000:00:02.7: BAR 14: assigned [mem 0x10e00000-0x10ffffff]
Apr 24 23:34:12.914622 kernel: pci 0000:00:02.7: BAR 15: assigned [mem 0x8000e00000-0x8000ffffff 64bit pref]
Apr 24 23:34:12.914688 kernel: pci 0000:00:03.0: BAR 14: assigned [mem 0x11000000-0x111fffff]
Apr 24 23:34:12.914754 kernel: pci 0000:00:03.0: BAR 15: assigned [mem 0x8001000000-0x80011fffff 64bit pref]
Apr 24 23:34:12.914825 kernel: pci 0000:00:01.0: BAR 4: assigned [mem 0x8001200000-0x8001203fff 64bit pref]
Apr 24 23:34:12.914894 kernel: pci 0000:00:01.0: BAR 1: assigned [mem 0x11200000-0x11200fff]
Apr 24 23:34:12.914976 kernel: pci 0000:00:02.0: BAR 0: assigned [mem 0x11201000-0x11201fff]
Apr 24 23:34:12.915046 kernel: pci 0000:00:02.0: BAR 13: assigned [io 0x1000-0x1fff]
Apr 24 23:34:12.915147 kernel: pci 0000:00:02.1: BAR 0: assigned [mem 0x11202000-0x11202fff]
Apr 24 23:34:12.915223 kernel: pci 0000:00:02.1: BAR 13: assigned [io 0x2000-0x2fff]
Apr 24 23:34:12.915293 kernel: pci 0000:00:02.2: BAR 0: assigned [mem 0x11203000-0x11203fff]
Apr 24 23:34:12.915360 kernel: pci 0000:00:02.2: BAR 13: assigned [io 0x3000-0x3fff]
Apr 24 23:34:12.915432 kernel: pci 0000:00:02.3: BAR 0: assigned [mem 0x11204000-0x11204fff]
Apr 24 23:34:12.915504 kernel: pci 0000:00:02.3: BAR 13: assigned [io 0x4000-0x4fff]
Apr 24 23:34:12.915572 kernel: pci 0000:00:02.4: BAR 0: assigned [mem 0x11205000-0x11205fff]
Apr 24 23:34:12.915637 kernel: pci 0000:00:02.4: BAR 13: assigned [io 0x5000-0x5fff]
Apr 24 23:34:12.915707 kernel: pci 0000:00:02.5: BAR 0: assigned [mem 0x11206000-0x11206fff]
Apr 24 23:34:12.915774 kernel: pci 0000:00:02.5: BAR 13: assigned [io 0x6000-0x6fff]
Apr 24 23:34:12.915842 kernel: pci 0000:00:02.6: BAR 0: assigned [mem 0x11207000-0x11207fff]
Apr 24 23:34:12.915917 kernel: pci 0000:00:02.6: BAR 13: assigned [io 0x7000-0x7fff]
Apr 24 23:34:12.915990 kernel: pci 0000:00:02.7: BAR 0: assigned [mem 0x11208000-0x11208fff]
Apr 24 23:34:12.916061 kernel: pci 0000:00:02.7: BAR 13: assigned [io 0x8000-0x8fff]
Apr 24 23:34:12.916208 kernel: pci 0000:00:03.0: BAR 0: assigned [mem 0x11209000-0x11209fff]
Apr 24 23:34:12.916281 kernel: pci 0000:00:03.0: BAR 13: assigned [io 0x9000-0x9fff]
Apr 24 23:34:12.916350 kernel: pci 0000:00:04.0: BAR 0: assigned [io 0xa000-0xa007]
Apr 24 23:34:12.916422 kernel: pci 0000:01:00.0: BAR 6: assigned [mem 0x10000000-0x1007ffff pref]
Apr 24 23:34:12.916488 kernel: pci 0000:01:00.0: BAR 4: assigned [mem 0x8000000000-0x8000003fff 64bit pref]
Apr 24 23:34:12.916553 kernel: pci 0000:01:00.0: BAR 1: assigned [mem 0x10080000-0x10080fff]
Apr 24 23:34:12.916621 kernel: pci 0000:00:02.0: PCI bridge to [bus 01]
Apr 24 23:34:12.916690 kernel: pci 0000:00:02.0:   bridge window [io 0x1000-0x1fff]
Apr 24 23:34:12.916754 kernel: pci 0000:00:02.0:   bridge window [mem 0x10000000-0x101fffff]
Apr 24 23:34:12.916816 kernel: pci 0000:00:02.0:   bridge window [mem 0x8000000000-0x80001fffff 64bit pref]
Apr 24 23:34:12.916886 kernel: pci 0000:02:00.0: BAR 0: assigned [mem 0x10200000-0x10203fff 64bit]
Apr 24 23:34:12.916998 kernel: pci 0000:00:02.1: PCI bridge to [bus 02]
Apr 24 23:34:12.917066 kernel: pci 0000:00:02.1:   bridge window [io 0x2000-0x2fff]
Apr 24 23:34:12.919246 kernel: pci 0000:00:02.1:   bridge window [mem 0x10200000-0x103fffff]
Apr 24 23:34:12.919359 kernel: pci 0000:00:02.1:   bridge window [mem 0x8000200000-0x80003fffff 64bit pref]
Apr 24 23:34:12.919444 kernel: pci 0000:03:00.0: BAR 4: assigned [mem 0x8000400000-0x8000403fff 64bit pref]
Apr 24 23:34:12.919514 kernel: pci 0000:03:00.0: BAR 1: assigned [mem 0x10400000-0x10400fff]
Apr 24 23:34:12.919582 kernel: pci 0000:00:02.2: PCI bridge to [bus 03]
Apr 24 23:34:12.919648 kernel: pci 0000:00:02.2:   bridge window [io 0x3000-0x3fff]
Apr 24 23:34:12.919724 kernel: pci 0000:00:02.2:   bridge window [mem 0x10400000-0x105fffff]
Apr 24 23:34:12.919790 kernel: pci 0000:00:02.2:   bridge window [mem 0x8000400000-0x80005fffff 64bit pref]
Apr 24 23:34:12.919868 kernel: pci 0000:04:00.0: BAR 4: assigned [mem 0x8000600000-0x8000603fff 64bit pref]
Apr 24 23:34:12.919983 kernel: pci 0000:00:02.3: PCI bridge to [bus 04]
Apr 24 23:34:12.920058 kernel: pci 0000:00:02.3:   bridge window [io 0x4000-0x4fff]
Apr 24 23:34:12.920197 kernel: pci 0000:00:02.3:   bridge window [mem 0x10600000-0x107fffff]
Apr 24 23:34:12.920271 kernel: pci 0000:00:02.3:   bridge window [mem 0x8000600000-0x80007fffff 64bit pref]
Apr 24 23:34:12.920345 kernel: pci 0000:05:00.0: BAR 4: assigned [mem 0x8000800000-0x8000803fff 64bit pref]
Apr 24 23:34:12.920421 kernel: pci 0000:05:00.0: BAR 1: assigned [mem 0x10800000-0x10800fff]
Apr 24 23:34:12.920490 kernel: pci 0000:00:02.4: PCI bridge to [bus 05]
Apr 24 23:34:12.920555 kernel: pci 0000:00:02.4:   bridge window [io 0x5000-0x5fff]
Apr 24 23:34:12.920626 kernel: pci 0000:00:02.4:   bridge window [mem 0x10800000-0x109fffff]
Apr 24 23:34:12.920697 kernel: pci 0000:00:02.4:   bridge window [mem 0x8000800000-0x80009fffff 64bit pref]
Apr 24 23:34:12.920785 kernel: pci 0000:06:00.0: BAR 4: assigned [mem 0x8000a00000-0x8000a03fff 64bit pref]
Apr 24 23:34:12.920865 kernel: pci 0000:06:00.0: BAR 1: assigned [mem 0x10a00000-0x10a00fff]
Apr 24 23:34:12.920967 kernel: pci 0000:00:02.5: PCI bridge to [bus 06]
Apr 24 23:34:12.921045 kernel: pci 0000:00:02.5:   bridge window [io 0x6000-0x6fff]
Apr 24 23:34:12.921122 kernel: pci 0000:00:02.5:   bridge window [mem 0x10a00000-0x10bfffff]
Apr 24 23:34:12.921193 kernel: pci 0000:00:02.5:   bridge window [mem 0x8000a00000-0x8000bfffff 64bit pref]
Apr 24 23:34:12.921268 kernel: pci 0000:07:00.0: BAR 6: assigned [mem 0x10c00000-0x10c7ffff pref]
Apr 24 23:34:12.921346 kernel: pci 0000:07:00.0: BAR 4: assigned [mem 0x8000c00000-0x8000c03fff 64bit pref]
Apr 24 23:34:12.921433 kernel: pci 0000:07:00.0: BAR 1: assigned [mem 0x10c80000-0x10c80fff]
Apr 24 23:34:12.921522 kernel: pci 0000:00:02.6: PCI bridge to [bus 07]
Apr 24 23:34:12.921589 kernel: pci 0000:00:02.6:   bridge window [io 0x7000-0x7fff]
Apr 24 23:34:12.922303 kernel: pci 0000:00:02.6:   bridge window [mem 0x10c00000-0x10dfffff]
Apr 24 23:34:12.922406 kernel: pci 0000:00:02.6:   bridge window [mem 0x8000c00000-0x8000dfffff 64bit pref]
Apr 24 23:34:12.922483 kernel: pci 0000:00:02.7: PCI bridge to [bus 08]
Apr 24 23:34:12.922554 kernel: pci 0000:00:02.7:   bridge window [io 0x8000-0x8fff]
Apr 24 23:34:12.922625 kernel: pci 0000:00:02.7:   bridge window [mem 0x10e00000-0x10ffffff]
Apr 24 23:34:12.922694 kernel: pci 0000:00:02.7:   bridge window [mem 0x8000e00000-0x8000ffffff 64bit pref]
Apr 24 23:34:12.922768 kernel: pci 0000:00:03.0: PCI bridge to [bus 09]
Apr 24 23:34:12.922837 kernel: pci 0000:00:03.0:   bridge window [io 0x9000-0x9fff]
Apr 24 23:34:12.922936 kernel: pci 0000:00:03.0:   bridge window [mem 0x11000000-0x111fffff]
Apr 24 23:34:12.923019 kernel: pci 0000:00:03.0:   bridge window [mem 0x8001000000-0x80011fffff 64bit pref]
Apr 24 23:34:12.923094 kernel: pci_bus 0000:00: resource 4 [mem 0x10000000-0x3efeffff window]
Apr 24 23:34:12.923182 kernel: pci_bus 0000:00: resource 5 [io 0x0000-0xffff window]
Apr 24 23:34:12.923246 kernel: pci_bus 0000:00: resource 6 [mem 0x8000000000-0xffffffffff window]
Apr 24 23:34:12.923331 kernel: pci_bus 0000:01: resource 0 [io 0x1000-0x1fff]
Apr 24 23:34:12.923398 kernel: pci_bus 0000:01: resource 1 [mem 0x10000000-0x101fffff]
Apr 24 23:34:12.923470 kernel: pci_bus 0000:01: resource 2 [mem 0x8000000000-0x80001fffff 64bit pref]
Apr 24 23:34:12.923571 kernel: pci_bus 0000:02: resource 0 [io 0x2000-0x2fff]
Apr 24 23:34:12.923662 kernel: pci_bus 0000:02: resource 1 [mem 0x10200000-0x103fffff]
Apr 24 23:34:12.923729 kernel: pci_bus 0000:02: resource 2 [mem 0x8000200000-0x80003fffff 64bit pref]
Apr 24 23:34:12.923899 kernel: pci_bus 0000:03: resource 0 [io 0x3000-0x3fff]
Apr 24 23:34:12.924021 kernel: pci_bus 0000:03: resource 1 [mem 0x10400000-0x105fffff]
Apr 24 23:34:12.924097 kernel: pci_bus 0000:03: resource 2 [mem 0x8000400000-0x80005fffff 64bit pref]
Apr 24 23:34:12.924207 kernel: pci_bus 0000:04: resource 0 [io 0x4000-0x4fff]
Apr 24 23:34:12.924275 kernel: pci_bus 0000:04: resource 1 [mem 0x10600000-0x107fffff]
Apr 24 23:34:12.924353 kernel: pci_bus 0000:04: resource 2 [mem 0x8000600000-0x80007fffff 64bit pref]
Apr 24 23:34:12.924423 kernel: pci_bus 0000:05: resource 0 [io 0x5000-0x5fff]
Apr 24 23:34:12.924484 kernel: pci_bus 0000:05: resource 1 [mem 0x10800000-0x109fffff]
Apr 24 23:34:12.924544 kernel: pci_bus 0000:05: resource 2 [mem 0x8000800000-0x80009fffff 64bit pref]
Apr 24 23:34:12.924616 kernel: pci_bus 0000:06: resource 0 [io 0x6000-0x6fff]
Apr 24 23:34:12.924676 kernel: pci_bus 0000:06: resource 1 [mem 0x10a00000-0x10bfffff]
Apr 24 23:34:12.924739 kernel: pci_bus 0000:06: resource 2 [mem 0x8000a00000-0x8000bfffff 64bit pref]
Apr 24 23:34:12.924807 kernel: pci_bus 0000:07: resource 0 [io 0x7000-0x7fff]
Apr 24 23:34:12.924872 kernel: pci_bus 0000:07: resource 1 [mem 0x10c00000-0x10dfffff]
Apr 24 23:34:12.924979 kernel: pci_bus 0000:07: resource 2 [mem 0x8000c00000-0x8000dfffff 64bit pref]
Apr 24 23:34:12.925059 kernel: pci_bus 0000:08: resource 0 [io 0x8000-0x8fff]
Apr 24 23:34:12.925137 kernel: pci_bus 0000:08: resource 1 [mem 0x10e00000-0x10ffffff]
Apr 24 23:34:12.925209 kernel: pci_bus 0000:08: resource 2 [mem 0x8000e00000-0x8000ffffff 64bit pref]
Apr 24 23:34:12.925294 kernel: pci_bus 0000:09: resource 0 [io 0x9000-0x9fff]
Apr 24 23:34:12.925383 kernel: pci_bus 0000:09: resource 1 [mem 0x11000000-0x111fffff]
Apr 24 23:34:12.925457 kernel: pci_bus 0000:09: resource 2 [mem 0x8001000000-0x80011fffff 64bit pref]
Apr 24 23:34:12.925468 kernel: ACPI: PCI: Interrupt link GSI0 configured for IRQ 35
Apr 24 23:34:12.925477 kernel: ACPI: PCI: Interrupt link GSI1 configured for IRQ 36
Apr 24 23:34:12.925485 kernel: ACPI: PCI: Interrupt link GSI2 configured for IRQ 37
Apr 24 23:34:12.925493 kernel: ACPI: PCI: Interrupt link GSI3 configured for IRQ 38
Apr 24 23:34:12.925501 kernel: iommu: Default domain type: Translated
Apr 24 23:34:12.925509 kernel: iommu: DMA domain TLB invalidation policy: strict mode
Apr 24 23:34:12.925517 kernel: efivars: Registered efivars operations
Apr 24 23:34:12.925527 kernel: vgaarb: loaded
Apr 24 23:34:12.925535 kernel: clocksource: Switched to clocksource arch_sys_counter
Apr 24 23:34:12.925542 kernel: VFS: Disk quotas dquot_6.6.0
Apr 24 23:34:12.925550 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Apr 24 23:34:12.925558 kernel: pnp: PnP ACPI init
Apr 24 23:34:12.925637 kernel: system 00:00: [mem 0x4010000000-0x401fffffff window] could not be reserved
Apr 24 23:34:12.925650 kernel: pnp: PnP ACPI: found 1 devices
Apr 24 23:34:12.925658 kernel: NET: Registered PF_INET protocol family
Apr 24 23:34:12.925666 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear)
Apr 24 23:34:12.925676 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear)
Apr 24 23:34:12.925684 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Apr 24 23:34:12.925693 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear)
Apr 24 23:34:12.925701
kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear) Apr 24 23:34:12.925708 kernel: TCP: Hash tables configured (established 32768 bind 32768) Apr 24 23:34:12.925716 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear) Apr 24 23:34:12.925725 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear) Apr 24 23:34:12.925732 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Apr 24 23:34:12.925811 kernel: pci 0000:02:00.0: enabling device (0000 -> 0002) Apr 24 23:34:12.925825 kernel: PCI: CLS 0 bytes, default 64 Apr 24 23:34:12.925833 kernel: kvm [1]: HYP mode not available Apr 24 23:34:12.925841 kernel: Initialise system trusted keyrings Apr 24 23:34:12.925849 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0 Apr 24 23:34:12.925857 kernel: Key type asymmetric registered Apr 24 23:34:12.925865 kernel: Asymmetric key parser 'x509' registered Apr 24 23:34:12.925873 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250) Apr 24 23:34:12.925881 kernel: io scheduler mq-deadline registered Apr 24 23:34:12.925888 kernel: io scheduler kyber registered Apr 24 23:34:12.925898 kernel: io scheduler bfq registered Apr 24 23:34:12.925916 kernel: ACPI: \_SB_.PCI0.GSI2: Enabled at IRQ 37 Apr 24 23:34:12.925998 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 50 Apr 24 23:34:12.926068 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 50 Apr 24 23:34:12.926208 kernel: pcieport 0000:00:02.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Apr 24 23:34:12.926283 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 51 Apr 24 23:34:12.926350 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 51 Apr 24 23:34:12.926420 kernel: pcieport 0000:00:02.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Apr 24 23:34:12.926490 kernel: pcieport 0000:00:02.2: 
PME: Signaling with IRQ 52 Apr 24 23:34:12.926555 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 52 Apr 24 23:34:12.926620 kernel: pcieport 0000:00:02.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Apr 24 23:34:12.926689 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 53 Apr 24 23:34:12.926757 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 53 Apr 24 23:34:12.926822 kernel: pcieport 0000:00:02.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Apr 24 23:34:12.926893 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 54 Apr 24 23:34:12.926977 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 54 Apr 24 23:34:12.927047 kernel: pcieport 0000:00:02.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Apr 24 23:34:12.928251 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 55 Apr 24 23:34:12.928406 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 55 Apr 24 23:34:12.928475 kernel: pcieport 0000:00:02.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Apr 24 23:34:12.928544 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 56 Apr 24 23:34:12.928610 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 56 Apr 24 23:34:12.928675 kernel: pcieport 0000:00:02.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Apr 24 23:34:12.928745 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 57 Apr 24 23:34:12.928816 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 57 Apr 24 23:34:12.928886 kernel: pcieport 0000:00:02.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Apr 24 23:34:12.928898 kernel: ACPI: \_SB_.PCI0.GSI3: Enabled at IRQ 38 
Apr 24 23:34:12.928992 kernel: pcieport 0000:00:03.0: PME: Signaling with IRQ 58
Apr 24 23:34:12.929067 kernel: pcieport 0000:00:03.0: AER: enabled with IRQ 58
Apr 24 23:34:12.929153 kernel: pcieport 0000:00:03.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Apr 24 23:34:12.929166 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0C:00/input/input0
Apr 24 23:34:12.929177 kernel: ACPI: button: Power Button [PWRB]
Apr 24 23:34:12.929186 kernel: ACPI: \_SB_.PCI0.GSI1: Enabled at IRQ 36
Apr 24 23:34:12.929259 kernel: virtio-pci 0000:04:00.0: enabling device (0000 -> 0002)
Apr 24 23:34:12.929333 kernel: virtio-pci 0000:07:00.0: enabling device (0000 -> 0002)
Apr 24 23:34:12.929344 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Apr 24 23:34:12.929353 kernel: ACPI: \_SB_.PCI0.GSI0: Enabled at IRQ 35
Apr 24 23:34:12.929425 kernel: serial 0000:00:04.0: enabling device (0000 -> 0001)
Apr 24 23:34:12.929436 kernel: 0000:00:04.0: ttyS0 at I/O 0xa000 (irq = 45, base_baud = 115200) is a 16550A
Apr 24 23:34:12.929444 kernel: thunder_xcv, ver 1.0
Apr 24 23:34:12.929455 kernel: thunder_bgx, ver 1.0
Apr 24 23:34:12.929462 kernel: nicpf, ver 1.0
Apr 24 23:34:12.929470 kernel: nicvf, ver 1.0
Apr 24 23:34:12.929555 kernel: rtc-efi rtc-efi.0: registered as rtc0
Apr 24 23:34:12.929621 kernel: rtc-efi rtc-efi.0: setting system clock to 2026-04-24T23:34:12 UTC (1777073652)
Apr 24 23:34:12.929632 kernel: hid: raw HID events driver (C) Jiri Kosina
Apr 24 23:34:12.929640 kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 7 counters available
Apr 24 23:34:12.929648 kernel: watchdog: Delayed init of the lockup detector failed: -19
Apr 24 23:34:12.929658 kernel: watchdog: Hard watchdog permanently disabled
Apr 24 23:34:12.929666 kernel: NET: Registered PF_INET6 protocol family
Apr 24 23:34:12.929674 kernel: Segment Routing with IPv6
Apr 24 23:34:12.929682 kernel: In-situ OAM (IOAM) with IPv6
Apr 24 23:34:12.929690 kernel: NET: Registered PF_PACKET protocol family
Apr 24 23:34:12.929697 kernel: Key type dns_resolver registered
Apr 24 23:34:12.929705 kernel: registered taskstats version 1
Apr 24 23:34:12.929713 kernel: Loading compiled-in X.509 certificates
Apr 24 23:34:12.929721 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.6.127-flatcar: 96a6e7da7ac9a3ef656057ccd8e13f251b310c24'
Apr 24 23:34:12.929730 kernel: Key type .fscrypt registered
Apr 24 23:34:12.929738 kernel: Key type fscrypt-provisioning registered
Apr 24 23:34:12.929746 kernel: ima: No TPM chip found, activating TPM-bypass!
Apr 24 23:34:12.929754 kernel: ima: Allocated hash algorithm: sha1
Apr 24 23:34:12.929761 kernel: ima: No architecture policies found
Apr 24 23:34:12.929769 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng)
Apr 24 23:34:12.929777 kernel: clk: Disabling unused clocks
Apr 24 23:34:12.929785 kernel: Freeing unused kernel memory: 39424K
Apr 24 23:34:12.929793 kernel: Run /init as init process
Apr 24 23:34:12.929802 kernel: with arguments:
Apr 24 23:34:12.929810 kernel: /init
Apr 24 23:34:12.929818 kernel: with environment:
Apr 24 23:34:12.929825 kernel: HOME=/
Apr 24 23:34:12.929833 kernel: TERM=linux
Apr 24 23:34:12.929843 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified)
Apr 24 23:34:12.929853 systemd[1]: Detected virtualization kvm.
Apr 24 23:34:12.929862 systemd[1]: Detected architecture arm64.
Apr 24 23:34:12.929871 systemd[1]: Running in initrd.
Apr 24 23:34:12.929879 systemd[1]: No hostname configured, using default hostname.
Apr 24 23:34:12.929887 systemd[1]: Hostname set to .
Apr 24 23:34:12.929896 systemd[1]: Initializing machine ID from VM UUID.
Apr 24 23:34:12.929914 systemd[1]: Queued start job for default target initrd.target.
Apr 24 23:34:12.929926 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Apr 24 23:34:12.929934 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Apr 24 23:34:12.929943 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
Apr 24 23:34:12.929954 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Apr 24 23:34:12.929963 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
Apr 24 23:34:12.929973 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
Apr 24 23:34:12.929983 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132...
Apr 24 23:34:12.929992 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr...
Apr 24 23:34:12.930000 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Apr 24 23:34:12.930008 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Apr 24 23:34:12.930018 systemd[1]: Reached target paths.target - Path Units.
Apr 24 23:34:12.930027 systemd[1]: Reached target slices.target - Slice Units.
Apr 24 23:34:12.930035 systemd[1]: Reached target swap.target - Swaps.
Apr 24 23:34:12.930043 systemd[1]: Reached target timers.target - Timer Units.
Apr 24 23:34:12.930051 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
Apr 24 23:34:12.930060 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Apr 24 23:34:12.930068 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
Apr 24 23:34:12.930077 systemd[1]: Listening on systemd-journald.socket - Journal Socket.
Apr 24 23:34:12.930086 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Apr 24 23:34:12.930095 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Apr 24 23:34:12.930103 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Apr 24 23:34:12.930111 systemd[1]: Reached target sockets.target - Socket Units.
Apr 24 23:34:12.932338 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
Apr 24 23:34:12.932350 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Apr 24 23:34:12.932359 systemd[1]: Finished network-cleanup.service - Network Cleanup.
Apr 24 23:34:12.932367 systemd[1]: Starting systemd-fsck-usr.service...
Apr 24 23:34:12.932376 systemd[1]: Starting systemd-journald.service - Journal Service...
Apr 24 23:34:12.932392 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Apr 24 23:34:12.932401 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Apr 24 23:34:12.932409 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup.
Apr 24 23:34:12.932417 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Apr 24 23:34:12.932469 systemd-journald[237]: Collecting audit messages is disabled.
Apr 24 23:34:12.932493 systemd[1]: Finished systemd-fsck-usr.service.
Apr 24 23:34:12.932502 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Apr 24 23:34:12.932511 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Apr 24 23:34:12.932522 systemd-journald[237]: Journal started
Apr 24 23:34:12.932542 systemd-journald[237]: Runtime Journal (/run/log/journal/51e226babde9462fa7f47557cc201d87) is 8.0M, max 76.6M, 68.6M free.
Apr 24 23:34:12.924099 systemd-modules-load[238]: Inserted module 'overlay'
Apr 24 23:34:12.936565 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Apr 24 23:34:12.938222 systemd[1]: Started systemd-journald.service - Journal Service.
Apr 24 23:34:12.938765 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Apr 24 23:34:12.945157 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Apr 24 23:34:12.947416 systemd-modules-load[238]: Inserted module 'br_netfilter'
Apr 24 23:34:12.949593 kernel: Bridge firewalling registered
Apr 24 23:34:12.948452 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Apr 24 23:34:12.951014 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Apr 24 23:34:12.953246 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Apr 24 23:34:12.964650 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Apr 24 23:34:12.968721 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Apr 24 23:34:12.977424 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Apr 24 23:34:12.982340 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Apr 24 23:34:12.988406 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook...
Apr 24 23:34:12.989690 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Apr 24 23:34:13.002467 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Apr 24 23:34:13.021195 dracut-cmdline[273]: dracut-dracut-053
Apr 24 23:34:13.023166 dracut-cmdline[273]: Using kernel command line parameters: rd.driver.pre=btrfs BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=hetzner verity.usrhash=63304dd98a277d4592d17e0085ae3f91ca70cc8ec6dedfdd357a1e9755f9a8b3
Apr 24 23:34:13.042768 systemd-resolved[276]: Positive Trust Anchors:
Apr 24 23:34:13.044341 systemd-resolved[276]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Apr 24 23:34:13.045145 systemd-resolved[276]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Apr 24 23:34:13.054474 systemd-resolved[276]: Defaulting to hostname 'linux'.
Apr 24 23:34:13.056431 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Apr 24 23:34:13.057173 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Apr 24 23:34:13.109217 kernel: SCSI subsystem initialized
Apr 24 23:34:13.113155 kernel: Loading iSCSI transport class v2.0-870.
Apr 24 23:34:13.121654 kernel: iscsi: registered transport (tcp)
Apr 24 23:34:13.135229 kernel: iscsi: registered transport (qla4xxx)
Apr 24 23:34:13.135351 kernel: QLogic iSCSI HBA Driver
Apr 24 23:34:13.191387 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
Apr 24 23:34:13.197368 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
Apr 24 23:34:13.219224 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Apr 24 23:34:13.219295 kernel: device-mapper: uevent: version 1.0.3
Apr 24 23:34:13.220134 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@redhat.com
Apr 24 23:34:13.270179 kernel: raid6: neonx8 gen() 15656 MB/s
Apr 24 23:34:13.290624 kernel: raid6: neonx4 gen() 15578 MB/s
Apr 24 23:34:13.305045 kernel: raid6: neonx2 gen() 13081 MB/s
Apr 24 23:34:13.321183 kernel: raid6: neonx1 gen() 10397 MB/s
Apr 24 23:34:13.338186 kernel: raid6: int64x8 gen() 6908 MB/s
Apr 24 23:34:13.355208 kernel: raid6: int64x4 gen() 7311 MB/s
Apr 24 23:34:13.372232 kernel: raid6: int64x2 gen() 6102 MB/s
Apr 24 23:34:13.390807 kernel: raid6: int64x1 gen() 5009 MB/s
Apr 24 23:34:13.390925 kernel: raid6: using algorithm neonx8 gen() 15656 MB/s
Apr 24 23:34:13.406200 kernel: raid6: .... xor() 11804 MB/s, rmw enabled
Apr 24 23:34:13.406291 kernel: raid6: using neon recovery algorithm
Apr 24 23:34:13.411160 kernel: xor: measuring software checksum speed
Apr 24 23:34:13.411237 kernel: 8regs : 17643 MB/sec
Apr 24 23:34:13.412338 kernel: 32regs : 19622 MB/sec
Apr 24 23:34:13.412362 kernel: arm64_neon : 23085 MB/sec
Apr 24 23:34:13.412390 kernel: xor: using function: arm64_neon (23085 MB/sec)
Apr 24 23:34:13.464184 kernel: Btrfs loaded, zoned=no, fsverity=no
Apr 24 23:34:13.483215 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
Apr 24 23:34:13.489330 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Apr 24 23:34:13.513167 systemd-udevd[458]: Using default interface naming scheme 'v255'.
Apr 24 23:34:13.516790 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Apr 24 23:34:13.524584 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
Apr 24 23:34:13.545227 dracut-pre-trigger[465]: rd.md=0: removing MD RAID activation
Apr 24 23:34:13.588958 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
Apr 24 23:34:13.596396 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Apr 24 23:34:13.659403 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Apr 24 23:34:13.667348 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook...
Apr 24 23:34:13.689362 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook.
Apr 24 23:34:13.693952 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems.
Apr 24 23:34:13.695280 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Apr 24 23:34:13.696517 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Apr 24 23:34:13.705477 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook...
Apr 24 23:34:13.725433 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook.
Apr 24 23:34:13.791480 kernel: scsi host0: Virtio SCSI HBA
Apr 24 23:34:13.801677 kernel: scsi 0:0:0:0: CD-ROM QEMU QEMU CD-ROM 2.5+ PQ: 0 ANSI: 5
Apr 24 23:34:13.801761 kernel: ACPI: bus type USB registered
Apr 24 23:34:13.802578 kernel: scsi 0:0:0:1: Direct-Access QEMU QEMU HARDDISK 2.5+ PQ: 0 ANSI: 5
Apr 24 23:34:13.802634 kernel: usbcore: registered new interface driver usbfs
Apr 24 23:34:13.805308 kernel: usbcore: registered new interface driver hub
Apr 24 23:34:13.806226 kernel: usbcore: registered new device driver usb
Apr 24 23:34:13.815220 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Apr 24 23:34:13.815356 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Apr 24 23:34:13.817153 systemd[1]: Stopping dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Apr 24 23:34:13.817738 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Apr 24 23:34:13.817876 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Apr 24 23:34:13.821153 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup...
Apr 24 23:34:13.830165 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Apr 24 23:34:13.848147 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller
Apr 24 23:34:13.848392 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 1
Apr 24 23:34:13.853154 kernel: xhci_hcd 0000:02:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010
Apr 24 23:34:13.854467 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Apr 24 23:34:13.858788 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller
Apr 24 23:34:13.859188 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 2
Apr 24 23:34:13.859285 kernel: xhci_hcd 0000:02:00.0: Host supports USB 3.0 SuperSpeed
Apr 24 23:34:13.859367 kernel: sr 0:0:0:0: Power-on or device reset occurred
Apr 24 23:34:13.862852 kernel: hub 1-0:1.0: USB hub found
Apr 24 23:34:13.863094 kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 16x/50x cd/rw xa/form2 cdda tray
Apr 24 23:34:13.863246 kernel: hub 1-0:1.0: 4 ports detected
Apr 24 23:34:13.863334 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Apr 24 23:34:13.865442 kernel: sr 0:0:0:0: Attached scsi CD-ROM sr0
Apr 24 23:34:13.865623 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM.
Apr 24 23:34:13.865377 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Apr 24 23:34:13.870189 kernel: hub 2-0:1.0: USB hub found
Apr 24 23:34:13.870414 kernel: hub 2-0:1.0: 4 ports detected
Apr 24 23:34:13.884166 kernel: sd 0:0:0:1: Power-on or device reset occurred
Apr 24 23:34:13.885582 kernel: sd 0:0:0:1: [sda] 80003072 512-byte logical blocks: (41.0 GB/38.1 GiB)
Apr 24 23:34:13.885687 kernel: sd 0:0:0:1: [sda] Write Protect is off
Apr 24 23:34:13.887964 kernel: sd 0:0:0:1: [sda] Mode Sense: 63 00 00 08
Apr 24 23:34:13.890143 kernel: sd 0:0:0:1: [sda] Write cache: enabled, read cache: enabled, doesn't support DPO or FUA
Apr 24 23:34:13.895162 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk.
Apr 24 23:34:13.895229 kernel: GPT:17805311 != 80003071
Apr 24 23:34:13.895240 kernel: GPT:Alternate GPT header not at the end of the disk.
Apr 24 23:34:13.895259 kernel: GPT:17805311 != 80003071
Apr 24 23:34:13.895567 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Apr 24 23:34:13.898501 kernel: GPT: Use GNU Parted to correct GPT errors.
Apr 24 23:34:13.898523 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Apr 24 23:34:13.898533 kernel: sd 0:0:0:1: [sda] Attached SCSI disk
Apr 24 23:34:13.933156 kernel: BTRFS: device label OEM devid 1 transid 9 /dev/sda6 scanned by (udev-worker) (509)
Apr 24 23:34:13.937170 kernel: BTRFS: device fsid 5f4cf890-f9e2-4e04-aa84-1bcfb6e5643e devid 1 transid 36 /dev/sda3 scanned by (udev-worker) (526)
Apr 24 23:34:13.943196 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - QEMU_HARDDISK EFI-SYSTEM.
Apr 24 23:34:13.961442 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM.
Apr 24 23:34:13.966727 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - QEMU_HARDDISK ROOT.
Apr 24 23:34:13.971553 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - QEMU_HARDDISK USR-A.
Apr 24 23:34:13.972258 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - QEMU_HARDDISK USR-A.
Apr 24 23:34:13.986396 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary...
Apr 24 23:34:13.995599 disk-uuid[574]: Primary Header is updated.
Apr 24 23:34:13.995599 disk-uuid[574]: Secondary Entries is updated.
Apr 24 23:34:13.995599 disk-uuid[574]: Secondary Header is updated.
Apr 24 23:34:14.004282 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Apr 24 23:34:14.011149 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Apr 24 23:34:14.017219 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Apr 24 23:34:14.105439 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd
Apr 24 23:34:14.241143 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:02.1/0000:02:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input1
Apr 24 23:34:14.242415 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:02:00.0-1/input0
Apr 24 23:34:14.242613 kernel: usbcore: registered new interface driver usbhid
Apr 24 23:34:14.243128 kernel: usbhid: USB HID core driver
Apr 24 23:34:14.351172 kernel: usb 1-2: new high-speed USB device number 3 using xhci_hcd
Apr 24 23:34:14.481158 kernel: input: QEMU QEMU USB Keyboard as /devices/pci0000:00/0000:00:02.1/0000:02:00.0/usb1/1-2/1-2:1.0/0003:0627:0001.0002/input/input2
Apr 24 23:34:14.534253 kernel: hid-generic 0003:0627:0001.0002: input,hidraw1: USB HID v1.11 Keyboard [QEMU QEMU USB Keyboard] on usb-0000:02:00.0-2/input0
Apr 24 23:34:15.022954 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Apr 24 23:34:15.023014 disk-uuid[575]: The operation has completed successfully.
Apr 24 23:34:15.084017 systemd[1]: disk-uuid.service: Deactivated successfully.
Apr 24 23:34:15.084187 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary.
Apr 24 23:34:15.102475 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr...
Apr 24 23:34:15.106774 sh[593]: Success
Apr 24 23:34:15.122142 kernel: device-mapper: verity: sha256 using implementation "sha256-ce"
Apr 24 23:34:15.183635 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr.
Apr 24 23:34:15.187367 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr...
Apr 24 23:34:15.190257 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr.
Apr 24 23:34:15.211679 kernel: BTRFS info (device dm-0): first mount of filesystem 5f4cf890-f9e2-4e04-aa84-1bcfb6e5643e
Apr 24 23:34:15.211755 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm
Apr 24 23:34:15.211772 kernel: BTRFS warning (device dm-0): 'nologreplay' is deprecated, use 'rescue=nologreplay' instead
Apr 24 23:34:15.213272 kernel: BTRFS info (device dm-0): disabling log replay at mount time
Apr 24 23:34:15.213303 kernel: BTRFS info (device dm-0): using free space tree
Apr 24 23:34:15.221265 kernel: BTRFS info (device dm-0): enabling ssd optimizations
Apr 24 23:34:15.223823 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr.
Apr 24 23:34:15.226035 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met.
Apr 24 23:34:15.235425 systemd[1]: Starting ignition-setup.service - Ignition (setup)...
Apr 24 23:34:15.239367 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline...
Apr 24 23:34:15.254472 kernel: BTRFS info (device sda6): first mount of filesystem 7d1fb622-285b-4375-96d6-a0d989283452
Apr 24 23:34:15.254536 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm
Apr 24 23:34:15.254548 kernel: BTRFS info (device sda6): using free space tree
Apr 24 23:34:15.262167 kernel: BTRFS info (device sda6): enabling ssd optimizations
Apr 24 23:34:15.262256 kernel: BTRFS info (device sda6): auto enabling async discard
Apr 24 23:34:15.273429 systemd[1]: mnt-oem.mount: Deactivated successfully.
Apr 24 23:34:15.275348 kernel: BTRFS info (device sda6): last unmount of filesystem 7d1fb622-285b-4375-96d6-a0d989283452
Apr 24 23:34:15.282985 systemd[1]: Finished ignition-setup.service - Ignition (setup).
Apr 24 23:34:15.290261 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)...
Apr 24 23:34:15.370235 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Apr 24 23:34:15.382391 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Apr 24 23:34:15.398330 ignition[683]: Ignition 2.19.0
Apr 24 23:34:15.398345 ignition[683]: Stage: fetch-offline
Apr 24 23:34:15.398381 ignition[683]: no configs at "/usr/lib/ignition/base.d"
Apr 24 23:34:15.398389 ignition[683]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Apr 24 23:34:15.398541 ignition[683]: parsed url from cmdline: ""
Apr 24 23:34:15.401464 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline).
Apr 24 23:34:15.398544 ignition[683]: no config URL provided
Apr 24 23:34:15.398547 ignition[683]: reading system config file "/usr/lib/ignition/user.ign"
Apr 24 23:34:15.398554 ignition[683]: no config at "/usr/lib/ignition/user.ign"
Apr 24 23:34:15.398558 ignition[683]: failed to fetch config: resource requires networking
Apr 24 23:34:15.398744 ignition[683]: Ignition finished successfully
Apr 24 23:34:15.406972 systemd-networkd[779]: lo: Link UP
Apr 24 23:34:15.406975 systemd-networkd[779]: lo: Gained carrier
Apr 24 23:34:15.408536 systemd-networkd[779]: Enumeration completed
Apr 24 23:34:15.409009 systemd-networkd[779]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Apr 24 23:34:15.409012 systemd-networkd[779]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Apr 24 23:34:15.409365 systemd[1]: Started systemd-networkd.service - Network Configuration.
Apr 24 23:34:15.410051 systemd-networkd[779]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Apr 24 23:34:15.410054 systemd-networkd[779]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network.
Apr 24 23:34:15.410227 systemd[1]: Reached target network.target - Network.
Apr 24 23:34:15.410605 systemd-networkd[779]: eth0: Link UP
Apr 24 23:34:15.410609 systemd-networkd[779]: eth0: Gained carrier
Apr 24 23:34:15.410616 systemd-networkd[779]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Apr 24 23:34:15.415435 systemd-networkd[779]: eth1: Link UP
Apr 24 23:34:15.415439 systemd-networkd[779]: eth1: Gained carrier
Apr 24 23:34:15.415449 systemd-networkd[779]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Apr 24 23:34:15.418340 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)...
Apr 24 23:34:15.434465 ignition[784]: Ignition 2.19.0
Apr 24 23:34:15.434476 ignition[784]: Stage: fetch
Apr 24 23:34:15.434676 ignition[784]: no configs at "/usr/lib/ignition/base.d"
Apr 24 23:34:15.434686 ignition[784]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Apr 24 23:34:15.434784 ignition[784]: parsed url from cmdline: ""
Apr 24 23:34:15.434787 ignition[784]: no config URL provided
Apr 24 23:34:15.434792 ignition[784]: reading system config file "/usr/lib/ignition/user.ign"
Apr 24 23:34:15.434799 ignition[784]: no config at "/usr/lib/ignition/user.ign"
Apr 24 23:34:15.434820 ignition[784]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #1
Apr 24 23:34:15.435334 ignition[784]: GET error: Get "http://169.254.169.254/hetzner/v1/userdata": dial tcp 169.254.169.254:80: connect: network is unreachable
Apr 24 23:34:15.458236 systemd-networkd[779]: eth1: DHCPv4 address 10.0.0.3/32 acquired from 10.0.0.1
Apr 24 23:34:15.468266 systemd-networkd[779]: eth0: DHCPv4 address 178.105.28.58/32, gateway 172.31.1.1 acquired from 172.31.1.1
Apr 24 23:34:15.635435 ignition[784]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #2
Apr 24 23:34:15.642354 ignition[784]: GET result: OK
Apr 24 23:34:15.642528 ignition[784]: parsing config with SHA512: 5e9625b4aaeb9ea3e14a35351aefe4a3655a0f7bdafef2c315f988650464d83940e60fda837a816264c48d4abcc9fccb91ae9daabb087c34a7b3077532e92f4e
Apr 24 23:34:15.650310 unknown[784]: fetched base config from "system"
Apr 24 23:34:15.650934 ignition[784]: fetch: fetch complete
Apr 24 23:34:15.650333 unknown[784]: fetched base config from "system"
Apr 24 23:34:15.650943 ignition[784]: fetch: fetch passed
Apr 24 23:34:15.650339 unknown[784]: fetched user config from "hetzner"
Apr 24 23:34:15.651040 ignition[784]: Ignition finished successfully
Apr 24 23:34:15.653155 systemd[1]: Finished ignition-fetch.service - Ignition (fetch).
Apr 24 23:34:15.665405 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)...
Apr 24 23:34:15.678558 ignition[792]: Ignition 2.19.0
Apr 24 23:34:15.678570 ignition[792]: Stage: kargs
Apr 24 23:34:15.678762 ignition[792]: no configs at "/usr/lib/ignition/base.d"
Apr 24 23:34:15.678772 ignition[792]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Apr 24 23:34:15.679765 ignition[792]: kargs: kargs passed
Apr 24 23:34:15.679817 ignition[792]: Ignition finished successfully
Apr 24 23:34:15.683861 systemd[1]: Finished ignition-kargs.service - Ignition (kargs).
Apr 24 23:34:15.696963 systemd[1]: Starting ignition-disks.service - Ignition (disks)...
Apr 24 23:34:15.711027 ignition[798]: Ignition 2.19.0
Apr 24 23:34:15.711039 ignition[798]: Stage: disks
Apr 24 23:34:15.711301 ignition[798]: no configs at "/usr/lib/ignition/base.d"
Apr 24 23:34:15.711312 ignition[798]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Apr 24 23:34:15.712269 ignition[798]: disks: disks passed
Apr 24 23:34:15.712326 ignition[798]: Ignition finished successfully
Apr 24 23:34:15.715091 systemd[1]: Finished ignition-disks.service - Ignition (disks).
Apr 24 23:34:15.716494 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device.
Apr 24 23:34:15.718223 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
Apr 24 23:34:15.719751 systemd[1]: Reached target local-fs.target - Local File Systems.
Apr 24 23:34:15.721059 systemd[1]: Reached target sysinit.target - System Initialization.
Apr 24 23:34:15.722419 systemd[1]: Reached target basic.target - Basic System.
Apr 24 23:34:15.730421 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
Apr 24 23:34:15.753769 systemd-fsck[806]: ROOT: clean, 14/1628000 files, 120691/1617920 blocks
Apr 24 23:34:15.758017 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT.
Apr 24 23:34:15.768335 systemd[1]: Mounting sysroot.mount - /sysroot...
Apr 24 23:34:15.822181 kernel: EXT4-fs (sda9): mounted filesystem edaa698b-3baa-4242-8691-64cb9f35f18f r/w with ordered data mode. Quota mode: none.
Apr 24 23:34:15.822110 systemd[1]: Mounted sysroot.mount - /sysroot.
Apr 24 23:34:15.823161 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System.
Apr 24 23:34:15.836323 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Apr 24 23:34:15.840296 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr...
Apr 24 23:34:15.844455 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent...
Apr 24 23:34:15.845285 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot).
Apr 24 23:34:15.845317 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup.
Apr 24 23:34:15.853434 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr.
Apr 24 23:34:15.855347 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
Apr 24 23:34:15.858151 kernel: BTRFS: device label OEM devid 1 transid 10 /dev/sda6 scanned by mount (814)
Apr 24 23:34:15.861147 kernel: BTRFS info (device sda6): first mount of filesystem 7d1fb622-285b-4375-96d6-a0d989283452
Apr 24 23:34:15.861195 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm
Apr 24 23:34:15.861207 kernel: BTRFS info (device sda6): using free space tree
Apr 24 23:34:15.867135 kernel: BTRFS info (device sda6): enabling ssd optimizations
Apr 24 23:34:15.867201 kernel: BTRFS info (device sda6): auto enabling async discard
Apr 24 23:34:15.872722 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Apr 24 23:34:15.923142 initrd-setup-root[842]: cut: /sysroot/etc/passwd: No such file or directory
Apr 24 23:34:15.927833 coreos-metadata[816]: Apr 24 23:34:15.927 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/hostname: Attempt #1
Apr 24 23:34:15.929923 initrd-setup-root[849]: cut: /sysroot/etc/group: No such file or directory
Apr 24 23:34:15.930888 coreos-metadata[816]: Apr 24 23:34:15.930 INFO Fetch successful
Apr 24 23:34:15.933169 coreos-metadata[816]: Apr 24 23:34:15.933 INFO wrote hostname ci-4081-3-6-n-0494d1f24d to /sysroot/etc/hostname
Apr 24 23:34:15.936409 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
Apr 24 23:34:15.938547 initrd-setup-root[857]: cut: /sysroot/etc/shadow: No such file or directory
Apr 24 23:34:15.944184 initrd-setup-root[864]: cut: /sysroot/etc/gshadow: No such file or directory
Apr 24 23:34:16.043936 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
Apr 24 23:34:16.049915 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
Apr 24 23:34:16.064317 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
Apr 24 23:34:16.072168 kernel: BTRFS info (device sda6): last unmount of filesystem 7d1fb622-285b-4375-96d6-a0d989283452
Apr 24 23:34:16.092602 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
Apr 24 23:34:16.100429 ignition[932]: INFO : Ignition 2.19.0
Apr 24 23:34:16.100429 ignition[932]: INFO : Stage: mount
Apr 24 23:34:16.101672 ignition[932]: INFO : no configs at "/usr/lib/ignition/base.d"
Apr 24 23:34:16.101672 ignition[932]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Apr 24 23:34:16.101672 ignition[932]: INFO : mount: mount passed
Apr 24 23:34:16.104382 ignition[932]: INFO : Ignition finished successfully
Apr 24 23:34:16.106202 systemd[1]: Finished ignition-mount.service - Ignition (mount).
Apr 24 23:34:16.114626 systemd[1]: Starting ignition-files.service - Ignition (files)...
Apr 24 23:34:16.211791 systemd[1]: sysroot-oem.mount: Deactivated successfully.
Apr 24 23:34:16.220407 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Apr 24 23:34:16.232544 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 scanned by mount (944)
Apr 24 23:34:16.232627 kernel: BTRFS info (device sda6): first mount of filesystem 7d1fb622-285b-4375-96d6-a0d989283452
Apr 24 23:34:16.234175 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm
Apr 24 23:34:16.234213 kernel: BTRFS info (device sda6): using free space tree
Apr 24 23:34:16.238167 kernel: BTRFS info (device sda6): enabling ssd optimizations
Apr 24 23:34:16.238238 kernel: BTRFS info (device sda6): auto enabling async discard
Apr 24 23:34:16.240698 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Apr 24 23:34:16.271414 ignition[961]: INFO : Ignition 2.19.0
Apr 24 23:34:16.272220 ignition[961]: INFO : Stage: files
Apr 24 23:34:16.272848 ignition[961]: INFO : no configs at "/usr/lib/ignition/base.d"
Apr 24 23:34:16.273557 ignition[961]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Apr 24 23:34:16.275481 ignition[961]: DEBUG : files: compiled without relabeling support, skipping
Apr 24 23:34:16.277500 ignition[961]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Apr 24 23:34:16.278290 ignition[961]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Apr 24 23:34:16.282627 ignition[961]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Apr 24 23:34:16.284444 ignition[961]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Apr 24 23:34:16.286073 unknown[961]: wrote ssh authorized keys file for user: core
Apr 24 23:34:16.288878 ignition[961]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Apr 24 23:34:16.291778 ignition[961]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.3-linux-arm64.tar.gz"
Apr 24 23:34:16.291778 ignition[961]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.3-linux-arm64.tar.gz: attempt #1
Apr 24 23:34:16.394305 ignition[961]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Apr 24 23:34:16.482679 ignition[961]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.3-linux-arm64.tar.gz"
Apr 24 23:34:16.482679 ignition[961]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
Apr 24 23:34:16.485452 ignition[961]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Apr 24 23:34:16.485452 ignition[961]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
Apr 24 23:34:16.485452 ignition[961]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Apr 24 23:34:16.485452 ignition[961]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Apr 24 23:34:16.485452 ignition[961]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Apr 24 23:34:16.485452 ignition[961]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Apr 24 23:34:16.485452 ignition[961]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Apr 24 23:34:16.485452 ignition[961]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
Apr 24 23:34:16.485452 ignition[961]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Apr 24 23:34:16.485452 ignition[961]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.35.1-arm64.raw"
Apr 24 23:34:16.485452 ignition[961]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.35.1-arm64.raw"
Apr 24 23:34:16.485452 ignition[961]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.35.1-arm64.raw"
Apr 24 23:34:16.485452 ignition[961]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.35.1-arm64.raw: attempt #1
Apr 24 23:34:16.666329 systemd-networkd[779]: eth0: Gained IPv6LL
Apr 24 23:34:16.790271 ignition[961]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Apr 24 23:34:17.178382 systemd-networkd[779]: eth1: Gained IPv6LL
Apr 24 23:34:17.956596 ignition[961]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.35.1-arm64.raw"
Apr 24 23:34:17.956596 ignition[961]: INFO : files: op(b): [started] processing unit "prepare-helm.service"
Apr 24 23:34:17.959836 ignition[961]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Apr 24 23:34:17.959836 ignition[961]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Apr 24 23:34:17.959836 ignition[961]: INFO : files: op(b): [finished] processing unit "prepare-helm.service"
Apr 24 23:34:17.959836 ignition[961]: INFO : files: op(d): [started] processing unit "coreos-metadata.service"
Apr 24 23:34:17.959836 ignition[961]: INFO : files: op(d): op(e): [started] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf"
Apr 24 23:34:17.959836 ignition[961]: INFO : files: op(d): op(e): [finished] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf"
Apr 24 23:34:17.959836 ignition[961]: INFO : files: op(d): [finished] processing unit "coreos-metadata.service"
Apr 24 23:34:17.959836 ignition[961]: INFO : files: op(f): [started] setting preset to enabled for "prepare-helm.service"
Apr 24 23:34:17.959836 ignition[961]: INFO : files: op(f): [finished] setting preset to enabled for "prepare-helm.service"
Apr 24 23:34:17.959836 ignition[961]: INFO : files: createResultFile: createFiles: op(10): [started] writing file "/sysroot/etc/.ignition-result.json"
Apr 24 23:34:17.959836 ignition[961]: INFO : files: createResultFile: createFiles: op(10): [finished] writing file "/sysroot/etc/.ignition-result.json"
Apr 24 23:34:17.959836 ignition[961]: INFO : files: files passed
Apr 24 23:34:17.959836 ignition[961]: INFO : Ignition finished successfully
Apr 24 23:34:17.960695 systemd[1]: Finished ignition-files.service - Ignition (files).
Apr 24 23:34:17.968306 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Apr 24 23:34:17.969789 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Apr 24 23:34:17.978724 systemd[1]: ignition-quench.service: Deactivated successfully.
Apr 24 23:34:17.978904 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Apr 24 23:34:17.991205 initrd-setup-root-after-ignition[989]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Apr 24 23:34:17.991205 initrd-setup-root-after-ignition[989]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Apr 24 23:34:17.994169 initrd-setup-root-after-ignition[993]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Apr 24 23:34:17.996653 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Apr 24 23:34:17.998471 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Apr 24 23:34:18.004370 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Apr 24 23:34:18.030286 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Apr 24 23:34:18.031544 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Apr 24 23:34:18.032908 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Apr 24 23:34:18.035320 systemd[1]: Reached target initrd.target - Initrd Default Target.
Apr 24 23:34:18.036234 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Apr 24 23:34:18.043459 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Apr 24 23:34:18.061730 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Apr 24 23:34:18.072440 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Apr 24 23:34:18.082719 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Apr 24 23:34:18.084270 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Apr 24 23:34:18.084986 systemd[1]: Stopped target timers.target - Timer Units.
Apr 24 23:34:18.087077 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Apr 24 23:34:18.087309 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Apr 24 23:34:18.089357 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Apr 24 23:34:18.090928 systemd[1]: Stopped target basic.target - Basic System.
Apr 24 23:34:18.092233 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Apr 24 23:34:18.093429 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Apr 24 23:34:18.094566 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Apr 24 23:34:18.095874 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Apr 24 23:34:18.096913 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Apr 24 23:34:18.098084 systemd[1]: Stopped target sysinit.target - System Initialization.
Apr 24 23:34:18.099345 systemd[1]: Stopped target local-fs.target - Local File Systems.
Apr 24 23:34:18.100402 systemd[1]: Stopped target swap.target - Swaps.
Apr 24 23:34:18.101298 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Apr 24 23:34:18.101428 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Apr 24 23:34:18.102708 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Apr 24 23:34:18.103415 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Apr 24 23:34:18.104605 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Apr 24 23:34:18.106267 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Apr 24 23:34:18.106990 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Apr 24 23:34:18.107142 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Apr 24 23:34:18.108820 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Apr 24 23:34:18.108956 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Apr 24 23:34:18.110224 systemd[1]: ignition-files.service: Deactivated successfully.
Apr 24 23:34:18.110326 systemd[1]: Stopped ignition-files.service - Ignition (files).
Apr 24 23:34:18.111430 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully.
Apr 24 23:34:18.111527 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
Apr 24 23:34:18.121436 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Apr 24 23:34:18.122042 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Apr 24 23:34:18.122210 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Apr 24 23:34:18.127388 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Apr 24 23:34:18.127960 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Apr 24 23:34:18.128100 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Apr 24 23:34:18.130952 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Apr 24 23:34:18.131401 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Apr 24 23:34:18.143525 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Apr 24 23:34:18.143743 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Apr 24 23:34:18.148202 ignition[1013]: INFO : Ignition 2.19.0
Apr 24 23:34:18.148202 ignition[1013]: INFO : Stage: umount
Apr 24 23:34:18.148202 ignition[1013]: INFO : no configs at "/usr/lib/ignition/base.d"
Apr 24 23:34:18.148202 ignition[1013]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Apr 24 23:34:18.153058 ignition[1013]: INFO : umount: umount passed
Apr 24 23:34:18.153058 ignition[1013]: INFO : Ignition finished successfully
Apr 24 23:34:18.153035 systemd[1]: ignition-mount.service: Deactivated successfully.
Apr 24 23:34:18.155164 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Apr 24 23:34:18.156920 systemd[1]: ignition-disks.service: Deactivated successfully.
Apr 24 23:34:18.156976 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Apr 24 23:34:18.157865 systemd[1]: ignition-kargs.service: Deactivated successfully.
Apr 24 23:34:18.157920 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Apr 24 23:34:18.159295 systemd[1]: ignition-fetch.service: Deactivated successfully.
Apr 24 23:34:18.159351 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch).
Apr 24 23:34:18.160333 systemd[1]: Stopped target network.target - Network.
Apr 24 23:34:18.160887 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Apr 24 23:34:18.160938 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Apr 24 23:34:18.164307 systemd[1]: Stopped target paths.target - Path Units.
Apr 24 23:34:18.165323 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Apr 24 23:34:18.169221 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Apr 24 23:34:18.172297 systemd[1]: Stopped target slices.target - Slice Units.
Apr 24 23:34:18.172895 systemd[1]: Stopped target sockets.target - Socket Units.
Apr 24 23:34:18.173594 systemd[1]: iscsid.socket: Deactivated successfully.
Apr 24 23:34:18.173641 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Apr 24 23:34:18.175327 systemd[1]: iscsiuio.socket: Deactivated successfully.
Apr 24 23:34:18.175399 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Apr 24 23:34:18.176799 systemd[1]: ignition-setup.service: Deactivated successfully.
Apr 24 23:34:18.176902 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Apr 24 23:34:18.178485 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Apr 24 23:34:18.178533 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Apr 24 23:34:18.179681 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Apr 24 23:34:18.181350 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Apr 24 23:34:18.183748 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Apr 24 23:34:18.184176 systemd-networkd[779]: eth1: DHCPv6 lease lost
Apr 24 23:34:18.184545 systemd[1]: sysroot-boot.service: Deactivated successfully.
Apr 24 23:34:18.184639 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Apr 24 23:34:18.185694 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Apr 24 23:34:18.185792 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Apr 24 23:34:18.188218 systemd-networkd[779]: eth0: DHCPv6 lease lost
Apr 24 23:34:18.190048 systemd[1]: systemd-networkd.service: Deactivated successfully.
Apr 24 23:34:18.190227 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Apr 24 23:34:18.191296 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Apr 24 23:34:18.191330 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Apr 24 23:34:18.199343 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Apr 24 23:34:18.199864 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Apr 24 23:34:18.199934 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Apr 24 23:34:18.203144 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Apr 24 23:34:18.204046 systemd[1]: systemd-resolved.service: Deactivated successfully.
Apr 24 23:34:18.204178 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Apr 24 23:34:18.213208 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Apr 24 23:34:18.213316 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Apr 24 23:34:18.214003 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Apr 24 23:34:18.214045 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Apr 24 23:34:18.216091 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Apr 24 23:34:18.216157 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Apr 24 23:34:18.225050 systemd[1]: systemd-udevd.service: Deactivated successfully.
Apr 24 23:34:18.225364 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Apr 24 23:34:18.228318 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Apr 24 23:34:18.228386 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Apr 24 23:34:18.229099 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Apr 24 23:34:18.229424 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Apr 24 23:34:18.230449 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Apr 24 23:34:18.230497 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Apr 24 23:34:18.232108 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Apr 24 23:34:18.232181 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Apr 24 23:34:18.233709 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Apr 24 23:34:18.233759 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Apr 24 23:34:18.240395 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Apr 24 23:34:18.240989 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Apr 24 23:34:18.241057 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Apr 24 23:34:18.242589 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Apr 24 23:34:18.242635 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Apr 24 23:34:18.244412 systemd[1]: network-cleanup.service: Deactivated successfully.
Apr 24 23:34:18.244533 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Apr 24 23:34:18.254716 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Apr 24 23:34:18.254916 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Apr 24 23:34:18.257539 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Apr 24 23:34:18.267379 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Apr 24 23:34:18.276649 systemd[1]: Switching root.
Apr 24 23:34:18.302101 systemd-journald[237]: Journal stopped
Apr 24 23:34:19.301624 systemd-journald[237]: Received SIGTERM from PID 1 (systemd).
Apr 24 23:34:19.301688 kernel: SELinux: policy capability network_peer_controls=1
Apr 24 23:34:19.301704 kernel: SELinux: policy capability open_perms=1
Apr 24 23:34:19.301714 kernel: SELinux: policy capability extended_socket_class=1
Apr 24 23:34:19.301723 kernel: SELinux: policy capability always_check_network=0
Apr 24 23:34:19.301732 kernel: SELinux: policy capability cgroup_seclabel=1
Apr 24 23:34:19.301746 kernel: SELinux: policy capability nnp_nosuid_transition=1
Apr 24 23:34:19.301763 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Apr 24 23:34:19.301774 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Apr 24 23:34:19.301783 kernel: audit: type=1403 audit(1777073658.503:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
Apr 24 23:34:19.301794 systemd[1]: Successfully loaded SELinux policy in 36.016ms.
Apr 24 23:34:19.301814 systemd[1]: Relabeled /dev, /dev/shm, /run, /sys/fs/cgroup in 11.790ms.
Apr 24 23:34:19.301826 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified)
Apr 24 23:34:19.301844 systemd[1]: Detected virtualization kvm.
Apr 24 23:34:19.301858 systemd[1]: Detected architecture arm64.
Apr 24 23:34:19.301868 systemd[1]: Detected first boot.
Apr 24 23:34:19.301880 systemd[1]: Hostname set to .
Apr 24 23:34:19.301890 systemd[1]: Initializing machine ID from VM UUID.
Apr 24 23:34:19.301901 zram_generator::config[1055]: No configuration found.
Apr 24 23:34:19.301912 systemd[1]: Populated /etc with preset unit settings.
Apr 24 23:34:19.301922 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Apr 24 23:34:19.301932 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
Apr 24 23:34:19.301944 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Apr 24 23:34:19.301955 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Apr 24 23:34:19.301966 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Apr 24 23:34:19.301976 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Apr 24 23:34:19.301986 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Apr 24 23:34:19.301999 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Apr 24 23:34:19.302010 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Apr 24 23:34:19.302020 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Apr 24 23:34:19.302032 systemd[1]: Created slice user.slice - User and Session Slice.
Apr 24 23:34:19.302042 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Apr 24 23:34:19.302053 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Apr 24 23:34:19.302063 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Apr 24 23:34:19.302074 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Apr 24 23:34:19.302085 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Apr 24 23:34:19.302096 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Apr 24 23:34:19.302107 systemd[1]: Expecting device dev-ttyAMA0.device - /dev/ttyAMA0...
Apr 24 23:34:19.305178 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Apr 24 23:34:19.305217 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Apr 24 23:34:19.305229 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Apr 24 23:34:19.305239 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Apr 24 23:34:19.305250 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Apr 24 23:34:19.305261 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Apr 24 23:34:19.305272 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Apr 24 23:34:19.305283 systemd[1]: Reached target slices.target - Slice Units.
Apr 24 23:34:19.305295 systemd[1]: Reached target swap.target - Swaps.
Apr 24 23:34:19.305305 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Apr 24 23:34:19.305316 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Apr 24 23:34:19.305326 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Apr 24 23:34:19.305337 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Apr 24 23:34:19.305350 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Apr 24 23:34:19.305362 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Apr 24 23:34:19.305372 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Apr 24 23:34:19.305383 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Apr 24 23:34:19.305395 systemd[1]: Mounting media.mount - External Media Directory...
Apr 24 23:34:19.305411 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Apr 24 23:34:19.305421 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Apr 24 23:34:19.305434 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Apr 24 23:34:19.305445 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Apr 24 23:34:19.305456 systemd[1]: Reached target machines.target - Containers.
Apr 24 23:34:19.305466 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Apr 24 23:34:19.305481 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Apr 24 23:34:19.305493 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Apr 24 23:34:19.305506 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Apr 24 23:34:19.305519 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Apr 24 23:34:19.305531 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Apr 24 23:34:19.305542 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Apr 24 23:34:19.305552 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Apr 24 23:34:19.305564 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Apr 24 23:34:19.305575 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Apr 24 23:34:19.305591 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Apr 24 23:34:19.305601 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Apr 24 23:34:19.305612 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Apr 24 23:34:19.305622 systemd[1]: Stopped systemd-fsck-usr.service.
Apr 24 23:34:19.305632 systemd[1]: Starting systemd-journald.service - Journal Service...
Apr 24 23:34:19.305643 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Apr 24 23:34:19.305654 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Apr 24 23:34:19.305666 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Apr 24 23:34:19.305678 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Apr 24 23:34:19.305688 systemd[1]: verity-setup.service: Deactivated successfully.
Apr 24 23:34:19.305699 systemd[1]: Stopped verity-setup.service.
Apr 24 23:34:19.305709 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Apr 24 23:34:19.305719 kernel: loop: module loaded
Apr 24 23:34:19.305730 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Apr 24 23:34:19.305740 systemd[1]: Mounted media.mount - External Media Directory.
Apr 24 23:34:19.305753 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Apr 24 23:34:19.305764 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Apr 24 23:34:19.305773 kernel: fuse: init (API version 7.39)
Apr 24 23:34:19.305783 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Apr 24 23:34:19.305794 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Apr 24 23:34:19.305807 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Apr 24 23:34:19.305817 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Apr 24 23:34:19.305827 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Apr 24 23:34:19.305850 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Apr 24 23:34:19.305863 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Apr 24 23:34:19.305874 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Apr 24 23:34:19.305884 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Apr 24 23:34:19.305894 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Apr 24 23:34:19.305907 systemd[1]: modprobe@loop.service: Deactivated successfully.
Apr 24 23:34:19.305920 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Apr 24 23:34:19.305930 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Apr 24 23:34:19.305940 kernel: ACPI: bus type drm_connector registered
Apr 24 23:34:19.305951 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Apr 24 23:34:19.305961 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Apr 24 23:34:19.306007 systemd-journald[1124]: Collecting audit messages is disabled.
Apr 24 23:34:19.306031 systemd[1]: modprobe@drm.service: Deactivated successfully.
Apr 24 23:34:19.306042 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Apr 24 23:34:19.306057 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Apr 24 23:34:19.306070 systemd-journald[1124]: Journal started
Apr 24 23:34:19.306093 systemd-journald[1124]: Runtime Journal (/run/log/journal/51e226babde9462fa7f47557cc201d87) is 8.0M, max 76.6M, 68.6M free.
Apr 24 23:34:19.002800 systemd[1]: Queued start job for default target multi-user.target.
Apr 24 23:34:19.025297 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6.
Apr 24 23:34:19.025763 systemd[1]: systemd-journald.service: Deactivated successfully.
Apr 24 23:34:19.310168 systemd[1]: Started systemd-journald.service - Journal Service.
Apr 24 23:34:19.311608 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Apr 24 23:34:19.313042 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Apr 24 23:34:19.314249 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Apr 24 23:34:19.315332 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Apr 24 23:34:19.332415 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Apr 24 23:34:19.338977 systemd[1]: Reached target network-pre.target - Preparation for Network.
Apr 24 23:34:19.340256 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Apr 24 23:34:19.340297 systemd[1]: Reached target local-fs.target - Local File Systems.
Apr 24 23:34:19.342091 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management (Varlink).
Apr 24 23:34:19.347423 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
Apr 24 23:34:19.358337 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Apr 24 23:34:19.359629 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Apr 24 23:34:19.361996 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Apr 24 23:34:19.365334 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Apr 24 23:34:19.368394 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Apr 24 23:34:19.376554 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Apr 24 23:34:19.388618 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Apr 24 23:34:19.396392 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Apr 24 23:34:19.401450 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Apr 24 23:34:19.402970 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Apr 24 23:34:19.404830 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown.
Apr 24 23:34:19.414297 systemd-journald[1124]: Time spent on flushing to /var/log/journal/51e226babde9462fa7f47557cc201d87 is 82.495ms for 1125 entries.
Apr 24 23:34:19.414297 systemd-journald[1124]: System Journal (/var/log/journal/51e226babde9462fa7f47557cc201d87) is 8.0M, max 584.8M, 576.8M free.
Apr 24 23:34:19.520037 systemd-journald[1124]: Received client request to flush runtime journal.
Apr 24 23:34:19.520136 kernel: loop0: detected capacity change from 0 to 8
Apr 24 23:34:19.520170 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
Apr 24 23:34:19.520187 kernel: loop1: detected capacity change from 0 to 197488
Apr 24 23:34:19.425408 systemd[1]: Starting systemd-udev-settle.service - Wait for udev To Complete Device Initialization...
Apr 24 23:34:19.426730 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Apr 24 23:34:19.429098 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Apr 24 23:34:19.438440 systemd[1]: Starting systemd-machine-id-commit.service - Commit a transient machine-id on disk...
Apr 24 23:34:19.449482 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Apr 24 23:34:19.484179 udevadm[1176]: systemd-udev-settle.service is deprecated. Please fix lvm2-activation.service, lvm2-activation-early.service not to pull it in.
Apr 24 23:34:19.522191 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Apr 24 23:34:19.529019 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Apr 24 23:34:19.531761 systemd[1]: Finished systemd-machine-id-commit.service - Commit a transient machine-id on disk.
Apr 24 23:34:19.536811 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Apr 24 23:34:19.549078 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Apr 24 23:34:19.567088 kernel: loop2: detected capacity change from 0 to 114432
Apr 24 23:34:19.597438 systemd-tmpfiles[1189]: ACLs are not supported, ignoring.
Apr 24 23:34:19.598022 systemd-tmpfiles[1189]: ACLs are not supported, ignoring.
Apr 24 23:34:19.604906 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Apr 24 23:34:19.611708 kernel: loop3: detected capacity change from 0 to 114328
Apr 24 23:34:19.657413 kernel: loop4: detected capacity change from 0 to 8
Apr 24 23:34:19.662154 kernel: loop5: detected capacity change from 0 to 197488
Apr 24 23:34:19.700137 kernel: loop6: detected capacity change from 0 to 114432
Apr 24 23:34:19.725144 kernel: loop7: detected capacity change from 0 to 114328
Apr 24 23:34:19.744175 (sd-merge)[1194]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-hetzner'.
Apr 24 23:34:19.744793 (sd-merge)[1194]: Merged extensions into '/usr'.
Apr 24 23:34:19.754293 systemd[1]: Reloading requested from client PID 1171 ('systemd-sysext') (unit systemd-sysext.service)...
Apr 24 23:34:19.754312 systemd[1]: Reloading...
Apr 24 23:34:19.875143 zram_generator::config[1221]: No configuration found.
Apr 24 23:34:19.928776 ldconfig[1166]: /sbin/ldconfig: /lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
Apr 24 23:34:20.022576 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Apr 24 23:34:20.070164 systemd[1]: Reloading finished in 315 ms.
Apr 24 23:34:20.097235 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
Apr 24 23:34:20.100174 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Apr 24 23:34:20.101255 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Apr 24 23:34:20.111385 systemd[1]: Starting ensure-sysext.service...
Apr 24 23:34:20.114366 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Apr 24 23:34:20.117327 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Apr 24 23:34:20.133684 systemd[1]: Reloading requested from client PID 1259 ('systemctl') (unit ensure-sysext.service)...
Apr 24 23:34:20.133711 systemd[1]: Reloading...
Apr 24 23:34:20.163400 systemd-udevd[1261]: Using default interface naming scheme 'v255'.
Apr 24 23:34:20.173696 systemd-tmpfiles[1260]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Apr 24 23:34:20.174079 systemd-tmpfiles[1260]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Apr 24 23:34:20.178361 systemd-tmpfiles[1260]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Apr 24 23:34:20.178692 systemd-tmpfiles[1260]: ACLs are not supported, ignoring.
Apr 24 23:34:20.178744 systemd-tmpfiles[1260]: ACLs are not supported, ignoring.
Apr 24 23:34:20.185913 systemd-tmpfiles[1260]: Detected autofs mount point /boot during canonicalization of boot.
Apr 24 23:34:20.185927 systemd-tmpfiles[1260]: Skipping /boot
Apr 24 23:34:20.214885 systemd-tmpfiles[1260]: Detected autofs mount point /boot during canonicalization of boot.
Apr 24 23:34:20.214900 systemd-tmpfiles[1260]: Skipping /boot
Apr 24 23:34:20.281381 zram_generator::config[1302]: No configuration found.
Apr 24 23:34:20.332141 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 36 scanned by (udev-worker) (1293)
Apr 24 23:34:20.462252 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Apr 24 23:34:20.526489 kernel: mousedev: PS/2 mouse device common for all mice
Apr 24 23:34:20.568487 systemd[1]: Condition check resulted in dev-ttyAMA0.device - /dev/ttyAMA0 being skipped.
Apr 24 23:34:20.568933 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM.
Apr 24 23:34:20.570424 systemd[1]: Reloading finished in 436 ms.
Apr 24 23:34:20.580027 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Apr 24 23:34:20.581223 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Apr 24 23:34:20.606588 systemd[1]: Condition check resulted in dev-virtio\x2dports-org.qemu.guest_agent.0.device - /dev/virtio-ports/org.qemu.guest_agent.0 being skipped.
Apr 24 23:34:20.616451 systemd[1]: Starting audit-rules.service - Load Security Auditing Rules...
Apr 24 23:34:20.621363 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Apr 24 23:34:20.622557 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Apr 24 23:34:20.629343 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Apr 24 23:34:20.636428 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Apr 24 23:34:20.638079 kernel: [drm] pci: virtio-gpu-pci detected at 0000:00:01.0
Apr 24 23:34:20.638158 kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Apr 24 23:34:20.638174 kernel: [drm] features: -context_init
Apr 24 23:34:20.641150 kernel: [drm] number of scanouts: 1
Apr 24 23:34:20.641221 kernel: [drm] number of cap sets: 0
Apr 24 23:34:20.641703 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Apr 24 23:34:20.643524 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Apr 24 23:34:20.655194 kernel: [drm] Initialized virtio_gpu 0.1.0 0 for 0000:00:01.0 on minor 0
Apr 24 23:34:20.655766 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM...
Apr 24 23:34:20.661747 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Apr 24 23:34:20.666899 kernel: Console: switching to colour frame buffer device 160x50
Apr 24 23:34:20.673704 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Apr 24 23:34:20.686124 kernel: virtio-pci 0000:00:01.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Apr 24 23:34:20.697870 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Apr 24 23:34:20.706435 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Apr 24 23:34:20.708738 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Apr 24 23:34:20.710277 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Apr 24 23:34:20.711548 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Apr 24 23:34:20.713176 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Apr 24 23:34:20.715046 systemd[1]: modprobe@loop.service: Deactivated successfully.
Apr 24 23:34:20.717243 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Apr 24 23:34:20.719750 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM.
Apr 24 23:34:20.736461 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
Apr 24 23:34:20.743361 augenrules[1399]: No rules
Apr 24 23:34:20.747506 systemd[1]: Finished audit-rules.service - Load Security Auditing Rules.
Apr 24 23:34:20.757311 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
Apr 24 23:34:20.767544 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Apr 24 23:34:20.776513 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Apr 24 23:34:20.780753 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Apr 24 23:34:20.785683 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Apr 24 23:34:20.791491 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Apr 24 23:34:20.792232 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Apr 24 23:34:20.798293 systemd[1]: Starting systemd-update-done.service - Update is Completed...
Apr 24 23:34:20.803139 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Apr 24 23:34:20.807234 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Apr 24 23:34:20.809279 systemd[1]: Finished systemd-udev-settle.service - Wait for udev To Complete Device Initialization.
Apr 24 23:34:20.812794 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
Apr 24 23:34:20.814192 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Apr 24 23:34:20.814368 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Apr 24 23:34:20.815488 systemd[1]: modprobe@drm.service: Deactivated successfully.
Apr 24 23:34:20.815631 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Apr 24 23:34:20.816662 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Apr 24 23:34:20.816785 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Apr 24 23:34:20.818802 systemd[1]: modprobe@loop.service: Deactivated successfully.
Apr 24 23:34:20.819002 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Apr 24 23:34:20.827932 systemd[1]: Finished ensure-sysext.service.
Apr 24 23:34:20.830984 systemd[1]: Finished systemd-update-done.service - Update is Completed.
Apr 24 23:34:20.846383 systemd[1]: Starting lvm2-activation-early.service - Activation of LVM2 logical volumes...
Apr 24 23:34:20.847055 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Apr 24 23:34:20.847153 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Apr 24 23:34:20.849952 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization...
Apr 24 23:34:20.853313 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Apr 24 23:34:20.869106 lvm[1423]: WARNING: Failed to connect to lvmetad. Falling back to device scanning.
Apr 24 23:34:20.873067 systemd[1]: Started systemd-userdbd.service - User Database Manager.
Apr 24 23:34:20.897253 systemd[1]: Finished lvm2-activation-early.service - Activation of LVM2 logical volumes.
Apr 24 23:34:20.898177 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Apr 24 23:34:20.912460 systemd[1]: Starting lvm2-activation.service - Activation of LVM2 logical volumes...
Apr 24 23:34:20.922987 lvm[1433]: WARNING: Failed to connect to lvmetad. Falling back to device scanning.
Apr 24 23:34:20.948186 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Apr 24 23:34:20.957197 systemd[1]: Finished lvm2-activation.service - Activation of LVM2 logical volumes.
Apr 24 23:34:20.975537 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization.
Apr 24 23:34:20.977204 systemd[1]: Reached target time-set.target - System Time Set.
Apr 24 23:34:20.985651 systemd-networkd[1386]: lo: Link UP
Apr 24 23:34:20.985662 systemd-networkd[1386]: lo: Gained carrier
Apr 24 23:34:20.987572 systemd-networkd[1386]: Enumeration completed
Apr 24 23:34:20.987707 systemd[1]: Started systemd-networkd.service - Network Configuration.
Apr 24 23:34:20.990422 systemd-networkd[1386]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Apr 24 23:34:20.990436 systemd-networkd[1386]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Apr 24 23:34:20.991868 systemd-networkd[1386]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Apr 24 23:34:20.991879 systemd-networkd[1386]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network.
Apr 24 23:34:20.992597 systemd-networkd[1386]: eth0: Link UP
Apr 24 23:34:20.992609 systemd-networkd[1386]: eth0: Gained carrier
Apr 24 23:34:20.992626 systemd-networkd[1386]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Apr 24 23:34:20.994393 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
Apr 24 23:34:21.000290 systemd-networkd[1386]: eth1: Link UP
Apr 24 23:34:21.000302 systemd-networkd[1386]: eth1: Gained carrier
Apr 24 23:34:21.000323 systemd-networkd[1386]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Apr 24 23:34:21.001317 systemd-resolved[1390]: Positive Trust Anchors:
Apr 24 23:34:21.001334 systemd-resolved[1390]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Apr 24 23:34:21.001366 systemd-resolved[1390]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Apr 24 23:34:21.006588 systemd-resolved[1390]: Using system hostname 'ci-4081-3-6-n-0494d1f24d'.
Apr 24 23:34:21.008693 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Apr 24 23:34:21.009625 systemd[1]: Reached target network.target - Network.
Apr 24 23:34:21.010231 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Apr 24 23:34:21.010868 systemd[1]: Reached target sysinit.target - System Initialization.
Apr 24 23:34:21.011646 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
Apr 24 23:34:21.012419 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
Apr 24 23:34:21.013337 systemd[1]: Started logrotate.timer - Daily rotation of log files.
Apr 24 23:34:21.014045 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
Apr 24 23:34:21.014873 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
Apr 24 23:34:21.015601 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
Apr 24 23:34:21.015638 systemd[1]: Reached target paths.target - Path Units.
Apr 24 23:34:21.016131 systemd[1]: Reached target timers.target - Timer Units.
Apr 24 23:34:21.017957 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket.
Apr 24 23:34:21.021862 systemd[1]: Starting docker.socket - Docker Socket for the API...
Apr 24 23:34:21.029187 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket.
Apr 24 23:34:21.030576 systemd[1]: Listening on docker.socket - Docker Socket for the API.
Apr 24 23:34:21.031491 systemd[1]: Reached target sockets.target - Socket Units.
Apr 24 23:34:21.032157 systemd[1]: Reached target basic.target - Basic System.
Apr 24 23:34:21.032914 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met.
Apr 24 23:34:21.032954 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met.
Apr 24 23:34:21.040359 systemd[1]: Starting containerd.service - containerd container runtime...
Apr 24 23:34:21.046402 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent...
Apr 24 23:34:21.057793 systemd[1]: Starting dbus.service - D-Bus System Message Bus...
Apr 24 23:34:21.060301 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit...
Apr 24 23:34:21.063337 systemd-networkd[1386]: eth0: DHCPv4 address 178.105.28.58/32, gateway 172.31.1.1 acquired from 172.31.1.1
Apr 24 23:34:21.065030 systemd[1]: Starting extend-filesystems.service - Extend Filesystems...
Apr 24 23:34:21.065688 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment).
Apr 24 23:34:21.068242 systemd-timesyncd[1424]: Network configuration changed, trying to establish connection.
Apr 24 23:34:21.069354 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd...
Apr 24 23:34:21.079296 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin...
Apr 24 23:34:21.083340 systemd[1]: Started qemu-guest-agent.service - QEMU Guest Agent.
Apr 24 23:34:21.085558 jq[1446]: false
Apr 24 23:34:21.087258 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline...
Apr 24 23:34:21.092443 systemd-networkd[1386]: eth1: DHCPv4 address 10.0.0.3/32 acquired from 10.0.0.1
Apr 24 23:34:21.093103 systemd-timesyncd[1424]: Network configuration changed, trying to establish connection.
Apr 24 23:34:21.096911 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys...
Apr 24 23:34:21.103915 dbus-daemon[1445]: [system] SELinux support is enabled
Apr 24 23:34:21.104437 systemd[1]: Starting systemd-logind.service - User Login Management...
Apr 24 23:34:21.107016 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0).
Apr 24 23:34:21.108327 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details.
Apr 24 23:34:21.109135 systemd[1]: Starting update-engine.service - Update Engine...
Apr 24 23:34:21.115091 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition...
Apr 24 23:34:21.116632 systemd[1]: Started dbus.service - D-Bus System Message Bus.
Apr 24 23:34:21.131779 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'.
Apr 24 23:34:21.133732 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped.
Apr 24 23:34:21.136149 coreos-metadata[1444]: Apr 24 23:34:21.135 INFO Fetching http://169.254.169.254/hetzner/v1/metadata: Attempt #1
Apr 24 23:34:21.139650 coreos-metadata[1444]: Apr 24 23:34:21.139 INFO Fetch successful
Apr 24 23:34:21.139650 coreos-metadata[1444]: Apr 24 23:34:21.139 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/private-networks: Attempt #1
Apr 24 23:34:21.141264 coreos-metadata[1444]: Apr 24 23:34:21.139 INFO Fetch successful
Apr 24 23:34:21.145361 (ntainerd)[1462]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR
Apr 24 23:34:21.149622 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml).
Apr 24 23:34:21.149675 systemd[1]: Reached target system-config.target - Load system-provided cloud configs.
Apr 24 23:34:21.151238 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url).
Apr 24 23:34:21.151267 systemd[1]: Reached target user-config.target - Load user-provided cloud configs.
Apr 24 23:34:21.164199 jq[1458]: true
Apr 24 23:34:21.191488 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully.
Apr 24 23:34:21.191683 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline.
Apr 24 23:34:21.220998 extend-filesystems[1447]: Found loop4
Apr 24 23:34:21.226192 extend-filesystems[1447]: Found loop5
Apr 24 23:34:21.226192 extend-filesystems[1447]: Found loop6
Apr 24 23:34:21.226192 extend-filesystems[1447]: Found loop7
Apr 24 23:34:21.226192 extend-filesystems[1447]: Found sda
Apr 24 23:34:21.226192 extend-filesystems[1447]: Found sda1
Apr 24 23:34:21.226192 extend-filesystems[1447]: Found sda2
Apr 24 23:34:21.226192 extend-filesystems[1447]: Found sda3
Apr 24 23:34:21.226192 extend-filesystems[1447]: Found usr
Apr 24 23:34:21.226192 extend-filesystems[1447]: Found sda4
Apr 24 23:34:21.226192 extend-filesystems[1447]: Found sda6
Apr 24 23:34:21.226192 extend-filesystems[1447]: Found sda7
Apr 24 23:34:21.226192 extend-filesystems[1447]: Found sda9
Apr 24 23:34:21.226192 extend-filesystems[1447]: Checking size of /dev/sda9
Apr 24 23:34:21.246881 jq[1473]: true
Apr 24 23:34:21.246492 systemd[1]: motdgen.service: Deactivated successfully.
Apr 24 23:34:21.247060 tar[1461]: linux-arm64/LICENSE
Apr 24 23:34:21.247060 tar[1461]: linux-arm64/helm
Apr 24 23:34:21.246703 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd.
Apr 24 23:34:21.279736 update_engine[1457]: I20260424 23:34:21.270955 1457 main.cc:92] Flatcar Update Engine starting
Apr 24 23:34:21.288447 systemd[1]: Started update-engine.service - Update Engine.
Apr 24 23:34:21.292085 update_engine[1457]: I20260424 23:34:21.292014 1457 update_check_scheduler.cc:74] Next update check in 6m0s
Apr 24 23:34:21.296379 systemd[1]: Started locksmithd.service - Cluster reboot manager.
Apr 24 23:34:21.299764 extend-filesystems[1447]: Resized partition /dev/sda9
Apr 24 23:34:21.306660 extend-filesystems[1503]: resize2fs 1.47.1 (20-May-2024)
Apr 24 23:34:21.323129 kernel: EXT4-fs (sda9): resizing filesystem from 1617920 to 9393147 blocks
Apr 24 23:34:21.380607 systemd-logind[1455]: New seat seat0.
Apr 24 23:34:21.384959 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent.
Apr 24 23:34:21.427435 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 36 scanned by (udev-worker) (1286)
Apr 24 23:34:21.440022 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met.
Apr 24 23:34:21.441432 systemd-logind[1455]: Watching system buttons on /dev/input/event0 (Power Button)
Apr 24 23:34:21.441454 systemd-logind[1455]: Watching system buttons on /dev/input/event2 (QEMU QEMU USB Keyboard)
Apr 24 23:34:21.442394 systemd[1]: Started systemd-logind.service - User Login Management.
Apr 24 23:34:21.453461 bash[1516]: Updated "/home/core/.ssh/authorized_keys"
Apr 24 23:34:21.461426 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition.
Apr 24 23:34:21.475263 systemd[1]: Starting sshkeys.service...
Apr 24 23:34:21.522707 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys.
Apr 24 23:34:21.534711 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)...
Apr 24 23:34:21.550190 kernel: EXT4-fs (sda9): resized filesystem to 9393147
Apr 24 23:34:21.585535 coreos-metadata[1521]: Apr 24 23:34:21.585 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/public-keys: Attempt #1
Apr 24 23:34:21.589520 coreos-metadata[1521]: Apr 24 23:34:21.589 INFO Fetch successful
Apr 24 23:34:21.591911 extend-filesystems[1503]: Filesystem at /dev/sda9 is mounted on /; on-line resizing required
Apr 24 23:34:21.591911 extend-filesystems[1503]: old_desc_blocks = 1, new_desc_blocks = 5
Apr 24 23:34:21.591911 extend-filesystems[1503]: The filesystem on /dev/sda9 is now 9393147 (4k) blocks long.
Apr 24 23:34:21.600111 extend-filesystems[1447]: Resized filesystem in /dev/sda9
Apr 24 23:34:21.600111 extend-filesystems[1447]: Found sr0
Apr 24 23:34:21.600737 unknown[1521]: wrote ssh authorized keys file for user: core
Apr 24 23:34:21.601768 systemd[1]: extend-filesystems.service: Deactivated successfully.
Apr 24 23:34:21.604208 systemd[1]: Finished extend-filesystems.service - Extend Filesystems.
Apr 24 23:34:21.644138 update-ssh-keys[1527]: Updated "/home/core/.ssh/authorized_keys"
Apr 24 23:34:21.647298 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys).
Apr 24 23:34:21.653729 systemd[1]: Finished sshkeys.service.
Apr 24 23:34:21.669775 containerd[1462]: time="2026-04-24T23:34:21.669674560Z" level=info msg="starting containerd" revision=174e0d1785eeda18dc2beba45e1d5a188771636b version=v1.7.21
Apr 24 23:34:21.718297 locksmithd[1499]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot"
Apr 24 23:34:21.740150 containerd[1462]: time="2026-04-24T23:34:21.739696400Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.aufs\"..." type=io.containerd.snapshotter.v1
Apr 24 23:34:21.745144 containerd[1462]: time="2026-04-24T23:34:21.743091560Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.aufs\"..." error="aufs is not supported (modprobe aufs failed: exit status 1 \"modprobe: FATAL: Module aufs not found in directory /lib/modules/6.6.127-flatcar\\n\"): skip plugin" type=io.containerd.snapshotter.v1
Apr 24 23:34:21.745144 containerd[1462]: time="2026-04-24T23:34:21.743160440Z" level=info msg="loading plugin \"io.containerd.event.v1.exchange\"..." type=io.containerd.event.v1
Apr 24 23:34:21.745144 containerd[1462]: time="2026-04-24T23:34:21.743181440Z" level=info msg="loading plugin \"io.containerd.internal.v1.opt\"..." type=io.containerd.internal.v1
Apr 24 23:34:21.745144 containerd[1462]: time="2026-04-24T23:34:21.743374160Z" level=info msg="loading plugin \"io.containerd.warning.v1.deprecations\"..." type=io.containerd.warning.v1
Apr 24 23:34:21.745144 containerd[1462]: time="2026-04-24T23:34:21.743393400Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." type=io.containerd.snapshotter.v1
Apr 24 23:34:21.745144 containerd[1462]: time="2026-04-24T23:34:21.743457240Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." error="no scratch file generator: skip plugin" type=io.containerd.snapshotter.v1
Apr 24 23:34:21.745144 containerd[1462]: time="2026-04-24T23:34:21.743469520Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." type=io.containerd.snapshotter.v1
Apr 24 23:34:21.745144 containerd[1462]: time="2026-04-24T23:34:21.743664880Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1
Apr 24 23:34:21.745144 containerd[1462]: time="2026-04-24T23:34:21.743680840Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." type=io.containerd.snapshotter.v1
Apr 24 23:34:21.745144 containerd[1462]: time="2026-04-24T23:34:21.743694280Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." error="devmapper not configured: skip plugin" type=io.containerd.snapshotter.v1
Apr 24 23:34:21.745144 containerd[1462]: time="2026-04-24T23:34:21.743704560Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.native\"..." type=io.containerd.snapshotter.v1
Apr 24 23:34:21.745521 containerd[1462]: time="2026-04-24T23:34:21.743784640Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.overlayfs\"..." type=io.containerd.snapshotter.v1
Apr 24 23:34:21.745521 containerd[1462]: time="2026-04-24T23:34:21.744069440Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.zfs\"..." type=io.containerd.snapshotter.v1
Apr 24 23:34:21.747335 containerd[1462]: time="2026-04-24T23:34:21.747291120Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.zfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.zfs must be a zfs filesystem to be used with the zfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1
Apr 24 23:34:21.747826 containerd[1462]: time="2026-04-24T23:34:21.747411840Z" level=info msg="loading plugin \"io.containerd.content.v1.content\"..." type=io.containerd.content.v1
Apr 24 23:34:21.747826 containerd[1462]: time="2026-04-24T23:34:21.747568600Z" level=info msg="loading plugin \"io.containerd.metadata.v1.bolt\"..." type=io.containerd.metadata.v1
Apr 24 23:34:21.747826 containerd[1462]: time="2026-04-24T23:34:21.747622400Z" level=info msg="metadata content store policy set" policy=shared
Apr 24 23:34:21.756485 containerd[1462]: time="2026-04-24T23:34:21.756439360Z" level=info msg="loading plugin \"io.containerd.gc.v1.scheduler\"..." type=io.containerd.gc.v1
Apr 24 23:34:21.756663 containerd[1462]: time="2026-04-24T23:34:21.756648920Z" level=info msg="loading plugin \"io.containerd.differ.v1.walking\"..." type=io.containerd.differ.v1
Apr 24 23:34:21.756764 containerd[1462]: time="2026-04-24T23:34:21.756749800Z" level=info msg="loading plugin \"io.containerd.lease.v1.manager\"..." type=io.containerd.lease.v1
Apr 24 23:34:21.757181 containerd[1462]: time="2026-04-24T23:34:21.757159400Z" level=info msg="loading plugin \"io.containerd.streaming.v1.manager\"..." type=io.containerd.streaming.v1
Apr 24 23:34:21.757259 containerd[1462]: time="2026-04-24T23:34:21.757246400Z" level=info msg="loading plugin \"io.containerd.runtime.v1.linux\"..." type=io.containerd.runtime.v1
Apr 24 23:34:21.757531 containerd[1462]: time="2026-04-24T23:34:21.757510680Z" level=info msg="loading plugin \"io.containerd.monitor.v1.cgroups\"..." type=io.containerd.monitor.v1
Apr 24 23:34:21.760148 containerd[1462]: time="2026-04-24T23:34:21.758533600Z" level=info msg="loading plugin \"io.containerd.runtime.v2.task\"..." type=io.containerd.runtime.v2
Apr 24 23:34:21.760148 containerd[1462]: time="2026-04-24T23:34:21.758693800Z" level=info msg="loading plugin \"io.containerd.runtime.v2.shim\"..." type=io.containerd.runtime.v2
Apr 24 23:34:21.760148 containerd[1462]: time="2026-04-24T23:34:21.758710360Z" level=info msg="loading plugin \"io.containerd.sandbox.store.v1.local\"..." type=io.containerd.sandbox.store.v1
Apr 24 23:34:21.760148 containerd[1462]: time="2026-04-24T23:34:21.758725560Z" level=info msg="loading plugin \"io.containerd.sandbox.controller.v1.local\"..." type=io.containerd.sandbox.controller.v1
Apr 24 23:34:21.760148 containerd[1462]: time="2026-04-24T23:34:21.758739720Z" level=info msg="loading plugin \"io.containerd.service.v1.containers-service\"..." type=io.containerd.service.v1
Apr 24 23:34:21.760148 containerd[1462]: time="2026-04-24T23:34:21.758753520Z" level=info msg="loading plugin \"io.containerd.service.v1.content-service\"..." type=io.containerd.service.v1
Apr 24 23:34:21.760148 containerd[1462]: time="2026-04-24T23:34:21.758769920Z" level=info msg="loading plugin \"io.containerd.service.v1.diff-service\"..." type=io.containerd.service.v1
Apr 24 23:34:21.760148 containerd[1462]: time="2026-04-24T23:34:21.758787280Z" level=info msg="loading plugin \"io.containerd.service.v1.images-service\"..." type=io.containerd.service.v1
Apr 24 23:34:21.760148 containerd[1462]: time="2026-04-24T23:34:21.758804720Z" level=info msg="loading plugin \"io.containerd.service.v1.introspection-service\"..." type=io.containerd.service.v1
Apr 24 23:34:21.760148 containerd[1462]: time="2026-04-24T23:34:21.758836480Z" level=info msg="loading plugin \"io.containerd.service.v1.namespaces-service\"..." type=io.containerd.service.v1
Apr 24 23:34:21.760148 containerd[1462]: time="2026-04-24T23:34:21.758852960Z" level=info msg="loading plugin \"io.containerd.service.v1.snapshots-service\"..." type=io.containerd.service.v1
Apr 24 23:34:21.760148 containerd[1462]: time="2026-04-24T23:34:21.758869040Z" level=info msg="loading plugin \"io.containerd.service.v1.tasks-service\"..." type=io.containerd.service.v1
Apr 24 23:34:21.760148 containerd[1462]: time="2026-04-24T23:34:21.758893560Z" level=info msg="loading plugin \"io.containerd.grpc.v1.containers\"..." type=io.containerd.grpc.v1
Apr 24 23:34:21.760148 containerd[1462]: time="2026-04-24T23:34:21.758908120Z" level=info msg="loading plugin \"io.containerd.grpc.v1.content\"..." type=io.containerd.grpc.v1
Apr 24 23:34:21.760495 containerd[1462]: time="2026-04-24T23:34:21.758921840Z" level=info msg="loading plugin \"io.containerd.grpc.v1.diff\"..." type=io.containerd.grpc.v1
Apr 24 23:34:21.760495 containerd[1462]: time="2026-04-24T23:34:21.758936440Z" level=info msg="loading plugin \"io.containerd.grpc.v1.events\"..." type=io.containerd.grpc.v1
Apr 24 23:34:21.760495 containerd[1462]: time="2026-04-24T23:34:21.758963280Z" level=info msg="loading plugin \"io.containerd.grpc.v1.images\"..." type=io.containerd.grpc.v1
Apr 24 23:34:21.760495 containerd[1462]: time="2026-04-24T23:34:21.758986240Z" level=info msg="loading plugin \"io.containerd.grpc.v1.introspection\"..." type=io.containerd.grpc.v1
Apr 24 23:34:21.760495 containerd[1462]: time="2026-04-24T23:34:21.758998920Z" level=info msg="loading plugin \"io.containerd.grpc.v1.leases\"..." type=io.containerd.grpc.v1
Apr 24 23:34:21.760495 containerd[1462]: time="2026-04-24T23:34:21.759013200Z" level=info msg="loading plugin \"io.containerd.grpc.v1.namespaces\"..." type=io.containerd.grpc.v1
Apr 24 23:34:21.760495 containerd[1462]: time="2026-04-24T23:34:21.759037120Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandbox-controllers\"..." type=io.containerd.grpc.v1
Apr 24 23:34:21.760495 containerd[1462]: time="2026-04-24T23:34:21.759052160Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandboxes\"..." type=io.containerd.grpc.v1
Apr 24 23:34:21.760495 containerd[1462]: time="2026-04-24T23:34:21.759064240Z" level=info msg="loading plugin \"io.containerd.grpc.v1.snapshots\"..." type=io.containerd.grpc.v1
Apr 24 23:34:21.760495 containerd[1462]: time="2026-04-24T23:34:21.759076400Z" level=info msg="loading plugin \"io.containerd.grpc.v1.streaming\"..." type=io.containerd.grpc.v1
Apr 24 23:34:21.760495 containerd[1462]: time="2026-04-24T23:34:21.759090200Z" level=info msg="loading plugin \"io.containerd.grpc.v1.tasks\"..." type=io.containerd.grpc.v1
Apr 24 23:34:21.762159 containerd[1462]: time="2026-04-24T23:34:21.759109840Z" level=info msg="loading plugin \"io.containerd.transfer.v1.local\"..." type=io.containerd.transfer.v1
Apr 24 23:34:21.762259 containerd[1462]: time="2026-04-24T23:34:21.762238160Z" level=info msg="loading plugin \"io.containerd.grpc.v1.transfer\"..." type=io.containerd.grpc.v1
Apr 24 23:34:21.765876 containerd[1462]: time="2026-04-24T23:34:21.764145440Z" level=info msg="loading plugin \"io.containerd.grpc.v1.version\"..." type=io.containerd.grpc.v1
Apr 24 23:34:21.765876 containerd[1462]: time="2026-04-24T23:34:21.764216320Z" level=info msg="loading plugin \"io.containerd.internal.v1.restart\"..." type=io.containerd.internal.v1
Apr 24 23:34:21.765876 containerd[1462]: time="2026-04-24T23:34:21.764390800Z" level=info msg="loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." type=io.containerd.tracing.processor.v1
Apr 24 23:34:21.765876 containerd[1462]: time="2026-04-24T23:34:21.764411200Z" level=info msg="skip loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.tracing.processor.v1
Apr 24 23:34:21.765876 containerd[1462]: time="2026-04-24T23:34:21.764424920Z" level=info msg="loading plugin \"io.containerd.internal.v1.tracing\"..." type=io.containerd.internal.v1
Apr 24 23:34:21.765876 containerd[1462]: time="2026-04-24T23:34:21.764439120Z" level=info msg="skip loading plugin \"io.containerd.internal.v1.tracing\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.internal.v1
Apr 24 23:34:21.765876 containerd[1462]: time="2026-04-24T23:34:21.764449120Z" level=info msg="loading plugin \"io.containerd.grpc.v1.healthcheck\"..." type=io.containerd.grpc.v1
Apr 24 23:34:21.765876 containerd[1462]: time="2026-04-24T23:34:21.764461880Z" level=info msg="loading plugin \"io.containerd.nri.v1.nri\"..." type=io.containerd.nri.v1
Apr 24 23:34:21.765876 containerd[1462]: time="2026-04-24T23:34:21.764473400Z" level=info msg="NRI interface is disabled by configuration."
Apr 24 23:34:21.765876 containerd[1462]: time="2026-04-24T23:34:21.764484680Z" level=info msg="loading plugin \"io.containerd.grpc.v1.cri\"..." type=io.containerd.grpc.v1
Apr 24 23:34:21.766221 containerd[1462]: time="2026-04-24T23:34:21.765066000Z" level=info msg="Start cri plugin with config {PluginConfig:{ContainerdConfig:{Snapshotter:overlayfs DefaultRuntimeName:runc DefaultRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} UntrustedWorkloadRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} Runtimes:map[runc:{Type:io.containerd.runc.v2 Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[SystemdCgroup:true] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:podsandbox}] NoPivot:false DisableSnapshotAnnotations:true DiscardUnpackedLayers:false IgnoreBlockIONotEnabledErrors:false IgnoreRdtNotEnabledErrors:false} CniConfig:{NetworkPluginBinDir:/opt/cni/bin NetworkPluginConfDir:/etc/cni/net.d NetworkPluginMaxConfNum:1 NetworkPluginSetupSerially:false NetworkPluginConfTemplate: IPPreference:} Registry:{ConfigPath: Mirrors:map[] Configs:map[] Auths:map[] Headers:map[]} ImageDecryption:{KeyModel:node} DisableTCPService:true StreamServerAddress:127.0.0.1 StreamServerPort:0 StreamIdleTimeout:4h0m0s EnableSelinux:true SelinuxCategoryRange:1024 SandboxImage:registry.k8s.io/pause:3.8 StatsCollectPeriod:10 SystemdCgroup:false EnableTLSStreaming:false X509KeyPairStreaming:{TLSCertFile: TLSKeyFile:} MaxContainerLogLineSize:16384 DisableCgroup:false DisableApparmor:false RestrictOOMScoreAdj:false MaxConcurrentDownloads:3 DisableProcMount:false UnsetSeccompProfile: TolerateMissingHugetlbController:true DisableHugetlbController:true DeviceOwnershipFromSecurityContext:false IgnoreImageDefinedVolumes:false NetNSMountsUnderStateDir:false EnableUnprivilegedPorts:false EnableUnprivilegedICMP:false EnableCDI:false CDISpecDirs:[/etc/cdi /var/run/cdi] ImagePullProgressTimeout:5m0s DrainExecSyncIOTimeout:0s ImagePullWithSyncFs:false IgnoreDeprecationWarnings:[]} ContainerdRootDir:/var/lib/containerd ContainerdEndpoint:/run/containerd/containerd.sock RootDir:/var/lib/containerd/io.containerd.grpc.v1.cri StateDir:/run/containerd/io.containerd.grpc.v1.cri}"
Apr 24 23:34:21.766221 containerd[1462]: time="2026-04-24T23:34:21.765186600Z" level=info msg="Connect containerd service"
Apr 24 23:34:21.766221 containerd[1462]: time="2026-04-24T23:34:21.765234040Z" level=info msg="using legacy CRI server"
Apr 24 23:34:21.766221 containerd[1462]: time="2026-04-24T23:34:21.765242280Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this"
Apr 24 23:34:21.766221 containerd[1462]: time="2026-04-24T23:34:21.765479760Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\""
Apr 24 23:34:21.772318 containerd[1462]: time="2026-04-24T23:34:21.772257880Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
Apr 24 23:34:21.772684 containerd[1462]: time="2026-04-24T23:34:21.772639320Z" level=info msg="Start subscribing containerd event"
Apr 24 23:34:21.772766 containerd[1462]: time="2026-04-24T23:34:21.772754400Z" level=info msg="Start recovering state"
Apr 24 23:34:21.772908 containerd[1462]: time="2026-04-24T23:34:21.772881200Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc
Apr 24 23:34:21.772942 containerd[1462]: time="2026-04-24T23:34:21.772931840Z" level=info msg=serving... address=/run/containerd/containerd.sock
Apr 24 23:34:21.773012 containerd[1462]: time="2026-04-24T23:34:21.772896520Z" level=info msg="Start event monitor"
Apr 24 23:34:21.773064 containerd[1462]: time="2026-04-24T23:34:21.773053760Z" level=info msg="Start snapshots syncer"
Apr 24 23:34:21.773144 containerd[1462]: time="2026-04-24T23:34:21.773109800Z" level=info msg="Start cni network conf syncer for default"
Apr 24 23:34:21.773191 containerd[1462]: time="2026-04-24T23:34:21.773179120Z" level=info msg="Start streaming server"
Apr 24 23:34:21.773415 containerd[1462]: time="2026-04-24T23:34:21.773398920Z" level=info msg="containerd successfully booted in 0.104695s"
Apr 24 23:34:21.773642 systemd[1]: Started containerd.service - containerd container runtime.
Apr 24 23:34:22.017736 tar[1461]: linux-arm64/README.md
Apr 24 23:34:22.030087 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin.
Apr 24 23:34:22.080315 sshd_keygen[1475]: ssh-keygen: generating new host keys: RSA ECDSA ED25519
Apr 24 23:34:22.103742 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys.
Apr 24 23:34:22.114950 systemd[1]: Starting issuegen.service - Generate /run/issue...
Apr 24 23:34:22.125055 systemd[1]: issuegen.service: Deactivated successfully.
Apr 24 23:34:22.125329 systemd[1]: Finished issuegen.service - Generate /run/issue.
Apr 24 23:34:22.131705 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions...
Apr 24 23:34:22.145528 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions.
Apr 24 23:34:22.164759 systemd[1]: Started getty@tty1.service - Getty on tty1.
Apr 24 23:34:22.169057 systemd[1]: Started serial-getty@ttyAMA0.service - Serial Getty on ttyAMA0.
Apr 24 23:34:22.170246 systemd[1]: Reached target getty.target - Login Prompts.
Apr 24 23:34:22.746358 systemd-networkd[1386]: eth1: Gained IPv6LL
Apr 24 23:34:22.747193 systemd-timesyncd[1424]: Network configuration changed, trying to establish connection.
Apr 24 23:34:22.753154 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured.
Apr 24 23:34:22.754709 systemd[1]: Reached target network-online.target - Network is Online.
Apr 24 23:34:22.763463 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Apr 24 23:34:22.767884 systemd[1]: Starting nvidia.service - NVIDIA Configure Service...
Apr 24 23:34:22.798380 systemd[1]: Finished nvidia.service - NVIDIA Configure Service.
Apr 24 23:34:22.810706 systemd-networkd[1386]: eth0: Gained IPv6LL
Apr 24 23:34:22.811436 systemd-timesyncd[1424]: Network configuration changed, trying to establish connection.
Apr 24 23:34:23.591468 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Apr 24 23:34:23.593397 systemd[1]: Reached target multi-user.target - Multi-User System.
Apr 24 23:34:23.595105 systemd[1]: Startup finished in 826ms (kernel) + 5.807s (initrd) + 5.127s (userspace) = 11.760s.
Apr 24 23:34:23.605182 (kubelet)[1573]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Apr 24 23:34:24.105023 kubelet[1573]: E0424 23:34:24.104913 1573 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Apr 24 23:34:24.108179 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Apr 24 23:34:24.108377 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Apr 24 23:34:34.166199 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1.
Apr 24 23:34:34.179574 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Apr 24 23:34:34.314311 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Apr 24 23:34:34.321823 (kubelet)[1592]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Apr 24 23:34:34.374965 kubelet[1592]: E0424 23:34:34.374872 1592 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Apr 24 23:34:34.379601 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Apr 24 23:34:34.379894 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Apr 24 23:34:44.416363 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2.
Apr 24 23:34:44.429976 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Apr 24 23:34:44.570187 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Apr 24 23:34:44.584634 (kubelet)[1608]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Apr 24 23:34:44.626792 kubelet[1608]: E0424 23:34:44.626655 1608 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Apr 24 23:34:44.630316 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Apr 24 23:34:44.630645 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Apr 24 23:34:52.967917 systemd-timesyncd[1424]: Contacted time server 129.70.132.37:123 (2.flatcar.pool.ntp.org).
Apr 24 23:34:52.968032 systemd-timesyncd[1424]: Initial clock synchronization to Fri 2026-04-24 23:34:53.224019 UTC.
Apr 24 23:34:54.667701 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3.
Apr 24 23:34:54.677522 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Apr 24 23:34:54.828551 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Apr 24 23:34:54.828595 (kubelet)[1624]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Apr 24 23:34:54.877908 kubelet[1624]: E0424 23:34:54.877850 1624 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Apr 24 23:34:54.881623 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Apr 24 23:34:54.881888 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Apr 24 23:34:57.408252 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd.
Apr 24 23:34:57.413774 systemd[1]: Started sshd@0-178.105.28.58:22-50.85.169.122:38598.service - OpenSSH per-connection server daemon (50.85.169.122:38598).
Apr 24 23:34:57.558463 sshd[1632]: Accepted publickey for core from 50.85.169.122 port 38598 ssh2: RSA SHA256:LBBtzjDnLNZXcnA2s4HQvRsYKvtzAwhHM1R/Z1hMC7M
Apr 24 23:34:57.561517 sshd[1632]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Apr 24 23:34:57.575951 systemd-logind[1455]: New session 1 of user core.
Apr 24 23:34:57.577611 systemd[1]: Created slice user-500.slice - User Slice of UID 500.
Apr 24 23:34:57.584687 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500...
Apr 24 23:34:57.599992 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500.
Apr 24 23:34:57.608672 systemd[1]: Starting user@500.service - User Manager for UID 500...
Apr 24 23:34:57.619213 (systemd)[1636]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0)
Apr 24 23:34:57.737269 systemd[1636]: Queued start job for default target default.target.
Apr 24 23:34:57.752602 systemd[1636]: Created slice app.slice - User Application Slice.
Apr 24 23:34:57.752673 systemd[1636]: Reached target paths.target - Paths.
Apr 24 23:34:57.752703 systemd[1636]: Reached target timers.target - Timers.
Apr 24 23:34:57.757480 systemd[1636]: Starting dbus.socket - D-Bus User Message Bus Socket...
Apr 24 23:34:57.772791 systemd[1636]: Listening on dbus.socket - D-Bus User Message Bus Socket.
Apr 24 23:34:57.772921 systemd[1636]: Reached target sockets.target - Sockets.
Apr 24 23:34:57.772934 systemd[1636]: Reached target basic.target - Basic System.
Apr 24 23:34:57.772977 systemd[1636]: Reached target default.target - Main User Target.
Apr 24 23:34:57.773004 systemd[1636]: Startup finished in 145ms.
Apr 24 23:34:57.773369 systemd[1]: Started user@500.service - User Manager for UID 500.
Apr 24 23:34:57.785531 systemd[1]: Started session-1.scope - Session 1 of User core.
Apr 24 23:34:57.921359 systemd[1]: Started sshd@1-178.105.28.58:22-50.85.169.122:38614.service - OpenSSH per-connection server daemon (50.85.169.122:38614).
Apr 24 23:34:58.051269 sshd[1647]: Accepted publickey for core from 50.85.169.122 port 38614 ssh2: RSA SHA256:LBBtzjDnLNZXcnA2s4HQvRsYKvtzAwhHM1R/Z1hMC7M
Apr 24 23:34:58.052989 sshd[1647]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Apr 24 23:34:58.059301 systemd-logind[1455]: New session 2 of user core.
Apr 24 23:34:58.066512 systemd[1]: Started session-2.scope - Session 2 of User core.
Apr 24 23:34:58.170844 sshd[1647]: pam_unix(sshd:session): session closed for user core
Apr 24 23:34:58.175466 systemd[1]: sshd@1-178.105.28.58:22-50.85.169.122:38614.service: Deactivated successfully.
Apr 24 23:34:58.177592 systemd[1]: session-2.scope: Deactivated successfully.
Apr 24 23:34:58.178899 systemd-logind[1455]: Session 2 logged out. Waiting for processes to exit.
Apr 24 23:34:58.180286 systemd-logind[1455]: Removed session 2.
Apr 24 23:34:58.197931 systemd[1]: Started sshd@2-178.105.28.58:22-50.85.169.122:38624.service - OpenSSH per-connection server daemon (50.85.169.122:38624).
Apr 24 23:34:58.329282 sshd[1654]: Accepted publickey for core from 50.85.169.122 port 38624 ssh2: RSA SHA256:LBBtzjDnLNZXcnA2s4HQvRsYKvtzAwhHM1R/Z1hMC7M
Apr 24 23:34:58.331245 sshd[1654]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Apr 24 23:34:58.338868 systemd-logind[1455]: New session 3 of user core.
Apr 24 23:34:58.345422 systemd[1]: Started session-3.scope - Session 3 of User core.
Apr 24 23:34:58.444278 sshd[1654]: pam_unix(sshd:session): session closed for user core
Apr 24 23:34:58.451760 systemd[1]: sshd@2-178.105.28.58:22-50.85.169.122:38624.service: Deactivated successfully.
Apr 24 23:34:58.457111 systemd[1]: session-3.scope: Deactivated successfully.
Apr 24 23:34:58.458851 systemd-logind[1455]: Session 3 logged out. Waiting for processes to exit.
Apr 24 23:34:58.475040 systemd-logind[1455]: Removed session 3.
Apr 24 23:34:58.480876 systemd[1]: Started sshd@3-178.105.28.58:22-50.85.169.122:38626.service - OpenSSH per-connection server daemon (50.85.169.122:38626).
Apr 24 23:34:58.618736 sshd[1661]: Accepted publickey for core from 50.85.169.122 port 38626 ssh2: RSA SHA256:LBBtzjDnLNZXcnA2s4HQvRsYKvtzAwhHM1R/Z1hMC7M
Apr 24 23:34:58.621694 sshd[1661]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Apr 24 23:34:58.627552 systemd-logind[1455]: New session 4 of user core.
Apr 24 23:34:58.638491 systemd[1]: Started session-4.scope - Session 4 of User core.
Apr 24 23:34:58.742823 sshd[1661]: pam_unix(sshd:session): session closed for user core
Apr 24 23:34:58.749045 systemd[1]: sshd@3-178.105.28.58:22-50.85.169.122:38626.service: Deactivated successfully.
Apr 24 23:34:58.753218 systemd[1]: session-4.scope: Deactivated successfully.
Apr 24 23:34:58.753888 systemd-logind[1455]: Session 4 logged out. Waiting for processes to exit.
Apr 24 23:34:58.754977 systemd-logind[1455]: Removed session 4.
Apr 24 23:34:58.776612 systemd[1]: Started sshd@4-178.105.28.58:22-50.85.169.122:38632.service - OpenSSH per-connection server daemon (50.85.169.122:38632).
Apr 24 23:34:58.895804 sshd[1668]: Accepted publickey for core from 50.85.169.122 port 38632 ssh2: RSA SHA256:LBBtzjDnLNZXcnA2s4HQvRsYKvtzAwhHM1R/Z1hMC7M
Apr 24 23:34:58.897059 sshd[1668]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Apr 24 23:34:58.903670 systemd-logind[1455]: New session 5 of user core.
Apr 24 23:34:58.909518 systemd[1]: Started session-5.scope - Session 5 of User core.
Apr 24 23:34:59.003419 sudo[1671]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1
Apr 24 23:34:59.003708 sudo[1671]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Apr 24 23:34:59.019807 sudo[1671]: pam_unix(sudo:session): session closed for user root
Apr 24 23:34:59.036896 sshd[1668]: pam_unix(sshd:session): session closed for user core
Apr 24 23:34:59.045036 systemd[1]: sshd@4-178.105.28.58:22-50.85.169.122:38632.service: Deactivated successfully.
Apr 24 23:34:59.047037 systemd[1]: session-5.scope: Deactivated successfully.
Apr 24 23:34:59.050532 systemd-logind[1455]: Session 5 logged out. Waiting for processes to exit.
Apr 24 23:34:59.052304 systemd-logind[1455]: Removed session 5.
Apr 24 23:34:59.067439 systemd[1]: Started sshd@5-178.105.28.58:22-50.85.169.122:38634.service - OpenSSH per-connection server daemon (50.85.169.122:38634).
Apr 24 23:34:59.206206 sshd[1676]: Accepted publickey for core from 50.85.169.122 port 38634 ssh2: RSA SHA256:LBBtzjDnLNZXcnA2s4HQvRsYKvtzAwhHM1R/Z1hMC7M
Apr 24 23:34:59.208534 sshd[1676]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Apr 24 23:34:59.216116 systemd-logind[1455]: New session 6 of user core.
Apr 24 23:34:59.222038 systemd[1]: Started session-6.scope - Session 6 of User core.
Apr 24 23:34:59.311400 sudo[1680]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/bin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules
Apr 24 23:34:59.311859 sudo[1680]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Apr 24 23:34:59.315849 sudo[1680]: pam_unix(sudo:session): session closed for user root
Apr 24 23:34:59.321979 sudo[1679]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/bin/systemctl restart audit-rules
Apr 24 23:34:59.322306 sudo[1679]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Apr 24 23:34:59.344346 systemd[1]: Stopping audit-rules.service - Load Security Auditing Rules...
Apr 24 23:34:59.346189 auditctl[1683]: No rules
Apr 24 23:34:59.346785 systemd[1]: audit-rules.service: Deactivated successfully.
Apr 24 23:34:59.347017 systemd[1]: Stopped audit-rules.service - Load Security Auditing Rules.
Apr 24 23:34:59.350432 systemd[1]: Starting audit-rules.service - Load Security Auditing Rules...
Apr 24 23:34:59.393525 augenrules[1701]: No rules
Apr 24 23:34:59.395394 systemd[1]: Finished audit-rules.service - Load Security Auditing Rules.
Apr 24 23:34:59.396973 sudo[1679]: pam_unix(sudo:session): session closed for user root
Apr 24 23:34:59.414557 sshd[1676]: pam_unix(sshd:session): session closed for user core
Apr 24 23:34:59.421320 systemd[1]: sshd@5-178.105.28.58:22-50.85.169.122:38634.service: Deactivated successfully.
Apr 24 23:34:59.425492 systemd[1]: session-6.scope: Deactivated successfully.
Apr 24 23:34:59.427788 systemd-logind[1455]: Session 6 logged out. Waiting for processes to exit.
Apr 24 23:34:59.429243 systemd-logind[1455]: Removed session 6.
Apr 24 23:34:59.450633 systemd[1]: Started sshd@6-178.105.28.58:22-50.85.169.122:39168.service - OpenSSH per-connection server daemon (50.85.169.122:39168).
Apr 24 23:34:59.577430 sshd[1709]: Accepted publickey for core from 50.85.169.122 port 39168 ssh2: RSA SHA256:LBBtzjDnLNZXcnA2s4HQvRsYKvtzAwhHM1R/Z1hMC7M
Apr 24 23:34:59.582843 sshd[1709]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Apr 24 23:34:59.593993 systemd-logind[1455]: New session 7 of user core.
Apr 24 23:34:59.597699 systemd[1]: Started session-7.scope - Session 7 of User core.
Apr 24 23:34:59.686085 sudo[1712]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh
Apr 24 23:34:59.686401 sudo[1712]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Apr 24 23:34:59.998627 systemd[1]: Starting docker.service - Docker Application Container Engine...
Apr 24 23:35:00.000522 (dockerd)[1727]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU
Apr 24 23:35:00.259231 dockerd[1727]: time="2026-04-24T23:35:00.259037397Z" level=info msg="Starting up"
Apr 24 23:35:00.354002 dockerd[1727]: time="2026-04-24T23:35:00.353707949Z" level=info msg="Loading containers: start."
Apr 24 23:35:00.452175 kernel: Initializing XFRM netlink socket
Apr 24 23:35:00.538811 systemd-networkd[1386]: docker0: Link UP
Apr 24 23:35:00.562210 dockerd[1727]: time="2026-04-24T23:35:00.562104746Z" level=info msg="Loading containers: done."
Apr 24 23:35:00.576418 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck2018300284-merged.mount: Deactivated successfully.
Apr 24 23:35:00.579938 dockerd[1727]: time="2026-04-24T23:35:00.579862711Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2
Apr 24 23:35:00.580040 dockerd[1727]: time="2026-04-24T23:35:00.580004190Z" level=info msg="Docker daemon" commit=061aa95809be396a6b5542618d8a34b02a21ff77 containerd-snapshotter=false storage-driver=overlay2 version=26.1.0
Apr 24 23:35:00.580253 dockerd[1727]: time="2026-04-24T23:35:00.580189574Z" level=info msg="Daemon has completed initialization"
Apr 24 23:35:00.634448 dockerd[1727]: time="2026-04-24T23:35:00.634182811Z" level=info msg="API listen on /run/docker.sock"
Apr 24 23:35:00.634807 systemd[1]: Started docker.service - Docker Application Container Engine.
Apr 24 23:35:01.114902 containerd[1462]: time="2026-04-24T23:35:01.114711673Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.35.4\""
Apr 24 23:35:01.705657 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1305199811.mount: Deactivated successfully.
Apr 24 23:35:02.878142 containerd[1462]: time="2026-04-24T23:35:02.877999595Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.35.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 24 23:35:02.880450 containerd[1462]: time="2026-04-24T23:35:02.880367995Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.35.4: active requests=0, bytes read=24608883"
Apr 24 23:35:02.883292 containerd[1462]: time="2026-04-24T23:35:02.881693807Z" level=info msg="ImageCreate event name:\"sha256:09c946ff1743c56c0d49ef90ba95500741e0534f2f590ec98c924e4673ee3096\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 24 23:35:02.886611 containerd[1462]: time="2026-04-24T23:35:02.886547483Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:06b4bb208634a107ab9e6c50cdb9df178d05166a700c0cc448d59522091074b5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 24 23:35:02.888842 containerd[1462]: time="2026-04-24T23:35:02.888789720Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.35.4\" with image id \"sha256:09c946ff1743c56c0d49ef90ba95500741e0534f2f590ec98c924e4673ee3096\", repo tag \"registry.k8s.io/kube-apiserver:v1.35.4\", repo digest \"registry.k8s.io/kube-apiserver@sha256:06b4bb208634a107ab9e6c50cdb9df178d05166a700c0cc448d59522091074b5\", size \"24605384\" in 1.774028928s"
Apr 24 23:35:02.889024 containerd[1462]: time="2026-04-24T23:35:02.889003884Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.35.4\" returns image reference \"sha256:09c946ff1743c56c0d49ef90ba95500741e0534f2f590ec98c924e4673ee3096\""
Apr 24 23:35:02.889896 containerd[1462]: time="2026-04-24T23:35:02.889871846Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.35.4\""
Apr 24 23:35:04.063211 containerd[1462]: time="2026-04-24T23:35:04.063136822Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.35.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 24 23:35:04.065207 containerd[1462]: time="2026-04-24T23:35:04.065160359Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.35.4: active requests=0, bytes read=19073314"
Apr 24 23:35:04.065977 containerd[1462]: time="2026-04-24T23:35:04.065677828Z" level=info msg="ImageCreate event name:\"sha256:95ce7d322e267614405a2a0eccfc0a1bdf5664dd9ab089bdfa9ae74d5ccb05a7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 24 23:35:04.071161 containerd[1462]: time="2026-04-24T23:35:04.069296486Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:7b036c805d57f203e9efaf43672cff6019b9083a9c0eb107ea8500eace29d8fd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 24 23:35:04.071161 containerd[1462]: time="2026-04-24T23:35:04.071048716Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.35.4\" with image id \"sha256:95ce7d322e267614405a2a0eccfc0a1bdf5664dd9ab089bdfa9ae74d5ccb05a7\", repo tag \"registry.k8s.io/kube-controller-manager:v1.35.4\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:7b036c805d57f203e9efaf43672cff6019b9083a9c0eb107ea8500eace29d8fd\", size \"20579933\" in 1.181014029s"
Apr 24 23:35:04.071161 containerd[1462]: time="2026-04-24T23:35:04.071096869Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.35.4\" returns image reference \"sha256:95ce7d322e267614405a2a0eccfc0a1bdf5664dd9ab089bdfa9ae74d5ccb05a7\""
Apr 24 23:35:04.072383 containerd[1462]: time="2026-04-24T23:35:04.072350488Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.35.4\""
Apr 24 23:35:04.901164 containerd[1462]: time="2026-04-24T23:35:04.899466069Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.35.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 24 23:35:04.901487 containerd[1462]: time="2026-04-24T23:35:04.901414980Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.35.4: active requests=0, bytes read=13800856"
Apr 24 23:35:04.902757 containerd[1462]: time="2026-04-24T23:35:04.902712158Z" level=info msg="ImageCreate event name:\"sha256:77d7d4cb9aa826105b6410a50df1dda7462ec663ced995347d8c171b04b0ee81\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 24 23:35:04.907671 containerd[1462]: time="2026-04-24T23:35:04.907628276Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:9054fecb4fa04cc63aec47b0913c8deb3487d414190cd15211f864cfe0d0b4d6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 24 23:35:04.908994 containerd[1462]: time="2026-04-24T23:35:04.908948785Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.35.4\" with image id \"sha256:77d7d4cb9aa826105b6410a50df1dda7462ec663ced995347d8c171b04b0ee81\", repo tag \"registry.k8s.io/kube-scheduler:v1.35.4\", repo digest \"registry.k8s.io/kube-scheduler@sha256:9054fecb4fa04cc63aec47b0913c8deb3487d414190cd15211f864cfe0d0b4d6\", size \"15307493\" in 836.559332ms"
Apr 24 23:35:04.908994 containerd[1462]: time="2026-04-24T23:35:04.908988798Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.35.4\" returns image reference \"sha256:77d7d4cb9aa826105b6410a50df1dda7462ec663ced995347d8c171b04b0ee81\""
Apr 24 23:35:04.909499 containerd[1462]: time="2026-04-24T23:35:04.909470284Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.35.4\""
Apr 24 23:35:04.914947 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4.
Apr 24 23:35:04.924959 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Apr 24 23:35:05.058624 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Apr 24 23:35:05.070567 (kubelet)[1942]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Apr 24 23:35:05.130477 kubelet[1942]: E0424 23:35:05.130421 1942 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Apr 24 23:35:05.133393 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Apr 24 23:35:05.133541 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Apr 24 23:35:05.825450 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1995635015.mount: Deactivated successfully.
Apr 24 23:35:06.062014 containerd[1462]: time="2026-04-24T23:35:06.061206703Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.35.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 24 23:35:06.063680 containerd[1462]: time="2026-04-24T23:35:06.063637716Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.35.4: active requests=0, bytes read=22340610"
Apr 24 23:35:06.065347 containerd[1462]: time="2026-04-24T23:35:06.065311355Z" level=info msg="ImageCreate event name:\"sha256:8c75fb69e773da539298848d12a0a12029818ee910a62f2abd68aa1a5805991c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 24 23:35:06.075986 containerd[1462]: time="2026-04-24T23:35:06.074407276Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:c5daa23c72474e5e4062c320177d3b485fd42e7010f052bc80d657c4c00a0672\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 24 23:35:06.084270 containerd[1462]: time="2026-04-24T23:35:06.080183233Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.35.4\" with image id \"sha256:8c75fb69e773da539298848d12a0a12029818ee910a62f2abd68aa1a5805991c\", repo tag \"registry.k8s.io/kube-proxy:v1.35.4\", repo digest \"registry.k8s.io/kube-proxy@sha256:c5daa23c72474e5e4062c320177d3b485fd42e7010f052bc80d657c4c00a0672\", size \"22339603\" in 1.170669626s"
Apr 24 23:35:06.084270 containerd[1462]: time="2026-04-24T23:35:06.080235205Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.35.4\" returns image reference \"sha256:8c75fb69e773da539298848d12a0a12029818ee910a62f2abd68aa1a5805991c\""
Apr 24 23:35:06.084270 containerd[1462]: time="2026-04-24T23:35:06.083253756Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.13.1\""
Apr 24 23:35:06.620403 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1203791576.mount: Deactivated successfully.
Apr 24 23:35:06.835859 update_engine[1457]: I20260424 23:35:06.835222 1457 update_attempter.cc:509] Updating boot flags...
Apr 24 23:35:06.913166 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 36 scanned by (udev-worker) (1978)
Apr 24 23:35:06.979183 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 36 scanned by (udev-worker) (1982)
Apr 24 23:35:07.659516 containerd[1462]: time="2026-04-24T23:35:07.659437222Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.13.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 24 23:35:07.661408 containerd[1462]: time="2026-04-24T23:35:07.661350711Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.13.1: active requests=0, bytes read=21172309"
Apr 24 23:35:07.662406 containerd[1462]: time="2026-04-24T23:35:07.661736167Z" level=info msg="ImageCreate event name:\"sha256:e08f4d9d2e6ede8185064c13b41f8eeee95b609c0ca93b6fe7509fe527c907cf\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 24 23:35:07.667451 containerd[1462]: time="2026-04-24T23:35:07.667384984Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9b9128672209474da07c91439bf15ed704ae05ad918dd6454e5b6ae14e35fee6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 24 23:35:07.669860 containerd[1462]: time="2026-04-24T23:35:07.669767700Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.13.1\" with image id \"sha256:e08f4d9d2e6ede8185064c13b41f8eeee95b609c0ca93b6fe7509fe527c907cf\", repo tag \"registry.k8s.io/coredns/coredns:v1.13.1\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9b9128672209474da07c91439bf15ed704ae05ad918dd6454e5b6ae14e35fee6\", size \"21168808\" in 1.586434589s"
Apr 24 23:35:07.669860 containerd[1462]: time="2026-04-24T23:35:07.669849462Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.13.1\" returns image reference \"sha256:e08f4d9d2e6ede8185064c13b41f8eeee95b609c0ca93b6fe7509fe527c907cf\""
Apr 24 23:35:07.670553 containerd[1462]: time="2026-04-24T23:35:07.670395990Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10.1\""
Apr 24 23:35:08.102975 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2414073960.mount: Deactivated successfully.
Apr 24 23:35:08.114170 containerd[1462]: time="2026-04-24T23:35:08.113844193Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 24 23:35:08.115634 containerd[1462]: time="2026-04-24T23:35:08.115589748Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10.1: active requests=0, bytes read=268729"
Apr 24 23:35:08.117801 containerd[1462]: time="2026-04-24T23:35:08.117740206Z" level=info msg="ImageCreate event name:\"sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 24 23:35:08.122234 containerd[1462]: time="2026-04-24T23:35:08.121337012Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 24 23:35:08.122234 containerd[1462]: time="2026-04-24T23:35:08.122064343Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10.1\" with image id \"sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd\", repo tag \"registry.k8s.io/pause:3.10.1\", repo digest \"registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c\", size \"267939\" in 451.631579ms"
Apr 24 23:35:08.122234 containerd[1462]: time="2026-04-24T23:35:08.122101624Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10.1\" returns image reference \"sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd\""
Apr 24 23:35:08.123302 containerd[1462]: time="2026-04-24T23:35:08.123270456Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.6.6-0\""
Apr 24 23:35:08.620582 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3876585208.mount: Deactivated successfully.
Apr 24 23:35:09.331921 containerd[1462]: time="2026-04-24T23:35:09.331838268Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.6.6-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 24 23:35:09.334371 containerd[1462]: time="2026-04-24T23:35:09.334078873Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.6.6-0: active requests=0, bytes read=21752394"
Apr 24 23:35:09.335935 containerd[1462]: time="2026-04-24T23:35:09.335341914Z" level=info msg="ImageCreate event name:\"sha256:271e49a0ebc56647476845128fcd2a73bb138beeca3878cc3bf52b4ff1172a57\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 24 23:35:09.339324 containerd[1462]: time="2026-04-24T23:35:09.339276383Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:60a30b5d81b2217555e2cfb9537f655b7ba97220b99c39ee2e162a7127225890\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Apr 24 23:35:09.340564 containerd[1462]: time="2026-04-24T23:35:09.340521918Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.6.6-0\" with image id \"sha256:271e49a0ebc56647476845128fcd2a73bb138beeca3878cc3bf52b4ff1172a57\", repo tag \"registry.k8s.io/etcd:3.6.6-0\", repo digest \"registry.k8s.io/etcd@sha256:60a30b5d81b2217555e2cfb9537f655b7ba97220b99c39ee2e162a7127225890\", size \"21749640\" in 1.217106847s"
Apr 24 23:35:09.340714 containerd[1462]: time="2026-04-24T23:35:09.340695051Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.6.6-0\" returns image reference \"sha256:271e49a0ebc56647476845128fcd2a73bb138beeca3878cc3bf52b4ff1172a57\""
Apr 24 23:35:12.866864 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Apr 24 23:35:12.873776 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Apr 24 23:35:12.910685 systemd[1]: Reloading requested from client PID 2122 ('systemctl') (unit session-7.scope)...
Apr 24 23:35:12.910702 systemd[1]: Reloading...
Apr 24 23:35:13.055151 zram_generator::config[2174]: No configuration found.
Apr 24 23:35:13.141518 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Apr 24 23:35:13.212397 systemd[1]: Reloading finished in 301 ms.
Apr 24 23:35:13.274505 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM
Apr 24 23:35:13.274877 systemd[1]: kubelet.service: Failed with result 'signal'.
Apr 24 23:35:13.275767 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Apr 24 23:35:13.282535 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Apr 24 23:35:13.442191 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Apr 24 23:35:13.454519 (kubelet)[2210]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Apr 24 23:35:13.495451 kubelet[2210]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 23:35:13.718000 kubelet[2210]: I0424 23:35:13.717776 2210 server.go:525] "Kubelet version" kubeletVersion="v1.35.1"
Apr 24 23:35:13.718000 kubelet[2210]: I0424 23:35:13.717869 2210 server.go:527] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 24 23:35:13.718000 kubelet[2210]: I0424 23:35:13.717966 2210 watchdog_linux.go:95] "Systemd watchdog is not enabled"
Apr 24 23:35:13.718000 kubelet[2210]: I0424 23:35:13.717985 2210 watchdog_linux.go:138] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 24 23:35:13.718575 kubelet[2210]: I0424 23:35:13.718488 2210 server.go:951] "Client rotation is on, will bootstrap in background"
Apr 24 23:35:13.728188 kubelet[2210]: E0424 23:35:13.728106 2210 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://178.105.28.58:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 178.105.28.58:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError"
Apr 24 23:35:13.730166 kubelet[2210]: I0424 23:35:13.729456 2210 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Apr 24 23:35:13.734663 kubelet[2210]: E0424 23:35:13.734614 2210 log.go:32] "RuntimeConfig from runtime service failed" err="rpc error: code = Unimplemented desc = unknown method RuntimeConfig for service runtime.v1.RuntimeService"
Apr 24 23:35:13.734890 kubelet[2210]: I0424 23:35:13.734874 2210 server.go:1395] "CRI implementation should be updated to support RuntimeConfig. Falling back to using cgroupDriver from kubelet config."
Apr 24 23:35:13.737633 kubelet[2210]: I0424 23:35:13.737604 2210 server.go:775] "--cgroups-per-qos enabled, but --cgroup-root was not specified. Defaulting to /"
Apr 24 23:35:13.738764 kubelet[2210]: I0424 23:35:13.738723 2210 container_manager_linux.go:272] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 24 23:35:13.739085 kubelet[2210]: I0424 23:35:13.738874 2210 container_manager_linux.go:277] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4081-3-6-n-0494d1f24d","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 24 23:35:13.739255 kubelet[2210]: I0424 23:35:13.739240 2210 topology_manager.go:143] "Creating topology manager with none policy"
Apr 24 23:35:13.739311 kubelet[2210]: I0424 23:35:13.739303 2210 container_manager_linux.go:308] "Creating device plugin manager"
Apr 24 23:35:13.739470 kubelet[2210]: I0424 23:35:13.739458 2210 container_manager_linux.go:317] "Creating Dynamic Resource Allocation (DRA) manager"
Apr 24 23:35:13.741229 kubelet[2210]: I0424 23:35:13.741208 2210 state_mem.go:41] "Initialized" logger="CPUManager state memory"
Apr 24 23:35:13.741600 kubelet[2210]: I0424 23:35:13.741586 2210 kubelet.go:482] "Attempting to sync node with API server"
Apr 24 23:35:13.741699 kubelet[2210]: I0424 23:35:13.741663 2210 kubelet.go:383] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 24 23:35:13.741699 kubelet[2210]: I0424 23:35:13.741690 2210 kubelet.go:394] "Adding apiserver pod source"
Apr 24 23:35:13.741828 kubelet[2210]: I0424 23:35:13.741767 2210 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 24 23:35:13.746136 kubelet[2210]: I0424 23:35:13.745757 2210 kuberuntime_manager.go:294] "Container runtime initialized" containerRuntime="containerd" version="v1.7.21" apiVersion="v1"
Apr 24 23:35:13.747374 kubelet[2210]: I0424 23:35:13.747350 2210 kubelet.go:943] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 24 23:35:13.747484 kubelet[2210]: I0424 23:35:13.747474 2210 kubelet.go:970] "Not starting PodCertificateRequest manager because we are in static kubelet mode or the PodCertificateProjection feature gate is disabled"
Apr 24 23:35:13.748131 kubelet[2210]: W0424 23:35:13.747579 2210 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating.
Apr 24 23:35:13.750854 kubelet[2210]: I0424 23:35:13.750836 2210 server.go:1257] "Started kubelet"
Apr 24 23:35:13.751456 kubelet[2210]: I0424 23:35:13.751319 2210 server.go:182] "Starting to listen" address="0.0.0.0" port=10250
Apr 24 23:35:13.752605 kubelet[2210]: I0424 23:35:13.752389 2210 server.go:317] "Adding debug handlers to kubelet server"
Apr 24 23:35:13.756141 kubelet[2210]: I0424 23:35:13.755674 2210 ratelimit.go:56] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 24 23:35:13.756141 kubelet[2210]: I0424 23:35:13.755748 2210 server_v1.go:49] "podresources" method="list" useActivePods=true
Apr 24 23:35:13.756141 kubelet[2210]: I0424 23:35:13.756015 2210 server.go:254] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 24 23:35:13.757648 kubelet[2210]: E0424 23:35:13.756409 2210 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://178.105.28.58:6443/api/v1/namespaces/default/events\": dial tcp 178.105.28.58:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4081-3-6-n-0494d1f24d.18a96f2097cd0d0b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4081-3-6-n-0494d1f24d,UID:,APIVersion:v1,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4081-3-6-n-0494d1f24d,},FirstTimestamp:2026-04-24 23:35:13.750805771 +0000 UTC m=+0.291867865,LastTimestamp:2026-04-24 23:35:13.750805771 +0000 UTC m=+0.291867865,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4081-3-6-n-0494d1f24d,}"
Apr 24 23:35:13.758717 kubelet[2210]: I0424 23:35:13.758686 2210 fs_resource_analyzer.go:69] "Starting FS ResourceAnalyzer"
Apr 24 23:35:13.759393 kubelet[2210]: I0424 23:35:13.759365 2210 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Apr 24 23:35:13.763218 kubelet[2210]: E0424 23:35:13.762439 2210 kubelet.go:1656] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Apr 24 23:35:13.763218 kubelet[2210]: E0424 23:35:13.762606 2210 kubelet_node_status.go:392] "Error getting the current node from lister" err="node \"ci-4081-3-6-n-0494d1f24d\" not found"
Apr 24 23:35:13.763218 kubelet[2210]: I0424 23:35:13.762630 2210 volume_manager.go:311] "Starting Kubelet Volume Manager"
Apr 24 23:35:13.763218 kubelet[2210]: I0424 23:35:13.762823 2210 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Apr 24 23:35:13.763218 kubelet[2210]: I0424 23:35:13.762874 2210 reconciler.go:29] "Reconciler: start to sync state"
Apr 24 23:35:13.765480 kubelet[2210]: I0424 23:35:13.765458 2210 factory.go:223] Registration of the containerd container factory successfully
Apr 24 23:35:13.765599 kubelet[2210]: I0424 23:35:13.765590 2210 factory.go:223] Registration of the systemd container factory successfully
Apr 24 23:35:13.765737 kubelet[2210]: I0424 23:35:13.765721 2210 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Apr 24 23:35:13.786477 kubelet[2210]: I0424 23:35:13.786420 2210 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv4"
Apr 24 23:35:13.787743 kubelet[2210]: I0424 23:35:13.787719 2210 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv6"
Apr 24 23:35:13.787866 kubelet[2210]: I0424 23:35:13.787853 2210 status_manager.go:249] "Starting to sync pod status with apiserver"
Apr 24 23:35:13.787947 kubelet[2210]: I0424 23:35:13.787937 2210 kubelet.go:2501] "Starting kubelet main sync loop"
Apr 24 23:35:13.788060 kubelet[2210]: E0424 23:35:13.788039 2210 kubelet.go:2525] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Apr 24 23:35:13.801282 kubelet[2210]: E0424 23:35:13.801245 2210 controller.go:201] "Failed to ensure lease exists, will retry" err="Get \"https://178.105.28.58:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4081-3-6-n-0494d1f24d?timeout=10s\": dial tcp 178.105.28.58:6443: connect: connection refused" interval="200ms"
Apr 24 23:35:13.803149 kubelet[2210]: I0424 23:35:13.802526 2210 cpu_manager.go:225] "Starting" policy="none"
Apr 24 23:35:13.803149 kubelet[2210]: I0424 23:35:13.802542 2210 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s"
Apr 24 23:35:13.803149 kubelet[2210]: I0424 23:35:13.802560 2210 state_mem.go:41] "Initialized" logger="CPUManager state checkpoint.CPUManager state memory"
Apr 24 23:35:13.805846 kubelet[2210]: I0424 23:35:13.805813 2210 policy_none.go:50] "Start"
Apr 24 23:35:13.805846 kubelet[2210]: I0424 23:35:13.805840 2210 memory_manager.go:187] "Starting memorymanager" policy="None"
Apr 24 23:35:13.805846 kubelet[2210]: I0424 23:35:13.805854 2210 state_mem.go:36] "Initializing new in-memory state store" logger="Memory Manager state checkpoint"
Apr 24 23:35:13.807489 kubelet[2210]: I0424 23:35:13.807465 2210 policy_none.go:44] "Start"
Apr 24 23:35:13.811421 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice.
Apr 24 23:35:13.828208 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice.
Apr 24 23:35:13.832082 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice.
Apr 24 23:35:13.845300 kubelet[2210]: E0424 23:35:13.845249 2210 manager.go:525] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 24 23:35:13.845601 kubelet[2210]: I0424 23:35:13.845574 2210 eviction_manager.go:194] "Eviction manager: starting control loop"
Apr 24 23:35:13.845697 kubelet[2210]: I0424 23:35:13.845603 2210 container_log_manager.go:146] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 24 23:35:13.847456 kubelet[2210]: I0424 23:35:13.846588 2210 plugin_manager.go:121] "Starting Kubelet Plugin Manager"
Apr 24 23:35:13.848918 kubelet[2210]: E0424 23:35:13.848838 2210 eviction_manager.go:272] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime"
Apr 24 23:35:13.848918 kubelet[2210]: E0424 23:35:13.848887 2210 eviction_manager.go:297] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4081-3-6-n-0494d1f24d\" not found"
Apr 24 23:35:13.905746 systemd[1]: Created slice kubepods-burstable-podd1482e8f637c7e8785ce3d7f8596b313.slice - libcontainer container kubepods-burstable-podd1482e8f637c7e8785ce3d7f8596b313.slice.
Apr 24 23:35:13.915788 kubelet[2210]: E0424 23:35:13.915716 2210 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081-3-6-n-0494d1f24d\" not found" node="ci-4081-3-6-n-0494d1f24d"
Apr 24 23:35:13.919969 systemd[1]: Created slice kubepods-burstable-pod11b5bcd918dbbbfa727a9b00c6d1f792.slice - libcontainer container kubepods-burstable-pod11b5bcd918dbbbfa727a9b00c6d1f792.slice.
Apr 24 23:35:13.930073 kubelet[2210]: E0424 23:35:13.929985 2210 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081-3-6-n-0494d1f24d\" not found" node="ci-4081-3-6-n-0494d1f24d"
Apr 24 23:35:13.934422 systemd[1]: Created slice kubepods-burstable-pod576488f2b4b68f1ff6fe63eb563ef59c.slice - libcontainer container kubepods-burstable-pod576488f2b4b68f1ff6fe63eb563ef59c.slice.
Apr 24 23:35:13.936507 kubelet[2210]: E0424 23:35:13.936458 2210 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081-3-6-n-0494d1f24d\" not found" node="ci-4081-3-6-n-0494d1f24d"
Apr 24 23:35:13.949078 kubelet[2210]: I0424 23:35:13.949019 2210 kubelet_node_status.go:74] "Attempting to register node" node="ci-4081-3-6-n-0494d1f24d"
Apr 24 23:35:13.949800 kubelet[2210]: E0424 23:35:13.949659 2210 kubelet_node_status.go:106] "Unable to register node with API server" err="Post \"https://178.105.28.58:6443/api/v1/nodes\": dial tcp 178.105.28.58:6443: connect: connection refused" node="ci-4081-3-6-n-0494d1f24d"
Apr 24 23:35:13.964693 kubelet[2210]: I0424 23:35:13.964397 2210 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/d1482e8f637c7e8785ce3d7f8596b313-ca-certs\") pod \"kube-apiserver-ci-4081-3-6-n-0494d1f24d\" (UID: \"d1482e8f637c7e8785ce3d7f8596b313\") " pod="kube-system/kube-apiserver-ci-4081-3-6-n-0494d1f24d"
Apr 24 23:35:13.964693 kubelet[2210]: I0424 23:35:13.964495 2210 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/11b5bcd918dbbbfa727a9b00c6d1f792-ca-certs\") pod \"kube-controller-manager-ci-4081-3-6-n-0494d1f24d\" (UID: \"11b5bcd918dbbbfa727a9b00c6d1f792\") " pod="kube-system/kube-controller-manager-ci-4081-3-6-n-0494d1f24d"
Apr 24 23:35:13.964693 kubelet[2210]: I0424 23:35:13.964555 2210 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/11b5bcd918dbbbfa727a9b00c6d1f792-flexvolume-dir\") pod \"kube-controller-manager-ci-4081-3-6-n-0494d1f24d\" (UID: \"11b5bcd918dbbbfa727a9b00c6d1f792\") " pod="kube-system/kube-controller-manager-ci-4081-3-6-n-0494d1f24d"
Apr 24 23:35:13.964693 kubelet[2210]: I0424 23:35:13.964607 2210 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/11b5bcd918dbbbfa727a9b00c6d1f792-kubeconfig\") pod \"kube-controller-manager-ci-4081-3-6-n-0494d1f24d\" (UID: \"11b5bcd918dbbbfa727a9b00c6d1f792\") " pod="kube-system/kube-controller-manager-ci-4081-3-6-n-0494d1f24d"
Apr 24 23:35:13.964693 kubelet[2210]: I0424 23:35:13.964643 2210 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/11b5bcd918dbbbfa727a9b00c6d1f792-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4081-3-6-n-0494d1f24d\" (UID: \"11b5bcd918dbbbfa727a9b00c6d1f792\") " pod="kube-system/kube-controller-manager-ci-4081-3-6-n-0494d1f24d"
Apr 24 23:35:13.965171 kubelet[2210]: I0424 23:35:13.964702 2210 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/d1482e8f637c7e8785ce3d7f8596b313-k8s-certs\") pod \"kube-apiserver-ci-4081-3-6-n-0494d1f24d\" (UID: \"d1482e8f637c7e8785ce3d7f8596b313\") " pod="kube-system/kube-apiserver-ci-4081-3-6-n-0494d1f24d"
Apr 24 23:35:13.965171 kubelet[2210]: I0424 23:35:13.964755 2210 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/d1482e8f637c7e8785ce3d7f8596b313-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4081-3-6-n-0494d1f24d\" (UID: \"d1482e8f637c7e8785ce3d7f8596b313\") " pod="kube-system/kube-apiserver-ci-4081-3-6-n-0494d1f24d"
Apr 24 23:35:13.965171 kubelet[2210]: I0424 23:35:13.964789 2210 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/11b5bcd918dbbbfa727a9b00c6d1f792-k8s-certs\") pod \"kube-controller-manager-ci-4081-3-6-n-0494d1f24d\" (UID: \"11b5bcd918dbbbfa727a9b00c6d1f792\") " pod="kube-system/kube-controller-manager-ci-4081-3-6-n-0494d1f24d"
Apr 24 23:35:13.965171 kubelet[2210]: I0424 23:35:13.964836 2210 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/576488f2b4b68f1ff6fe63eb563ef59c-kubeconfig\") pod \"kube-scheduler-ci-4081-3-6-n-0494d1f24d\" (UID: \"576488f2b4b68f1ff6fe63eb563ef59c\") " pod="kube-system/kube-scheduler-ci-4081-3-6-n-0494d1f24d"
Apr 24 23:35:14.002305 kubelet[2210]: E0424 23:35:14.002057 2210 controller.go:201] "Failed to ensure lease exists, will retry" err="Get \"https://178.105.28.58:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4081-3-6-n-0494d1f24d?timeout=10s\": dial tcp 178.105.28.58:6443: connect: connection refused" interval="400ms"
Apr 24 23:35:14.152098 kubelet[2210]: I0424 23:35:14.152058 2210 kubelet_node_status.go:74] "Attempting to register node" node="ci-4081-3-6-n-0494d1f24d"
Apr 24 23:35:14.152586 kubelet[2210]: E0424 23:35:14.152551 2210 kubelet_node_status.go:106] "Unable to register node with API server" err="Post \"https://178.105.28.58:6443/api/v1/nodes\": dial tcp 178.105.28.58:6443: connect: connection refused" node="ci-4081-3-6-n-0494d1f24d"
Apr 24 23:35:14.221184 containerd[1462]: time="2026-04-24T23:35:14.220810657Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4081-3-6-n-0494d1f24d,Uid:d1482e8f637c7e8785ce3d7f8596b313,Namespace:kube-system,Attempt:0,}"
Apr 24 23:35:14.234354 containerd[1462]: time="2026-04-24T23:35:14.234293210Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4081-3-6-n-0494d1f24d,Uid:11b5bcd918dbbbfa727a9b00c6d1f792,Namespace:kube-system,Attempt:0,}"
Apr 24 23:35:14.239680 containerd[1462]: time="2026-04-24T23:35:14.239635237Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4081-3-6-n-0494d1f24d,Uid:576488f2b4b68f1ff6fe63eb563ef59c,Namespace:kube-system,Attempt:0,}"
Apr 24 23:35:14.403579 kubelet[2210]: E0424 23:35:14.403421 2210 controller.go:201] "Failed to ensure lease exists, will retry" err="Get \"https://178.105.28.58:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4081-3-6-n-0494d1f24d?timeout=10s\": dial tcp 178.105.28.58:6443: connect: connection refused" interval="800ms"
Apr 24 23:35:14.555728 kubelet[2210]: I0424 23:35:14.555675 2210 kubelet_node_status.go:74] "Attempting to register node" node="ci-4081-3-6-n-0494d1f24d"
Apr 24 23:35:14.556292 kubelet[2210]: E0424 23:35:14.556243 2210 kubelet_node_status.go:106] "Unable to register node with API server" err="Post \"https://178.105.28.58:6443/api/v1/nodes\": dial tcp 178.105.28.58:6443: connect: connection refused" node="ci-4081-3-6-n-0494d1f24d"
Apr 24 23:35:14.657926 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4012796135.mount: Deactivated successfully.
Apr 24 23:35:14.668393 containerd[1462]: time="2026-04-24T23:35:14.668317706Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Apr 24 23:35:14.671753 containerd[1462]: time="2026-04-24T23:35:14.671706709Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=269193"
Apr 24 23:35:14.672669 containerd[1462]: time="2026-04-24T23:35:14.672638474Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Apr 24 23:35:14.674605 containerd[1462]: time="2026-04-24T23:35:14.674538634Z" level=info msg="ImageCreate event name:\"sha256:4e42fb3c9d90ed7895bc04a9d96fe3102a65b521f485cc5a4f3dd818afef9cef\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Apr 24 23:35:14.676147 containerd[1462]: time="2026-04-24T23:35:14.675955378Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Apr 24 23:35:14.678457 containerd[1462]: time="2026-04-24T23:35:14.678277195Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0"
Apr 24 23:35:14.679959 containerd[1462]: time="2026-04-24T23:35:14.679905669Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0"
Apr 24 23:35:14.681245 containerd[1462]: time="2026-04-24T23:35:14.681185428Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Apr 24 23:35:14.683162 containerd[1462]: time="2026-04-24T23:35:14.682043450Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4e42fb3c9d90ed7895bc04a9d96fe3102a65b521f485cc5a4f3dd818afef9cef\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"268403\" in 461.144741ms"
Apr 24 23:35:14.686636 containerd[1462]: time="2026-04-24T23:35:14.686571339Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4e42fb3c9d90ed7895bc04a9d96fe3102a65b521f485cc5a4f3dd818afef9cef\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"268403\" in 446.821801ms"
Apr 24 23:35:14.698480 containerd[1462]: time="2026-04-24T23:35:14.698408345Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4e42fb3c9d90ed7895bc04a9d96fe3102a65b521f485cc5a4f3dd818afef9cef\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"268403\" in 463.852186ms"
Apr 24 23:35:14.835614 containerd[1462]: time="2026-04-24T23:35:14.835366444Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Apr 24 23:35:14.835614 containerd[1462]: time="2026-04-24T23:35:14.835435779Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Apr 24 23:35:14.835614 containerd[1462]: time="2026-04-24T23:35:14.835447040Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Apr 24 23:35:14.835614 containerd[1462]: time="2026-04-24T23:35:14.835532245Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Apr 24 23:35:14.837619 containerd[1462]: time="2026-04-24T23:35:14.837523943Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Apr 24 23:35:14.837619 containerd[1462]: time="2026-04-24T23:35:14.837586264Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Apr 24 23:35:14.837834 containerd[1462]: time="2026-04-24T23:35:14.837602856Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Apr 24 23:35:14.837980 containerd[1462]: time="2026-04-24T23:35:14.837943355Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Apr 24 23:35:14.842582 containerd[1462]: time="2026-04-24T23:35:14.842482306Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Apr 24 23:35:14.842582 containerd[1462]: time="2026-04-24T23:35:14.842544787Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Apr 24 23:35:14.842582 containerd[1462]: time="2026-04-24T23:35:14.842560818Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Apr 24 23:35:14.843110 containerd[1462]: time="2026-04-24T23:35:14.842893262Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Apr 24 23:35:14.876509 systemd[1]: Started cri-containerd-62f09dd1fde48adaaf896840594f640c0ecdead010dd86279368827653b2db18.scope - libcontainer container 62f09dd1fde48adaaf896840594f640c0ecdead010dd86279368827653b2db18.
Apr 24 23:35:14.880720 systemd[1]: Started cri-containerd-7f2cd455099493757acdd2189ec87a15b6a47785a6c46f435a36f4c41c110ca6.scope - libcontainer container 7f2cd455099493757acdd2189ec87a15b6a47785a6c46f435a36f4c41c110ca6.
Apr 24 23:35:14.883068 systemd[1]: Started cri-containerd-a3017b4c67a4feaaff5dc3d3c5f51d226ce35ca7fc9760a51b8d19bc989ad5e1.scope - libcontainer container a3017b4c67a4feaaff5dc3d3c5f51d226ce35ca7fc9760a51b8d19bc989ad5e1.
Apr 24 23:35:14.937914 containerd[1462]: time="2026-04-24T23:35:14.937877066Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4081-3-6-n-0494d1f24d,Uid:d1482e8f637c7e8785ce3d7f8596b313,Namespace:kube-system,Attempt:0,} returns sandbox id \"a3017b4c67a4feaaff5dc3d3c5f51d226ce35ca7fc9760a51b8d19bc989ad5e1\""
Apr 24 23:35:14.943028 containerd[1462]: time="2026-04-24T23:35:14.942987003Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4081-3-6-n-0494d1f24d,Uid:11b5bcd918dbbbfa727a9b00c6d1f792,Namespace:kube-system,Attempt:0,} returns sandbox id \"62f09dd1fde48adaaf896840594f640c0ecdead010dd86279368827653b2db18\""
Apr 24 23:35:14.945905 containerd[1462]: time="2026-04-24T23:35:14.945861450Z" level=info msg="CreateContainer within sandbox \"a3017b4c67a4feaaff5dc3d3c5f51d226ce35ca7fc9760a51b8d19bc989ad5e1\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}"
Apr 24 23:35:14.950531 containerd[1462]: time="2026-04-24T23:35:14.950297642Z" level=info msg="CreateContainer within sandbox \"62f09dd1fde48adaaf896840594f640c0ecdead010dd86279368827653b2db18\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}"
Apr 24 23:35:14.964493 containerd[1462]: time="2026-04-24T23:35:14.964452818Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4081-3-6-n-0494d1f24d,Uid:576488f2b4b68f1ff6fe63eb563ef59c,Namespace:kube-system,Attempt:0,} returns sandbox id \"7f2cd455099493757acdd2189ec87a15b6a47785a6c46f435a36f4c41c110ca6\""
Apr 24 23:35:14.972765 containerd[1462]: time="2026-04-24T23:35:14.972716062Z" level=info msg="CreateContainer within sandbox \"7f2cd455099493757acdd2189ec87a15b6a47785a6c46f435a36f4c41c110ca6\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}"
Apr 24 23:35:14.983385 containerd[1462]: time="2026-04-24T23:35:14.983324128Z" level=info msg="CreateContainer within sandbox \"a3017b4c67a4feaaff5dc3d3c5f51d226ce35ca7fc9760a51b8d19bc989ad5e1\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"bb0e0f068aaff19f6bf6507d1185514f04c184160325e0d5609d00c7058ecff6\""
Apr 24 23:35:14.984212 containerd[1462]: time="2026-04-24T23:35:14.984180827Z" level=info msg="StartContainer for \"bb0e0f068aaff19f6bf6507d1185514f04c184160325e0d5609d00c7058ecff6\""
Apr 24 23:35:14.990428 containerd[1462]: time="2026-04-24T23:35:14.990087948Z" level=info msg="CreateContainer within sandbox \"62f09dd1fde48adaaf896840594f640c0ecdead010dd86279368827653b2db18\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"e0c59f5b1a040bf8a99331654be53d146aa3db40a01642ddcbff39d35b40c31f\""
Apr 24 23:35:14.991568 containerd[1462]: time="2026-04-24T23:35:14.991388547Z" level=info msg="StartContainer for \"e0c59f5b1a040bf8a99331654be53d146aa3db40a01642ddcbff39d35b40c31f\""
Apr 24 23:35:15.002429 containerd[1462]: time="2026-04-24T23:35:15.002108229Z" level=info msg="CreateContainer within sandbox \"7f2cd455099493757acdd2189ec87a15b6a47785a6c46f435a36f4c41c110ca6\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"9944747f15067a6db65a86f0c00c3b283aff936a710a0c0406d093b735fe4abc\""
Apr 24 23:35:15.003158 containerd[1462]: time="2026-04-24T23:35:15.003109165Z" level=info msg="StartContainer for \"9944747f15067a6db65a86f0c00c3b283aff936a710a0c0406d093b735fe4abc\""
Apr 24 23:35:15.016364 systemd[1]: Started cri-containerd-bb0e0f068aaff19f6bf6507d1185514f04c184160325e0d5609d00c7058ecff6.scope - libcontainer container bb0e0f068aaff19f6bf6507d1185514f04c184160325e0d5609d00c7058ecff6.
Apr 24 23:35:15.036441 systemd[1]: Started cri-containerd-e0c59f5b1a040bf8a99331654be53d146aa3db40a01642ddcbff39d35b40c31f.scope - libcontainer container e0c59f5b1a040bf8a99331654be53d146aa3db40a01642ddcbff39d35b40c31f.
Apr 24 23:35:15.066519 systemd[1]: Started cri-containerd-9944747f15067a6db65a86f0c00c3b283aff936a710a0c0406d093b735fe4abc.scope - libcontainer container 9944747f15067a6db65a86f0c00c3b283aff936a710a0c0406d093b735fe4abc.
Apr 24 23:35:15.084412 containerd[1462]: time="2026-04-24T23:35:15.084367707Z" level=info msg="StartContainer for \"bb0e0f068aaff19f6bf6507d1185514f04c184160325e0d5609d00c7058ecff6\" returns successfully"
Apr 24 23:35:15.114242 containerd[1462]: time="2026-04-24T23:35:15.114185331Z" level=info msg="StartContainer for \"e0c59f5b1a040bf8a99331654be53d146aa3db40a01642ddcbff39d35b40c31f\" returns successfully"
Apr 24 23:35:15.148291 containerd[1462]: time="2026-04-24T23:35:15.148243022Z" level=info msg="StartContainer for \"9944747f15067a6db65a86f0c00c3b283aff936a710a0c0406d093b735fe4abc\" returns successfully"
Apr 24 23:35:15.204975 kubelet[2210]: E0424 23:35:15.204840 2210 controller.go:201] "Failed to ensure lease exists, will retry" err="Get \"https://178.105.28.58:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4081-3-6-n-0494d1f24d?timeout=10s\": dial tcp 178.105.28.58:6443: connect: connection refused" interval="1.6s"
Apr 24 23:35:15.358695 kubelet[2210]: I0424 23:35:15.358659 2210 kubelet_node_status.go:74] "Attempting to register node" node="ci-4081-3-6-n-0494d1f24d"
Apr 24 23:35:15.814629 kubelet[2210]: E0424 23:35:15.814591 2210 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081-3-6-n-0494d1f24d\" not found" node="ci-4081-3-6-n-0494d1f24d"
Apr 24 23:35:15.818818 kubelet[2210]: E0424 23:35:15.818786 2210 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081-3-6-n-0494d1f24d\" not found" node="ci-4081-3-6-n-0494d1f24d"
Apr 24 23:35:15.820726 kubelet[2210]: E0424 23:35:15.820697 2210 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081-3-6-n-0494d1f24d\" not found" node="ci-4081-3-6-n-0494d1f24d"
Apr 24 23:35:16.822926 kubelet[2210]: E0424 23:35:16.822755 2210 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081-3-6-n-0494d1f24d\" not found" node="ci-4081-3-6-n-0494d1f24d"
Apr 24 23:35:16.823711 kubelet[2210]: E0424 23:35:16.823675 2210 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081-3-6-n-0494d1f24d\" not found" node="ci-4081-3-6-n-0494d1f24d"
Apr 24 23:35:17.134712 kubelet[2210]: E0424 23:35:17.134365 2210 nodelease.go:50] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4081-3-6-n-0494d1f24d\" not found" node="ci-4081-3-6-n-0494d1f24d"
Apr 24 23:35:17.158454 kubelet[2210]: I0424 23:35:17.158392 2210 kubelet_node_status.go:77] "Successfully registered node" node="ci-4081-3-6-n-0494d1f24d"
Apr 24 23:35:17.158837 kubelet[2210]: E0424 23:35:17.158438 2210 kubelet_node_status.go:474] "Error updating node status, will retry" err="error getting node \"ci-4081-3-6-n-0494d1f24d\": node \"ci-4081-3-6-n-0494d1f24d\" not found"
Apr 24 23:35:17.160004 kubelet[2210]: E0424 23:35:17.159684 2210 event.go:359] "Server rejected event (will not retry!)" err="namespaces \"default\" not found" event="&Event{ObjectMeta:{ci-4081-3-6-n-0494d1f24d.18a96f2097cd0d0b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4081-3-6-n-0494d1f24d,UID:,APIVersion:v1,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4081-3-6-n-0494d1f24d,},FirstTimestamp:2026-04-24 23:35:13.750805771 +0000 UTC m=+0.291867865,LastTimestamp:2026-04-24 23:35:13.750805771 +0000 UTC m=+0.291867865,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4081-3-6-n-0494d1f24d,}"
Apr 24 23:35:17.168034 kubelet[2210]: I0424 23:35:17.167993 2210 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4081-3-6-n-0494d1f24d"
Apr 24 23:35:17.187008 kubelet[2210]: E0424 23:35:17.186966 2210 kubelet.go:3342] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4081-3-6-n-0494d1f24d\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4081-3-6-n-0494d1f24d"
Apr 24 23:35:17.187008 kubelet[2210]: I0424 23:35:17.187007 2210 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4081-3-6-n-0494d1f24d"
Apr 24 23:35:17.192793 kubelet[2210]: E0424 23:35:17.192533 2210 kubelet.go:3342] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4081-3-6-n-0494d1f24d\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-ci-4081-3-6-n-0494d1f24d"
Apr 24 23:35:17.192793 kubelet[2210]: I0424 23:35:17.192599 2210 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4081-3-6-n-0494d1f24d"
Apr 24 23:35:17.196347 kubelet[2210]: E0424 23:35:17.196307 2210 kubelet.go:3342] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4081-3-6-n-0494d1f24d\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4081-3-6-n-0494d1f24d"
Apr 24 23:35:17.747554 kubelet[2210]: I0424 23:35:17.747488 2210 apiserver.go:52] "Watching apiserver"
Apr 24 23:35:17.763723 kubelet[2210]: I0424 23:35:17.763669 2210 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world"
Apr 24 23:35:18.737263 kubelet[2210]: I0424 23:35:18.737220 2210 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4081-3-6-n-0494d1f24d"
Apr 24 23:35:19.004380 kubelet[2210]: I0424 23:35:19.004148 2210 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4081-3-6-n-0494d1f24d"
Apr 24 23:35:19.413452 systemd[1]: Reloading requested from client PID 2496 ('systemctl') (unit session-7.scope)...
Apr 24 23:35:19.413473 systemd[1]: Reloading...
Apr 24 23:35:19.527196 zram_generator::config[2537]: No configuration found.
Apr 24 23:35:19.629199 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Apr 24 23:35:19.716595 systemd[1]: Reloading finished in 302 ms.
Apr 24 23:35:19.763495 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent...
Apr 24 23:35:19.782649 systemd[1]: kubelet.service: Deactivated successfully.
Apr 24 23:35:19.783276 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Apr 24 23:35:19.790784 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Apr 24 23:35:19.948442 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Apr 24 23:35:19.951026 (kubelet)[2581]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Apr 24 23:35:20.012922 kubelet[2581]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 23:35:20.022535 kubelet[2581]: I0424 23:35:20.022473 2581 server.go:525] "Kubelet version" kubeletVersion="v1.35.1"
Apr 24 23:35:20.022535 kubelet[2581]: I0424 23:35:20.022526 2581 server.go:527] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 24 23:35:20.022535 kubelet[2581]: I0424 23:35:20.022548 2581 watchdog_linux.go:95] "Systemd watchdog is not enabled"
Apr 24 23:35:20.022708 kubelet[2581]: I0424 23:35:20.022554 2581 watchdog_linux.go:138] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 24 23:35:20.022983 kubelet[2581]: I0424 23:35:20.022948 2581 server.go:951] "Client rotation is on, will bootstrap in background"
Apr 24 23:35:20.032531 kubelet[2581]: I0424 23:35:20.031421 2581 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-client-current.pem"
Apr 24 23:35:20.040256 kubelet[2581]: I0424 23:35:20.039850 2581 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Apr 24 23:35:20.044266 kubelet[2581]: E0424 23:35:20.044224 2581 log.go:32] "RuntimeConfig from runtime service failed" err="rpc error: code = Unimplemented desc = unknown method RuntimeConfig for service runtime.v1.RuntimeService"
Apr 24 23:35:20.044397 kubelet[2581]: I0424 23:35:20.044294 2581 server.go:1395] "CRI implementation should be updated to support RuntimeConfig. Falling back to using cgroupDriver from kubelet config."
Apr 24 23:35:20.047142 kubelet[2581]: I0424 23:35:20.046476 2581 server.go:775] "--cgroups-per-qos enabled, but --cgroup-root was not specified. Defaulting to /"
Apr 24 23:35:20.047142 kubelet[2581]: I0424 23:35:20.046663 2581 container_manager_linux.go:272] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 24 23:35:20.047142 kubelet[2581]: I0424 23:35:20.046686 2581 container_manager_linux.go:277] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4081-3-6-n-0494d1f24d","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 24 23:35:20.047142 kubelet[2581]: I0424 23:35:20.046852 2581 topology_manager.go:143] "Creating topology manager with none policy"
Apr 24 23:35:20.047374 kubelet[2581]: I0424 23:35:20.046860 2581 container_manager_linux.go:308] "Creating device plugin manager"
Apr 24 23:35:20.047374 kubelet[2581]: I0424 23:35:20.046881 2581 container_manager_linux.go:317] "Creating Dynamic Resource Allocation (DRA) manager"
Apr 24 23:35:20.047374 kubelet[2581]: I0424 23:35:20.047074 2581 state_mem.go:41] "Initialized" logger="CPUManager state memory"
Apr 24 23:35:20.047374 kubelet[2581]: I0424 23:35:20.047234 2581 kubelet.go:482] "Attempting to sync node with API server"
Apr 24 23:35:20.047374 kubelet[2581]: I0424 23:35:20.047248 2581 kubelet.go:383] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 24 23:35:20.047374 kubelet[2581]: I0424 23:35:20.047264 2581 kubelet.go:394] "Adding apiserver pod source"
Apr 24 23:35:20.047374 kubelet[2581]: I0424 23:35:20.047284 2581 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 24 23:35:20.050150 kubelet[2581]: I0424 23:35:20.048785 2581 kuberuntime_manager.go:294] "Container runtime initialized" containerRuntime="containerd" version="v1.7.21" apiVersion="v1"
Apr 24 23:35:20.050150 kubelet[2581]: I0424 23:35:20.049674 2581 kubelet.go:943] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 24 23:35:20.050150 kubelet[2581]: I0424 23:35:20.049704 2581 kubelet.go:970] "Not starting PodCertificateRequest manager because we are in static kubelet mode or the PodCertificateProjection feature gate is disabled"
Apr 24 23:35:20.056492 kubelet[2581]: I0424 23:35:20.056455 2581 server.go:1257] "Started kubelet"
Apr 24 23:35:20.065675 kubelet[2581]: I0424 23:35:20.059251 2581 ratelimit.go:56] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 24 23:35:20.065675 kubelet[2581]: I0424 23:35:20.059328 2581 server_v1.go:49] "podresources" method="list" useActivePods=true
Apr 24 23:35:20.065675 kubelet[2581]: I0424 23:35:20.059535 2581 server.go:254] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 24 23:35:20.065675 kubelet[2581]: I0424 23:35:20.059598 2581 server.go:182] "Starting to listen" address="0.0.0.0" port=10250
Apr 24 23:35:20.066422 kubelet[2581]: I0424 23:35:20.066402 2581 fs_resource_analyzer.go:69] "Starting FS ResourceAnalyzer"
Apr 24 23:35:20.067017 kubelet[2581]: I0424 23:35:20.066976 2581 server.go:317] "Adding debug handlers to kubelet server"
Apr 24 23:35:20.072808 kubelet[2581]: I0424 23:35:20.072775 2581 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Apr 24 23:35:20.073971 kubelet[2581]: I0424 23:35:20.073390 2581 volume_manager.go:311] "Starting Kubelet Volume Manager"
Apr 24 23:35:20.073971 kubelet[2581]: E0424 23:35:20.073625 2581 kubelet_node_status.go:392] "Error getting the current node from lister" err="node \"ci-4081-3-6-n-0494d1f24d\" not found"
Apr 24 23:35:20.074559 kubelet[2581]: I0424 23:35:20.074539 2581 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Apr 24 23:35:20.074739 kubelet[2581]: I0424 23:35:20.074730 2581 reconciler.go:29] "Reconciler: start to sync state"
Apr 24 23:35:20.097900 kubelet[2581]: I0424 23:35:20.097861 2581 factory.go:223] Registration of the systemd container factory successfully
Apr 24 23:35:20.098037 kubelet[2581]: I0424 23:35:20.097981 2581 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Apr 24 23:35:20.101860 kubelet[2581]: I0424 23:35:20.101830 2581 factory.go:223] Registration of the containerd container factory successfully
Apr 24 23:35:20.107457 kubelet[2581]: I0424 23:35:20.107396 2581 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv4"
Apr 24 23:35:20.111202 kubelet[2581]: I0424 23:35:20.110287 2581 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv6"
Apr 24 23:35:20.111202 kubelet[2581]: I0424 23:35:20.110324 2581 status_manager.go:249] "Starting to sync pod status with apiserver"
Apr 24 23:35:20.111202 kubelet[2581]: I0424 23:35:20.110352 2581 kubelet.go:2501] "Starting kubelet main sync loop"
Apr 24 23:35:20.111202 kubelet[2581]: E0424 23:35:20.110414 2581 kubelet.go:2525] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Apr 24 23:35:20.182585 kubelet[2581]: I0424 23:35:20.182478 2581 cpu_manager.go:225] "Starting" policy="none"
Apr 24 23:35:20.183288 kubelet[2581]: I0424 23:35:20.183219 2581 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s"
Apr 24 23:35:20.183288 kubelet[2581]: I0424 23:35:20.183264 2581 state_mem.go:41] "Initialized" logger="CPUManager state checkpoint.CPUManager state memory"
Apr 24 23:35:20.183477 kubelet[2581]: I0424 23:35:20.183413 2581 state_mem.go:94] "Updated default CPUSet" logger="CPUManager state checkpoint.CPUManager state memory" cpuSet=""
Apr 24 23:35:20.183477 kubelet[2581]: I0424 23:35:20.183425 2581 state_mem.go:102] "Updated CPUSet assignments" logger="CPUManager state checkpoint.CPUManager state memory" assignments={}
Apr 24 23:35:20.183477 kubelet[2581]: I0424 23:35:20.183445 2581 policy_none.go:50] "Start"
Apr 24 23:35:20.183477 kubelet[2581]: I0424 23:35:20.183456 2581 memory_manager.go:187] "Starting memorymanager" policy="None"
Apr 24 23:35:20.183477 kubelet[2581]: I0424 23:35:20.183465 2581 state_mem.go:36] "Initializing new in-memory state store" logger="Memory Manager state checkpoint"
Apr 24 23:35:20.183584 kubelet[2581]: I0424 23:35:20.183567 2581 state_mem.go:77] "Updated machine memory state" logger="Memory Manager state checkpoint"
Apr 24 23:35:20.183584 kubelet[2581]: I0424 23:35:20.183581 2581 policy_none.go:44] "Start"
Apr 24 23:35:20.191909 kubelet[2581]: E0424 23:35:20.191655 2581 manager.go:525] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 24 23:35:20.193791 kubelet[2581]: I0424 23:35:20.193140 2581 eviction_manager.go:194] "Eviction manager: starting control loop"
Apr 24 23:35:20.193949 kubelet[2581]: I0424 23:35:20.193909 2581 container_log_manager.go:146] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 24 23:35:20.194554 kubelet[2581]: I0424 23:35:20.194517 2581 plugin_manager.go:121] "Starting Kubelet Plugin Manager"
Apr 24 23:35:20.197142 kubelet[2581]: E0424 23:35:20.196092 2581 eviction_manager.go:272] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime"
Apr 24 23:35:20.213532 kubelet[2581]: I0424 23:35:20.211936 2581 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4081-3-6-n-0494d1f24d"
Apr 24 23:35:20.213532 kubelet[2581]: I0424 23:35:20.212627 2581 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4081-3-6-n-0494d1f24d"
Apr 24 23:35:20.213532 kubelet[2581]: I0424 23:35:20.213137 2581 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4081-3-6-n-0494d1f24d"
Apr 24 23:35:20.224636 kubelet[2581]: E0424 23:35:20.224465 2581 kubelet.go:3342] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4081-3-6-n-0494d1f24d\" already exists" pod="kube-system/kube-controller-manager-ci-4081-3-6-n-0494d1f24d"
Apr 24 23:35:20.226464 kubelet[2581]: E0424 23:35:20.226384 2581 kubelet.go:3342] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4081-3-6-n-0494d1f24d\" already exists" pod="kube-system/kube-apiserver-ci-4081-3-6-n-0494d1f24d"
Apr 24 23:35:20.278106 kubelet[2581]: I0424 23:35:20.277765 2581
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/11b5bcd918dbbbfa727a9b00c6d1f792-ca-certs\") pod \"kube-controller-manager-ci-4081-3-6-n-0494d1f24d\" (UID: \"11b5bcd918dbbbfa727a9b00c6d1f792\") " pod="kube-system/kube-controller-manager-ci-4081-3-6-n-0494d1f24d" Apr 24 23:35:20.278106 kubelet[2581]: I0424 23:35:20.277815 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/11b5bcd918dbbbfa727a9b00c6d1f792-k8s-certs\") pod \"kube-controller-manager-ci-4081-3-6-n-0494d1f24d\" (UID: \"11b5bcd918dbbbfa727a9b00c6d1f792\") " pod="kube-system/kube-controller-manager-ci-4081-3-6-n-0494d1f24d" Apr 24 23:35:20.278106 kubelet[2581]: I0424 23:35:20.277839 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/11b5bcd918dbbbfa727a9b00c6d1f792-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4081-3-6-n-0494d1f24d\" (UID: \"11b5bcd918dbbbfa727a9b00c6d1f792\") " pod="kube-system/kube-controller-manager-ci-4081-3-6-n-0494d1f24d" Apr 24 23:35:20.278106 kubelet[2581]: I0424 23:35:20.277859 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/576488f2b4b68f1ff6fe63eb563ef59c-kubeconfig\") pod \"kube-scheduler-ci-4081-3-6-n-0494d1f24d\" (UID: \"576488f2b4b68f1ff6fe63eb563ef59c\") " pod="kube-system/kube-scheduler-ci-4081-3-6-n-0494d1f24d" Apr 24 23:35:20.278106 kubelet[2581]: I0424 23:35:20.277877 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/d1482e8f637c7e8785ce3d7f8596b313-ca-certs\") pod \"kube-apiserver-ci-4081-3-6-n-0494d1f24d\" (UID: 
\"d1482e8f637c7e8785ce3d7f8596b313\") " pod="kube-system/kube-apiserver-ci-4081-3-6-n-0494d1f24d" Apr 24 23:35:20.278403 kubelet[2581]: I0424 23:35:20.277895 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/d1482e8f637c7e8785ce3d7f8596b313-k8s-certs\") pod \"kube-apiserver-ci-4081-3-6-n-0494d1f24d\" (UID: \"d1482e8f637c7e8785ce3d7f8596b313\") " pod="kube-system/kube-apiserver-ci-4081-3-6-n-0494d1f24d" Apr 24 23:35:20.278403 kubelet[2581]: I0424 23:35:20.277915 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/d1482e8f637c7e8785ce3d7f8596b313-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4081-3-6-n-0494d1f24d\" (UID: \"d1482e8f637c7e8785ce3d7f8596b313\") " pod="kube-system/kube-apiserver-ci-4081-3-6-n-0494d1f24d" Apr 24 23:35:20.278403 kubelet[2581]: I0424 23:35:20.277932 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/11b5bcd918dbbbfa727a9b00c6d1f792-flexvolume-dir\") pod \"kube-controller-manager-ci-4081-3-6-n-0494d1f24d\" (UID: \"11b5bcd918dbbbfa727a9b00c6d1f792\") " pod="kube-system/kube-controller-manager-ci-4081-3-6-n-0494d1f24d" Apr 24 23:35:20.278403 kubelet[2581]: I0424 23:35:20.277951 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/11b5bcd918dbbbfa727a9b00c6d1f792-kubeconfig\") pod \"kube-controller-manager-ci-4081-3-6-n-0494d1f24d\" (UID: \"11b5bcd918dbbbfa727a9b00c6d1f792\") " pod="kube-system/kube-controller-manager-ci-4081-3-6-n-0494d1f24d" Apr 24 23:35:20.309144 kubelet[2581]: I0424 23:35:20.308919 2581 kubelet_node_status.go:74] "Attempting to register node" node="ci-4081-3-6-n-0494d1f24d" Apr 24 23:35:20.323146 
kubelet[2581]: I0424 23:35:20.323073 2581 kubelet_node_status.go:123] "Node was previously registered" node="ci-4081-3-6-n-0494d1f24d" Apr 24 23:35:20.323275 kubelet[2581]: I0424 23:35:20.323198 2581 kubelet_node_status.go:77] "Successfully registered node" node="ci-4081-3-6-n-0494d1f24d" Apr 24 23:35:21.048294 kubelet[2581]: I0424 23:35:21.048233 2581 apiserver.go:52] "Watching apiserver" Apr 24 23:35:21.074781 kubelet[2581]: I0424 23:35:21.074746 2581 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Apr 24 23:35:21.159783 kubelet[2581]: I0424 23:35:21.158987 2581 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4081-3-6-n-0494d1f24d" Apr 24 23:35:21.160613 kubelet[2581]: I0424 23:35:21.160305 2581 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4081-3-6-n-0494d1f24d" Apr 24 23:35:21.172362 kubelet[2581]: E0424 23:35:21.172321 2581 kubelet.go:3342] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4081-3-6-n-0494d1f24d\" already exists" pod="kube-system/kube-apiserver-ci-4081-3-6-n-0494d1f24d" Apr 24 23:35:21.178635 kubelet[2581]: E0424 23:35:21.178375 2581 kubelet.go:3342] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4081-3-6-n-0494d1f24d\" already exists" pod="kube-system/kube-scheduler-ci-4081-3-6-n-0494d1f24d" Apr 24 23:35:21.814130 kubelet[2581]: I0424 23:35:21.813932 2581 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4081-3-6-n-0494d1f24d" podStartSLOduration=3.813894722 podStartE2EDuration="3.813894722s" podCreationTimestamp="2026-04-24 23:35:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 23:35:21.800021558 +0000 UTC m=+1.843579914" watchObservedRunningTime="2026-04-24 23:35:21.813894722 +0000 UTC m=+1.857453038" Apr 24 23:35:21.829436 
kubelet[2581]: I0424 23:35:21.829234 2581 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4081-3-6-n-0494d1f24d" podStartSLOduration=1.829216629 podStartE2EDuration="1.829216629s" podCreationTimestamp="2026-04-24 23:35:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 23:35:21.813894762 +0000 UTC m=+1.857453118" watchObservedRunningTime="2026-04-24 23:35:21.829216629 +0000 UTC m=+1.872774945" Apr 24 23:35:23.969549 kubelet[2581]: I0424 23:35:23.969099 2581 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4081-3-6-n-0494d1f24d" podStartSLOduration=4.969085312 podStartE2EDuration="4.969085312s" podCreationTimestamp="2026-04-24 23:35:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 23:35:21.829797431 +0000 UTC m=+1.873355747" watchObservedRunningTime="2026-04-24 23:35:23.969085312 +0000 UTC m=+4.012643588" Apr 24 23:35:23.993698 kubelet[2581]: I0424 23:35:23.993623 2581 kuberuntime_manager.go:2062] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Apr 24 23:35:23.994453 containerd[1462]: time="2026-04-24T23:35:23.994390227Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Apr 24 23:35:23.996056 kubelet[2581]: I0424 23:35:23.995407 2581 kubelet_network.go:47] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Apr 24 23:35:24.773010 systemd[1]: Created slice kubepods-besteffort-pod1ace00d4_5d52_41f3_a96f_7b26ef2755bf.slice - libcontainer container kubepods-besteffort-pod1ace00d4_5d52_41f3_a96f_7b26ef2755bf.slice. 
Apr 24 23:35:24.806716 kubelet[2581]: I0424 23:35:24.806598 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/1ace00d4-5d52-41f3-a96f-7b26ef2755bf-kube-proxy\") pod \"kube-proxy-pzqkw\" (UID: \"1ace00d4-5d52-41f3-a96f-7b26ef2755bf\") " pod="kube-system/kube-proxy-pzqkw" Apr 24 23:35:24.806716 kubelet[2581]: I0424 23:35:24.806639 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/1ace00d4-5d52-41f3-a96f-7b26ef2755bf-xtables-lock\") pod \"kube-proxy-pzqkw\" (UID: \"1ace00d4-5d52-41f3-a96f-7b26ef2755bf\") " pod="kube-system/kube-proxy-pzqkw" Apr 24 23:35:24.806716 kubelet[2581]: I0424 23:35:24.806654 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1ace00d4-5d52-41f3-a96f-7b26ef2755bf-lib-modules\") pod \"kube-proxy-pzqkw\" (UID: \"1ace00d4-5d52-41f3-a96f-7b26ef2755bf\") " pod="kube-system/kube-proxy-pzqkw" Apr 24 23:35:24.806716 kubelet[2581]: I0424 23:35:24.806669 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvj92\" (UniqueName: \"kubernetes.io/projected/1ace00d4-5d52-41f3-a96f-7b26ef2755bf-kube-api-access-vvj92\") pod \"kube-proxy-pzqkw\" (UID: \"1ace00d4-5d52-41f3-a96f-7b26ef2755bf\") " pod="kube-system/kube-proxy-pzqkw" Apr 24 23:35:25.087647 containerd[1462]: time="2026-04-24T23:35:25.086840837Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-pzqkw,Uid:1ace00d4-5d52-41f3-a96f-7b26ef2755bf,Namespace:kube-system,Attempt:0,}" Apr 24 23:35:25.121148 containerd[1462]: time="2026-04-24T23:35:25.119196844Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Apr 24 23:35:25.121148 containerd[1462]: time="2026-04-24T23:35:25.119312976Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Apr 24 23:35:25.121148 containerd[1462]: time="2026-04-24T23:35:25.119329624Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 24 23:35:25.121148 containerd[1462]: time="2026-04-24T23:35:25.119465724Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 24 23:35:25.158570 systemd[1]: Started cri-containerd-67611f21b3448c45a0684ba8ecee1c5c3528701576a7bcdf182ecd18d822dea9.scope - libcontainer container 67611f21b3448c45a0684ba8ecee1c5c3528701576a7bcdf182ecd18d822dea9. Apr 24 23:35:25.189873 containerd[1462]: time="2026-04-24T23:35:25.189831342Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-pzqkw,Uid:1ace00d4-5d52-41f3-a96f-7b26ef2755bf,Namespace:kube-system,Attempt:0,} returns sandbox id \"67611f21b3448c45a0684ba8ecee1c5c3528701576a7bcdf182ecd18d822dea9\"" Apr 24 23:35:25.197799 containerd[1462]: time="2026-04-24T23:35:25.197755791Z" level=info msg="CreateContainer within sandbox \"67611f21b3448c45a0684ba8ecee1c5c3528701576a7bcdf182ecd18d822dea9\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Apr 24 23:35:25.216223 containerd[1462]: time="2026-04-24T23:35:25.216084383Z" level=info msg="CreateContainer within sandbox \"67611f21b3448c45a0684ba8ecee1c5c3528701576a7bcdf182ecd18d822dea9\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"a6cd41e532860fbc74d7be109793695eeffc0cbfa2bcfa9c4e2c119d7b4736ab\"" Apr 24 23:35:25.219074 containerd[1462]: time="2026-04-24T23:35:25.216970444Z" level=info msg="StartContainer for \"a6cd41e532860fbc74d7be109793695eeffc0cbfa2bcfa9c4e2c119d7b4736ab\"" 
Apr 24 23:35:25.264403 systemd[1]: Started cri-containerd-a6cd41e532860fbc74d7be109793695eeffc0cbfa2bcfa9c4e2c119d7b4736ab.scope - libcontainer container a6cd41e532860fbc74d7be109793695eeffc0cbfa2bcfa9c4e2c119d7b4736ab. Apr 24 23:35:25.307902 systemd[1]: Created slice kubepods-besteffort-pod9763af56_09de_44ca_8744_27c67c8af577.slice - libcontainer container kubepods-besteffort-pod9763af56_09de_44ca_8744_27c67c8af577.slice. Apr 24 23:35:25.312492 kubelet[2581]: I0424 23:35:25.312439 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxqr6\" (UniqueName: \"kubernetes.io/projected/9763af56-09de-44ca-8744-27c67c8af577-kube-api-access-xxqr6\") pod \"tigera-operator-6cf4cccc57-sg749\" (UID: \"9763af56-09de-44ca-8744-27c67c8af577\") " pod="tigera-operator/tigera-operator-6cf4cccc57-sg749" Apr 24 23:35:25.312835 kubelet[2581]: I0424 23:35:25.312517 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/9763af56-09de-44ca-8744-27c67c8af577-var-lib-calico\") pod \"tigera-operator-6cf4cccc57-sg749\" (UID: \"9763af56-09de-44ca-8744-27c67c8af577\") " pod="tigera-operator/tigera-operator-6cf4cccc57-sg749" Apr 24 23:35:25.336760 containerd[1462]: time="2026-04-24T23:35:25.336612271Z" level=info msg="StartContainer for \"a6cd41e532860fbc74d7be109793695eeffc0cbfa2bcfa9c4e2c119d7b4736ab\" returns successfully" Apr 24 23:35:25.618161 containerd[1462]: time="2026-04-24T23:35:25.618005389Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-6cf4cccc57-sg749,Uid:9763af56-09de-44ca-8744-27c67c8af577,Namespace:tigera-operator,Attempt:0,}" Apr 24 23:35:25.645544 containerd[1462]: time="2026-04-24T23:35:25.645298012Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Apr 24 23:35:25.645544 containerd[1462]: time="2026-04-24T23:35:25.645364287Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Apr 24 23:35:25.645544 containerd[1462]: time="2026-04-24T23:35:25.645393982Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 24 23:35:25.645544 containerd[1462]: time="2026-04-24T23:35:25.645495635Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 24 23:35:25.667395 systemd[1]: Started cri-containerd-d88b8268c4aa927f5ca80156085725dbd5a6aa0ae8b3c1b91dc49a08a36a8b7a.scope - libcontainer container d88b8268c4aa927f5ca80156085725dbd5a6aa0ae8b3c1b91dc49a08a36a8b7a. Apr 24 23:35:25.709469 containerd[1462]: time="2026-04-24T23:35:25.709344508Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-6cf4cccc57-sg749,Uid:9763af56-09de-44ca-8744-27c67c8af577,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"d88b8268c4aa927f5ca80156085725dbd5a6aa0ae8b3c1b91dc49a08a36a8b7a\"" Apr 24 23:35:25.713276 containerd[1462]: time="2026-04-24T23:35:25.713230653Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.40.7\"" Apr 24 23:35:27.439335 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1923740573.mount: Deactivated successfully. 
Apr 24 23:35:27.883850 containerd[1462]: time="2026-04-24T23:35:27.883492117Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.40.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 24 23:35:27.884870 containerd[1462]: time="2026-04-24T23:35:27.884825203Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.40.7: active requests=0, bytes read=25071565" Apr 24 23:35:27.886084 containerd[1462]: time="2026-04-24T23:35:27.886033009Z" level=info msg="ImageCreate event name:\"sha256:b2fef69c2456aa0a6f6dcb63425a69d11dc35a73b1883b250e4d92f5a697fefe\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 24 23:35:27.891545 containerd[1462]: time="2026-04-24T23:35:27.891497596Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:53260704fc6e638633b243729411222e01e1898647352a6e1a09cc046887973a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 24 23:35:27.892754 containerd[1462]: time="2026-04-24T23:35:27.892550744Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.40.7\" with image id \"sha256:b2fef69c2456aa0a6f6dcb63425a69d11dc35a73b1883b250e4d92f5a697fefe\", repo tag \"quay.io/tigera/operator:v1.40.7\", repo digest \"quay.io/tigera/operator@sha256:53260704fc6e638633b243729411222e01e1898647352a6e1a09cc046887973a\", size \"25067560\" in 2.179267183s" Apr 24 23:35:27.892754 containerd[1462]: time="2026-04-24T23:35:27.892704201Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.40.7\" returns image reference \"sha256:b2fef69c2456aa0a6f6dcb63425a69d11dc35a73b1883b250e4d92f5a697fefe\"" Apr 24 23:35:27.897752 containerd[1462]: time="2026-04-24T23:35:27.897710298Z" level=info msg="CreateContainer within sandbox \"d88b8268c4aa927f5ca80156085725dbd5a6aa0ae8b3c1b91dc49a08a36a8b7a\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Apr 24 23:35:27.914239 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount861219913.mount: Deactivated successfully. 
Apr 24 23:35:27.917547 containerd[1462]: time="2026-04-24T23:35:27.917497211Z" level=info msg="CreateContainer within sandbox \"d88b8268c4aa927f5ca80156085725dbd5a6aa0ae8b3c1b91dc49a08a36a8b7a\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"c32641b7cc6f4e7152353812bcc8d1582cb2ebc595f4c6be352adb53b8143fa0\"" Apr 24 23:35:27.919867 containerd[1462]: time="2026-04-24T23:35:27.919824888Z" level=info msg="StartContainer for \"c32641b7cc6f4e7152353812bcc8d1582cb2ebc595f4c6be352adb53b8143fa0\"" Apr 24 23:35:27.953329 systemd[1]: Started cri-containerd-c32641b7cc6f4e7152353812bcc8d1582cb2ebc595f4c6be352adb53b8143fa0.scope - libcontainer container c32641b7cc6f4e7152353812bcc8d1582cb2ebc595f4c6be352adb53b8143fa0. Apr 24 23:35:27.985070 containerd[1462]: time="2026-04-24T23:35:27.984813520Z" level=info msg="StartContainer for \"c32641b7cc6f4e7152353812bcc8d1582cb2ebc595f4c6be352adb53b8143fa0\" returns successfully" Apr 24 23:35:28.198544 kubelet[2581]: I0424 23:35:28.198176 2581 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="kube-system/kube-proxy-pzqkw" podStartSLOduration=4.19815507 podStartE2EDuration="4.19815507s" podCreationTimestamp="2026-04-24 23:35:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 23:35:26.187532055 +0000 UTC m=+6.231090371" watchObservedRunningTime="2026-04-24 23:35:28.19815507 +0000 UTC m=+8.241713426" Apr 24 23:35:29.566095 kubelet[2581]: I0424 23:35:29.565933 2581 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="tigera-operator/tigera-operator-6cf4cccc57-sg749" podStartSLOduration=2.384483537 podStartE2EDuration="4.565919383s" podCreationTimestamp="2026-04-24 23:35:25 +0000 UTC" firstStartedPulling="2026-04-24 23:35:25.712278436 +0000 UTC m=+5.755836752" lastFinishedPulling="2026-04-24 23:35:27.893714282 +0000 UTC m=+7.937272598" observedRunningTime="2026-04-24 
23:35:28.199742942 +0000 UTC m=+8.243301258" watchObservedRunningTime="2026-04-24 23:35:29.565919383 +0000 UTC m=+9.609477659" Apr 24 23:35:34.273788 sudo[1712]: pam_unix(sudo:session): session closed for user root Apr 24 23:35:34.291601 sshd[1709]: pam_unix(sshd:session): session closed for user core Apr 24 23:35:34.296479 systemd[1]: sshd@6-178.105.28.58:22-50.85.169.122:39168.service: Deactivated successfully. Apr 24 23:35:34.300611 systemd[1]: session-7.scope: Deactivated successfully. Apr 24 23:35:34.301503 systemd[1]: session-7.scope: Consumed 5.915s CPU time, 149.8M memory peak, 0B memory swap peak. Apr 24 23:35:34.304936 systemd-logind[1455]: Session 7 logged out. Waiting for processes to exit. Apr 24 23:35:34.306425 systemd-logind[1455]: Removed session 7. Apr 24 23:35:40.082162 systemd[1]: Created slice kubepods-besteffort-pod5f0c11c3_4fbf_49a9_9e02_1bd577ad15f9.slice - libcontainer container kubepods-besteffort-pod5f0c11c3_4fbf_49a9_9e02_1bd577ad15f9.slice. Apr 24 23:35:40.109257 kubelet[2581]: I0424 23:35:40.109204 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5f0c11c3-4fbf-49a9-9e02-1bd577ad15f9-tigera-ca-bundle\") pod \"calico-typha-d98747f4b-nlzzk\" (UID: \"5f0c11c3-4fbf-49a9-9e02-1bd577ad15f9\") " pod="calico-system/calico-typha-d98747f4b-nlzzk" Apr 24 23:35:40.109257 kubelet[2581]: I0424 23:35:40.109252 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/5f0c11c3-4fbf-49a9-9e02-1bd577ad15f9-typha-certs\") pod \"calico-typha-d98747f4b-nlzzk\" (UID: \"5f0c11c3-4fbf-49a9-9e02-1bd577ad15f9\") " pod="calico-system/calico-typha-d98747f4b-nlzzk" Apr 24 23:35:40.109663 kubelet[2581]: I0424 23:35:40.109277 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9hzt\" (UniqueName: 
\"kubernetes.io/projected/5f0c11c3-4fbf-49a9-9e02-1bd577ad15f9-kube-api-access-h9hzt\") pod \"calico-typha-d98747f4b-nlzzk\" (UID: \"5f0c11c3-4fbf-49a9-9e02-1bd577ad15f9\") " pod="calico-system/calico-typha-d98747f4b-nlzzk" Apr 24 23:35:40.198179 systemd[1]: Created slice kubepods-besteffort-pod8bcbed52_179b_4ac0_892c_2c0704a03414.slice - libcontainer container kubepods-besteffort-pod8bcbed52_179b_4ac0_892c_2c0704a03414.slice. Apr 24 23:35:40.297146 kubelet[2581]: E0424 23:35:40.295749 2581 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-drdkn" podUID="31ae9065-a5e7-49a3-95e7-0d8d7d4e7efa" Apr 24 23:35:40.310874 kubelet[2581]: I0424 23:35:40.310793 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/8bcbed52-179b-4ac0-892c-2c0704a03414-var-run-calico\") pod \"calico-node-ldsmt\" (UID: \"8bcbed52-179b-4ac0-892c-2c0704a03414\") " pod="calico-system/calico-node-ldsmt" Apr 24 23:35:40.310874 kubelet[2581]: I0424 23:35:40.310842 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8bcbed52-179b-4ac0-892c-2c0704a03414-lib-modules\") pod \"calico-node-ldsmt\" (UID: \"8bcbed52-179b-4ac0-892c-2c0704a03414\") " pod="calico-system/calico-node-ldsmt" Apr 24 23:35:40.311041 kubelet[2581]: I0424 23:35:40.310906 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bpffs\" (UniqueName: \"kubernetes.io/host-path/8bcbed52-179b-4ac0-892c-2c0704a03414-bpffs\") pod \"calico-node-ldsmt\" (UID: \"8bcbed52-179b-4ac0-892c-2c0704a03414\") " pod="calico-system/calico-node-ldsmt" Apr 24 23:35:40.311041 kubelet[2581]: I0424 
23:35:40.310934 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/8bcbed52-179b-4ac0-892c-2c0704a03414-flexvol-driver-host\") pod \"calico-node-ldsmt\" (UID: \"8bcbed52-179b-4ac0-892c-2c0704a03414\") " pod="calico-system/calico-node-ldsmt" Apr 24 23:35:40.311041 kubelet[2581]: I0424 23:35:40.310962 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/8bcbed52-179b-4ac0-892c-2c0704a03414-sys-fs\") pod \"calico-node-ldsmt\" (UID: \"8bcbed52-179b-4ac0-892c-2c0704a03414\") " pod="calico-system/calico-node-ldsmt" Apr 24 23:35:40.311041 kubelet[2581]: I0424 23:35:40.310984 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8bcbed52-179b-4ac0-892c-2c0704a03414-tigera-ca-bundle\") pod \"calico-node-ldsmt\" (UID: \"8bcbed52-179b-4ac0-892c-2c0704a03414\") " pod="calico-system/calico-node-ldsmt" Apr 24 23:35:40.311041 kubelet[2581]: I0424 23:35:40.310999 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nodeproc\" (UniqueName: \"kubernetes.io/host-path/8bcbed52-179b-4ac0-892c-2c0704a03414-nodeproc\") pod \"calico-node-ldsmt\" (UID: \"8bcbed52-179b-4ac0-892c-2c0704a03414\") " pod="calico-system/calico-node-ldsmt" Apr 24 23:35:40.311178 kubelet[2581]: I0424 23:35:40.311020 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvdpp\" (UniqueName: \"kubernetes.io/projected/8bcbed52-179b-4ac0-892c-2c0704a03414-kube-api-access-lvdpp\") pod \"calico-node-ldsmt\" (UID: \"8bcbed52-179b-4ac0-892c-2c0704a03414\") " pod="calico-system/calico-node-ldsmt" Apr 24 23:35:40.311178 kubelet[2581]: I0424 23:35:40.311038 2581 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/8bcbed52-179b-4ac0-892c-2c0704a03414-cni-bin-dir\") pod \"calico-node-ldsmt\" (UID: \"8bcbed52-179b-4ac0-892c-2c0704a03414\") " pod="calico-system/calico-node-ldsmt" Apr 24 23:35:40.311178 kubelet[2581]: I0424 23:35:40.311053 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/8bcbed52-179b-4ac0-892c-2c0704a03414-cni-net-dir\") pod \"calico-node-ldsmt\" (UID: \"8bcbed52-179b-4ac0-892c-2c0704a03414\") " pod="calico-system/calico-node-ldsmt" Apr 24 23:35:40.311178 kubelet[2581]: I0424 23:35:40.311066 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/8bcbed52-179b-4ac0-892c-2c0704a03414-policysync\") pod \"calico-node-ldsmt\" (UID: \"8bcbed52-179b-4ac0-892c-2c0704a03414\") " pod="calico-system/calico-node-ldsmt" Apr 24 23:35:40.311178 kubelet[2581]: I0424 23:35:40.311080 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/8bcbed52-179b-4ac0-892c-2c0704a03414-cni-log-dir\") pod \"calico-node-ldsmt\" (UID: \"8bcbed52-179b-4ac0-892c-2c0704a03414\") " pod="calico-system/calico-node-ldsmt" Apr 24 23:35:40.311292 kubelet[2581]: I0424 23:35:40.311102 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/8bcbed52-179b-4ac0-892c-2c0704a03414-var-lib-calico\") pod \"calico-node-ldsmt\" (UID: \"8bcbed52-179b-4ac0-892c-2c0704a03414\") " pod="calico-system/calico-node-ldsmt" Apr 24 23:35:40.311292 kubelet[2581]: I0424 23:35:40.311141 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" 
(UniqueName: \"kubernetes.io/secret/8bcbed52-179b-4ac0-892c-2c0704a03414-node-certs\") pod \"calico-node-ldsmt\" (UID: \"8bcbed52-179b-4ac0-892c-2c0704a03414\") " pod="calico-system/calico-node-ldsmt" Apr 24 23:35:40.311292 kubelet[2581]: I0424 23:35:40.311157 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/8bcbed52-179b-4ac0-892c-2c0704a03414-xtables-lock\") pod \"calico-node-ldsmt\" (UID: \"8bcbed52-179b-4ac0-892c-2c0704a03414\") " pod="calico-system/calico-node-ldsmt" Apr 24 23:35:40.388690 containerd[1462]: time="2026-04-24T23:35:40.388443944Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-d98747f4b-nlzzk,Uid:5f0c11c3-4fbf-49a9-9e02-1bd577ad15f9,Namespace:calico-system,Attempt:0,}" Apr 24 23:35:40.413124 kubelet[2581]: I0424 23:35:40.412615 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/31ae9065-a5e7-49a3-95e7-0d8d7d4e7efa-kubelet-dir\") pod \"csi-node-driver-drdkn\" (UID: \"31ae9065-a5e7-49a3-95e7-0d8d7d4e7efa\") " pod="calico-system/csi-node-driver-drdkn" Apr 24 23:35:40.416064 kubelet[2581]: E0424 23:35:40.416027 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:40.416064 kubelet[2581]: W0424 23:35:40.416053 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:40.416064 kubelet[2581]: E0424 23:35:40.416075 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:35:40.416506 kubelet[2581]: E0424 23:35:40.416491 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:40.416506 kubelet[2581]: W0424 23:35:40.416506 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:40.416506 kubelet[2581]: E0424 23:35:40.416520 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:35:40.417297 kubelet[2581]: E0424 23:35:40.417275 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:40.417297 kubelet[2581]: W0424 23:35:40.417295 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:40.417446 kubelet[2581]: E0424 23:35:40.417312 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:35:40.418345 kubelet[2581]: E0424 23:35:40.418323 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:40.418439 kubelet[2581]: W0424 23:35:40.418343 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:40.418439 kubelet[2581]: E0424 23:35:40.418406 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:35:40.418631 kubelet[2581]: I0424 23:35:40.418508 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/31ae9065-a5e7-49a3-95e7-0d8d7d4e7efa-socket-dir\") pod \"csi-node-driver-drdkn\" (UID: \"31ae9065-a5e7-49a3-95e7-0d8d7d4e7efa\") " pod="calico-system/csi-node-driver-drdkn" Apr 24 23:35:40.418660 kubelet[2581]: E0424 23:35:40.418648 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:40.418748 kubelet[2581]: W0424 23:35:40.418663 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:40.418748 kubelet[2581]: E0424 23:35:40.418675 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:35:40.419054 kubelet[2581]: E0424 23:35:40.419040 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:40.419054 kubelet[2581]: W0424 23:35:40.419054 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:40.419244 kubelet[2581]: E0424 23:35:40.419066 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:35:40.420439 kubelet[2581]: E0424 23:35:40.420409 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:40.420439 kubelet[2581]: W0424 23:35:40.420429 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:40.420636 kubelet[2581]: E0424 23:35:40.420446 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:35:40.420711 kubelet[2581]: E0424 23:35:40.420698 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:40.420788 kubelet[2581]: W0424 23:35:40.420710 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:40.420788 kubelet[2581]: E0424 23:35:40.420721 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:35:40.420877 kubelet[2581]: E0424 23:35:40.420864 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:40.420877 kubelet[2581]: W0424 23:35:40.420874 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:40.421020 kubelet[2581]: E0424 23:35:40.420883 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:35:40.421224 kubelet[2581]: E0424 23:35:40.421201 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:40.421224 kubelet[2581]: W0424 23:35:40.421214 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:40.421224 kubelet[2581]: E0424 23:35:40.421225 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:35:40.421440 kubelet[2581]: E0424 23:35:40.421428 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:40.421440 kubelet[2581]: W0424 23:35:40.421439 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:40.421440 kubelet[2581]: E0424 23:35:40.421449 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:35:40.422225 kubelet[2581]: E0424 23:35:40.422206 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:40.422225 kubelet[2581]: W0424 23:35:40.422224 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:40.422344 kubelet[2581]: E0424 23:35:40.422238 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:35:40.422493 kubelet[2581]: E0424 23:35:40.422478 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:40.422493 kubelet[2581]: W0424 23:35:40.422490 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:40.422575 kubelet[2581]: E0424 23:35:40.422500 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:35:40.422668 kubelet[2581]: E0424 23:35:40.422658 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:40.422668 kubelet[2581]: W0424 23:35:40.422667 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:40.422759 kubelet[2581]: E0424 23:35:40.422676 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:35:40.422822 kubelet[2581]: E0424 23:35:40.422817 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:40.422918 kubelet[2581]: W0424 23:35:40.422824 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:40.422918 kubelet[2581]: E0424 23:35:40.422832 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:35:40.423545 kubelet[2581]: E0424 23:35:40.423516 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:40.423620 kubelet[2581]: W0424 23:35:40.423547 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:40.423620 kubelet[2581]: E0424 23:35:40.423571 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:35:40.424470 kubelet[2581]: E0424 23:35:40.424345 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:40.424470 kubelet[2581]: W0424 23:35:40.424366 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:40.424470 kubelet[2581]: E0424 23:35:40.424380 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:35:40.424957 kubelet[2581]: E0424 23:35:40.424882 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:40.425249 kubelet[2581]: W0424 23:35:40.424991 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:40.425249 kubelet[2581]: E0424 23:35:40.425008 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:35:40.425965 kubelet[2581]: E0424 23:35:40.425942 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:40.426149 kubelet[2581]: W0424 23:35:40.426047 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:40.426149 kubelet[2581]: E0424 23:35:40.426101 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:35:40.426440 containerd[1462]: time="2026-04-24T23:35:40.425093189Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Apr 24 23:35:40.426440 containerd[1462]: time="2026-04-24T23:35:40.426036768Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Apr 24 23:35:40.426440 containerd[1462]: time="2026-04-24T23:35:40.426070859Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 24 23:35:40.426955 kubelet[2581]: E0424 23:35:40.426828 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:40.426955 kubelet[2581]: W0424 23:35:40.426844 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:40.426955 kubelet[2581]: E0424 23:35:40.426861 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:35:40.427058 kubelet[2581]: I0424 23:35:40.426987 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/31ae9065-a5e7-49a3-95e7-0d8d7d4e7efa-registration-dir\") pod \"csi-node-driver-drdkn\" (UID: \"31ae9065-a5e7-49a3-95e7-0d8d7d4e7efa\") " pod="calico-system/csi-node-driver-drdkn" Apr 24 23:35:40.427454 kubelet[2581]: E0424 23:35:40.427355 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:40.427454 kubelet[2581]: W0424 23:35:40.427369 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:40.427454 kubelet[2581]: E0424 23:35:40.427381 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:35:40.427602 containerd[1462]: time="2026-04-24T23:35:40.427428289Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 24 23:35:40.428335 kubelet[2581]: E0424 23:35:40.427968 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:40.428335 kubelet[2581]: W0424 23:35:40.427989 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:40.428335 kubelet[2581]: E0424 23:35:40.428004 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:35:40.428335 kubelet[2581]: E0424 23:35:40.428226 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:40.428335 kubelet[2581]: W0424 23:35:40.428236 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:40.428335 kubelet[2581]: E0424 23:35:40.428246 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:35:40.429190 kubelet[2581]: E0424 23:35:40.429013 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:40.429190 kubelet[2581]: W0424 23:35:40.429035 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:40.429190 kubelet[2581]: E0424 23:35:40.429048 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:35:40.429633 kubelet[2581]: E0424 23:35:40.429507 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:40.429633 kubelet[2581]: W0424 23:35:40.429544 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:40.429633 kubelet[2581]: E0424 23:35:40.429568 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:35:40.430099 kubelet[2581]: E0424 23:35:40.429953 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:40.430099 kubelet[2581]: W0424 23:35:40.429967 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:40.430099 kubelet[2581]: E0424 23:35:40.429978 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:35:40.431542 kubelet[2581]: E0424 23:35:40.431429 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:40.431878 kubelet[2581]: W0424 23:35:40.431732 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:40.431878 kubelet[2581]: E0424 23:35:40.431759 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:35:40.432488 kubelet[2581]: E0424 23:35:40.432179 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:40.432488 kubelet[2581]: W0424 23:35:40.432193 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:40.432488 kubelet[2581]: E0424 23:35:40.432205 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:35:40.434296 kubelet[2581]: E0424 23:35:40.433550 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:40.434296 kubelet[2581]: W0424 23:35:40.434081 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:40.434296 kubelet[2581]: E0424 23:35:40.434103 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:35:40.436910 kubelet[2581]: E0424 23:35:40.436687 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:40.436910 kubelet[2581]: W0424 23:35:40.436707 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:40.436910 kubelet[2581]: E0424 23:35:40.436728 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:35:40.438599 kubelet[2581]: E0424 23:35:40.438427 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:40.438599 kubelet[2581]: W0424 23:35:40.438449 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:40.438599 kubelet[2581]: E0424 23:35:40.438467 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:35:40.439344 kubelet[2581]: E0424 23:35:40.439323 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:40.439587 kubelet[2581]: W0424 23:35:40.439444 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:40.439587 kubelet[2581]: E0424 23:35:40.439468 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:35:40.440283 kubelet[2581]: E0424 23:35:40.440247 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:40.440633 kubelet[2581]: W0424 23:35:40.440396 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:40.440633 kubelet[2581]: E0424 23:35:40.440419 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:35:40.441202 kubelet[2581]: E0424 23:35:40.441182 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:40.441260 kubelet[2581]: W0424 23:35:40.441212 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:40.441260 kubelet[2581]: E0424 23:35:40.441229 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:35:40.441759 kubelet[2581]: E0424 23:35:40.441739 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:40.441759 kubelet[2581]: W0424 23:35:40.441755 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:40.443283 kubelet[2581]: E0424 23:35:40.441770 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:35:40.443283 kubelet[2581]: E0424 23:35:40.442314 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:40.443283 kubelet[2581]: W0424 23:35:40.442327 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:40.443283 kubelet[2581]: E0424 23:35:40.442450 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:35:40.443283 kubelet[2581]: E0424 23:35:40.442904 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:40.443283 kubelet[2581]: W0424 23:35:40.443018 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:40.443283 kubelet[2581]: E0424 23:35:40.443039 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:35:40.443678 kubelet[2581]: E0424 23:35:40.443546 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:40.443678 kubelet[2581]: W0424 23:35:40.443674 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:40.443755 kubelet[2581]: E0424 23:35:40.443690 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:35:40.444223 kubelet[2581]: E0424 23:35:40.444198 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:40.444223 kubelet[2581]: W0424 23:35:40.444218 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:40.444312 kubelet[2581]: E0424 23:35:40.444232 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:35:40.444882 kubelet[2581]: E0424 23:35:40.444701 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:40.444882 kubelet[2581]: W0424 23:35:40.444716 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:40.444882 kubelet[2581]: E0424 23:35:40.444728 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:35:40.450862 kubelet[2581]: E0424 23:35:40.450828 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:40.452181 kubelet[2581]: W0424 23:35:40.451071 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:40.452181 kubelet[2581]: E0424 23:35:40.451105 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:35:40.452517 kubelet[2581]: E0424 23:35:40.452486 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:40.452517 kubelet[2581]: W0424 23:35:40.452510 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:40.452638 kubelet[2581]: E0424 23:35:40.452529 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:35:40.454711 kubelet[2581]: E0424 23:35:40.454689 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:40.454818 kubelet[2581]: W0424 23:35:40.454711 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:40.454818 kubelet[2581]: E0424 23:35:40.454732 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:35:40.455035 kubelet[2581]: E0424 23:35:40.455020 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:40.455035 kubelet[2581]: W0424 23:35:40.455033 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:40.455098 kubelet[2581]: E0424 23:35:40.455044 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:35:40.455249 kubelet[2581]: E0424 23:35:40.455234 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:40.455249 kubelet[2581]: W0424 23:35:40.455248 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:40.455315 kubelet[2581]: E0424 23:35:40.455259 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:35:40.455410 kubelet[2581]: I0424 23:35:40.455352 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25b74\" (UniqueName: \"kubernetes.io/projected/31ae9065-a5e7-49a3-95e7-0d8d7d4e7efa-kube-api-access-25b74\") pod \"csi-node-driver-drdkn\" (UID: \"31ae9065-a5e7-49a3-95e7-0d8d7d4e7efa\") " pod="calico-system/csi-node-driver-drdkn" Apr 24 23:35:40.455504 kubelet[2581]: E0424 23:35:40.455491 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:40.455534 kubelet[2581]: W0424 23:35:40.455504 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:40.455534 kubelet[2581]: E0424 23:35:40.455514 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:35:40.455760 kubelet[2581]: E0424 23:35:40.455745 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:40.455760 kubelet[2581]: W0424 23:35:40.455759 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:40.455831 kubelet[2581]: E0424 23:35:40.455769 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:35:40.456014 kubelet[2581]: E0424 23:35:40.455999 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:40.456014 kubelet[2581]: W0424 23:35:40.456013 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:40.456075 kubelet[2581]: E0424 23:35:40.456025 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:35:40.457735 kubelet[2581]: E0424 23:35:40.456643 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:40.457735 kubelet[2581]: W0424 23:35:40.456680 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:40.457735 kubelet[2581]: E0424 23:35:40.456692 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:35:40.457735 kubelet[2581]: E0424 23:35:40.457074 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:40.457735 kubelet[2581]: W0424 23:35:40.457085 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:40.457735 kubelet[2581]: E0424 23:35:40.457100 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:35:40.457976 kubelet[2581]: E0424 23:35:40.457956 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:40.457976 kubelet[2581]: W0424 23:35:40.457972 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:40.458030 kubelet[2581]: E0424 23:35:40.457984 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:35:40.458389 kubelet[2581]: E0424 23:35:40.458366 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:40.458389 kubelet[2581]: W0424 23:35:40.458385 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:40.458466 kubelet[2581]: E0424 23:35:40.458396 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:35:40.458756 kubelet[2581]: E0424 23:35:40.458736 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:40.458756 kubelet[2581]: W0424 23:35:40.458750 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:40.458829 kubelet[2581]: E0424 23:35:40.458761 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:35:40.459237 kubelet[2581]: E0424 23:35:40.459147 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:40.459237 kubelet[2581]: W0424 23:35:40.459174 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:40.459237 kubelet[2581]: E0424 23:35:40.459184 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:35:40.459341 kubelet[2581]: I0424 23:35:40.459271 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/31ae9065-a5e7-49a3-95e7-0d8d7d4e7efa-varrun\") pod \"csi-node-driver-drdkn\" (UID: \"31ae9065-a5e7-49a3-95e7-0d8d7d4e7efa\") " pod="calico-system/csi-node-driver-drdkn" Apr 24 23:35:40.460081 kubelet[2581]: E0424 23:35:40.459514 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:40.460081 kubelet[2581]: W0424 23:35:40.459528 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:40.460081 kubelet[2581]: E0424 23:35:40.459538 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:35:40.460081 kubelet[2581]: E0424 23:35:40.459749 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:40.460081 kubelet[2581]: W0424 23:35:40.459758 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:40.460081 kubelet[2581]: E0424 23:35:40.459767 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:35:40.460081 kubelet[2581]: E0424 23:35:40.459992 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:40.460081 kubelet[2581]: W0424 23:35:40.460000 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:40.460081 kubelet[2581]: E0424 23:35:40.460009 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:35:40.460759 kubelet[2581]: E0424 23:35:40.460518 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:40.460759 kubelet[2581]: W0424 23:35:40.460663 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:40.460759 kubelet[2581]: E0424 23:35:40.460695 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:35:40.462373 kubelet[2581]: E0424 23:35:40.461076 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:40.462373 kubelet[2581]: W0424 23:35:40.461092 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:40.462373 kubelet[2581]: E0424 23:35:40.461103 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:35:40.462373 kubelet[2581]: E0424 23:35:40.461385 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:40.462373 kubelet[2581]: W0424 23:35:40.461394 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:40.462373 kubelet[2581]: E0424 23:35:40.461404 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:35:40.462373 kubelet[2581]: E0424 23:35:40.461638 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:40.462373 kubelet[2581]: W0424 23:35:40.461647 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:40.462373 kubelet[2581]: E0424 23:35:40.461656 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:35:40.462373 kubelet[2581]: E0424 23:35:40.461862 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:40.462666 kubelet[2581]: W0424 23:35:40.461872 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:40.462666 kubelet[2581]: E0424 23:35:40.461881 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:35:40.462666 kubelet[2581]: E0424 23:35:40.462045 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:40.462666 kubelet[2581]: W0424 23:35:40.462052 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:40.462666 kubelet[2581]: E0424 23:35:40.462060 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:35:40.462666 kubelet[2581]: E0424 23:35:40.462255 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:40.462666 kubelet[2581]: W0424 23:35:40.462263 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:40.462666 kubelet[2581]: E0424 23:35:40.462271 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:35:40.462666 kubelet[2581]: E0424 23:35:40.462415 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:40.462666 kubelet[2581]: W0424 23:35:40.462423 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:40.462878 kubelet[2581]: E0424 23:35:40.462430 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:35:40.462878 kubelet[2581]: E0424 23:35:40.462612 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:40.462878 kubelet[2581]: W0424 23:35:40.462624 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:40.462878 kubelet[2581]: E0424 23:35:40.462633 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:35:40.469478 systemd[1]: Started cri-containerd-635e94f01c8ea5ee3907d913c136d3e804129c389f922e58380029ff992bed88.scope - libcontainer container 635e94f01c8ea5ee3907d913c136d3e804129c389f922e58380029ff992bed88. Apr 24 23:35:40.505828 containerd[1462]: time="2026-04-24T23:35:40.505787982Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-d98747f4b-nlzzk,Uid:5f0c11c3-4fbf-49a9-9e02-1bd577ad15f9,Namespace:calico-system,Attempt:0,} returns sandbox id \"635e94f01c8ea5ee3907d913c136d3e804129c389f922e58380029ff992bed88\"" Apr 24 23:35:40.506977 containerd[1462]: time="2026-04-24T23:35:40.506776335Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-ldsmt,Uid:8bcbed52-179b-4ac0-892c-2c0704a03414,Namespace:calico-system,Attempt:0,}" Apr 24 23:35:40.508440 containerd[1462]: time="2026-04-24T23:35:40.508372400Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.31.4\"" Apr 24 23:35:40.533791 containerd[1462]: time="2026-04-24T23:35:40.533501157Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Apr 24 23:35:40.533791 containerd[1462]: time="2026-04-24T23:35:40.533597748Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Apr 24 23:35:40.533791 containerd[1462]: time="2026-04-24T23:35:40.533620875Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 24 23:35:40.534571 containerd[1462]: time="2026-04-24T23:35:40.534147002Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 24 23:35:40.556325 systemd[1]: Started cri-containerd-fca809d1a85db678180d95d0f5c3ef2fc923086d38ac81e6fb3ac689266ee6e0.scope - libcontainer container fca809d1a85db678180d95d0f5c3ef2fc923086d38ac81e6fb3ac689266ee6e0. Apr 24 23:35:40.563738 kubelet[2581]: E0424 23:35:40.563379 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:40.563738 kubelet[2581]: W0424 23:35:40.563419 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:40.563738 kubelet[2581]: E0424 23:35:40.563455 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:35:40.564984 kubelet[2581]: E0424 23:35:40.564802 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:40.564984 kubelet[2581]: W0424 23:35:40.564919 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:40.564984 kubelet[2581]: E0424 23:35:40.564942 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:35:40.565523 kubelet[2581]: E0424 23:35:40.565293 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:40.565523 kubelet[2581]: W0424 23:35:40.565313 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:40.565523 kubelet[2581]: E0424 23:35:40.565328 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:35:40.565523 kubelet[2581]: E0424 23:35:40.565512 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:40.565523 kubelet[2581]: W0424 23:35:40.565521 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:40.565523 kubelet[2581]: E0424 23:35:40.565530 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:35:40.565872 kubelet[2581]: E0424 23:35:40.565849 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:40.565872 kubelet[2581]: W0424 23:35:40.565861 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:40.566343 kubelet[2581]: E0424 23:35:40.565873 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:35:40.566698 kubelet[2581]: E0424 23:35:40.566478 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:40.566698 kubelet[2581]: W0424 23:35:40.566493 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:40.566698 kubelet[2581]: E0424 23:35:40.566508 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:35:40.567367 kubelet[2581]: E0424 23:35:40.567346 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:40.567367 kubelet[2581]: W0424 23:35:40.567364 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:40.567441 kubelet[2581]: E0424 23:35:40.567379 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:35:40.567862 kubelet[2581]: E0424 23:35:40.567577 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:40.567862 kubelet[2581]: W0424 23:35:40.567589 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:40.567862 kubelet[2581]: E0424 23:35:40.567615 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:35:40.567862 kubelet[2581]: E0424 23:35:40.567854 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:40.567862 kubelet[2581]: W0424 23:35:40.567865 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:40.568027 kubelet[2581]: E0424 23:35:40.567877 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:35:40.568846 kubelet[2581]: E0424 23:35:40.568058 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:40.568846 kubelet[2581]: W0424 23:35:40.568073 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:40.568846 kubelet[2581]: E0424 23:35:40.568084 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:35:40.568846 kubelet[2581]: E0424 23:35:40.568264 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:40.568846 kubelet[2581]: W0424 23:35:40.568273 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:40.568846 kubelet[2581]: E0424 23:35:40.568282 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:35:40.568846 kubelet[2581]: E0424 23:35:40.568483 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:40.568846 kubelet[2581]: W0424 23:35:40.568492 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:40.568846 kubelet[2581]: E0424 23:35:40.568501 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:35:40.568846 kubelet[2581]: E0424 23:35:40.568734 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:40.569367 kubelet[2581]: W0424 23:35:40.568746 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:40.569367 kubelet[2581]: E0424 23:35:40.568759 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:35:40.569515 kubelet[2581]: E0424 23:35:40.569496 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:40.569550 kubelet[2581]: W0424 23:35:40.569516 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:40.569550 kubelet[2581]: E0424 23:35:40.569532 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:35:40.570364 kubelet[2581]: E0424 23:35:40.570344 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:40.570364 kubelet[2581]: W0424 23:35:40.570363 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:40.570493 kubelet[2581]: E0424 23:35:40.570377 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:35:40.570917 kubelet[2581]: E0424 23:35:40.570888 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:40.570917 kubelet[2581]: W0424 23:35:40.570906 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:40.570917 kubelet[2581]: E0424 23:35:40.570918 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:35:40.571253 kubelet[2581]: E0424 23:35:40.571238 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:40.571253 kubelet[2581]: W0424 23:35:40.571252 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:40.571311 kubelet[2581]: E0424 23:35:40.571262 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:35:40.571822 kubelet[2581]: E0424 23:35:40.571801 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:40.571822 kubelet[2581]: W0424 23:35:40.571821 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:40.571913 kubelet[2581]: E0424 23:35:40.571834 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:35:40.572233 kubelet[2581]: E0424 23:35:40.572218 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:40.572233 kubelet[2581]: W0424 23:35:40.572231 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:40.572312 kubelet[2581]: E0424 23:35:40.572242 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:35:40.573632 kubelet[2581]: E0424 23:35:40.573587 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:40.573719 kubelet[2581]: W0424 23:35:40.573636 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:40.573719 kubelet[2581]: E0424 23:35:40.573652 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:35:40.574158 kubelet[2581]: E0424 23:35:40.574016 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:40.574158 kubelet[2581]: W0424 23:35:40.574031 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:40.574158 kubelet[2581]: E0424 23:35:40.574042 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:35:40.574577 kubelet[2581]: E0424 23:35:40.574332 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:40.574577 kubelet[2581]: W0424 23:35:40.574343 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:40.574577 kubelet[2581]: E0424 23:35:40.574353 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:35:40.574577 kubelet[2581]: E0424 23:35:40.574578 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:40.574760 kubelet[2581]: W0424 23:35:40.574587 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:40.574760 kubelet[2581]: E0424 23:35:40.574596 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:35:40.575907 kubelet[2581]: E0424 23:35:40.575785 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:40.575907 kubelet[2581]: W0424 23:35:40.575834 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:40.575907 kubelet[2581]: E0424 23:35:40.575847 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:35:40.577929 kubelet[2581]: E0424 23:35:40.576294 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:40.577929 kubelet[2581]: W0424 23:35:40.576351 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:40.577929 kubelet[2581]: E0424 23:35:40.576365 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:35:40.593215 containerd[1462]: time="2026-04-24T23:35:40.592994316Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-ldsmt,Uid:8bcbed52-179b-4ac0-892c-2c0704a03414,Namespace:calico-system,Attempt:0,} returns sandbox id \"fca809d1a85db678180d95d0f5c3ef2fc923086d38ac81e6fb3ac689266ee6e0\"" Apr 24 23:35:40.594260 kubelet[2581]: E0424 23:35:40.594235 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:40.594389 kubelet[2581]: W0424 23:35:40.594373 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:40.594476 kubelet[2581]: E0424 23:35:40.594463 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:35:42.038129 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2432081293.mount: Deactivated successfully. 
Apr 24 23:35:42.111884 kubelet[2581]: E0424 23:35:42.111516 2581 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-drdkn" podUID="31ae9065-a5e7-49a3-95e7-0d8d7d4e7efa" Apr 24 23:35:42.994628 containerd[1462]: time="2026-04-24T23:35:42.993386697Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 24 23:35:42.994628 containerd[1462]: time="2026-04-24T23:35:42.994572438Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.31.4: active requests=0, bytes read=33865174" Apr 24 23:35:42.995555 containerd[1462]: time="2026-04-24T23:35:42.995483780Z" level=info msg="ImageCreate event name:\"sha256:e836e1dea560d4c477b347f1c93c245aec618361306b23eda1d6bb7665476182\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 24 23:35:42.998640 containerd[1462]: time="2026-04-24T23:35:42.998550621Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:d9396cfcd63dfcf72a65903042e473bb0bafc0cceb56bd71cd84078498a87130\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 24 23:35:43.000157 containerd[1462]: time="2026-04-24T23:35:42.999934019Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.31.4\" with image id \"sha256:e836e1dea560d4c477b347f1c93c245aec618361306b23eda1d6bb7665476182\", repo tag \"ghcr.io/flatcar/calico/typha:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:d9396cfcd63dfcf72a65903042e473bb0bafc0cceb56bd71cd84078498a87130\", size \"33865028\" in 2.491515845s" Apr 24 23:35:43.000157 containerd[1462]: time="2026-04-24T23:35:42.999994716Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.31.4\" returns image reference 
\"sha256:e836e1dea560d4c477b347f1c93c245aec618361306b23eda1d6bb7665476182\"" Apr 24 23:35:43.002390 containerd[1462]: time="2026-04-24T23:35:43.002274836Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\"" Apr 24 23:35:43.029640 containerd[1462]: time="2026-04-24T23:35:43.029458406Z" level=info msg="CreateContainer within sandbox \"635e94f01c8ea5ee3907d913c136d3e804129c389f922e58380029ff992bed88\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Apr 24 23:35:43.049698 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2643351435.mount: Deactivated successfully. Apr 24 23:35:43.054199 containerd[1462]: time="2026-04-24T23:35:43.054153015Z" level=info msg="CreateContainer within sandbox \"635e94f01c8ea5ee3907d913c136d3e804129c389f922e58380029ff992bed88\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"3e1aaa60d7bbcca76733cfc1dca8535b0972e2b5505b78c84a07afefd51828cb\"" Apr 24 23:35:43.056183 containerd[1462]: time="2026-04-24T23:35:43.056040972Z" level=info msg="StartContainer for \"3e1aaa60d7bbcca76733cfc1dca8535b0972e2b5505b78c84a07afefd51828cb\"" Apr 24 23:35:43.088699 systemd[1]: Started cri-containerd-3e1aaa60d7bbcca76733cfc1dca8535b0972e2b5505b78c84a07afefd51828cb.scope - libcontainer container 3e1aaa60d7bbcca76733cfc1dca8535b0972e2b5505b78c84a07afefd51828cb. 
Apr 24 23:35:43.128997 containerd[1462]: time="2026-04-24T23:35:43.128262806Z" level=info msg="StartContainer for \"3e1aaa60d7bbcca76733cfc1dca8535b0972e2b5505b78c84a07afefd51828cb\" returns successfully" Apr 24 23:35:43.310198 kubelet[2581]: E0424 23:35:43.310076 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:43.310198 kubelet[2581]: W0424 23:35:43.310104 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:43.310723 kubelet[2581]: E0424 23:35:43.310179 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:35:43.311076 kubelet[2581]: E0424 23:35:43.311046 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:43.311076 kubelet[2581]: W0424 23:35:43.311066 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:43.311203 kubelet[2581]: E0424 23:35:43.311085 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:35:43.312510 kubelet[2581]: E0424 23:35:43.312469 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:43.312510 kubelet[2581]: W0424 23:35:43.312490 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:43.312632 kubelet[2581]: E0424 23:35:43.312507 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:35:43.312746 kubelet[2581]: E0424 23:35:43.312728 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:43.312746 kubelet[2581]: W0424 23:35:43.312740 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:43.312816 kubelet[2581]: E0424 23:35:43.312750 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:35:43.312983 kubelet[2581]: E0424 23:35:43.312963 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:43.312983 kubelet[2581]: W0424 23:35:43.312977 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:43.313046 kubelet[2581]: E0424 23:35:43.312987 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:35:43.314277 kubelet[2581]: E0424 23:35:43.314251 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:43.314277 kubelet[2581]: W0424 23:35:43.314271 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:43.314458 kubelet[2581]: E0424 23:35:43.314289 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:35:43.314658 kubelet[2581]: E0424 23:35:43.314635 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:43.314658 kubelet[2581]: W0424 23:35:43.314651 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:43.314732 kubelet[2581]: E0424 23:35:43.314661 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:35:43.314880 kubelet[2581]: E0424 23:35:43.314864 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:43.314880 kubelet[2581]: W0424 23:35:43.314876 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:43.314955 kubelet[2581]: E0424 23:35:43.314886 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:35:43.315107 kubelet[2581]: E0424 23:35:43.315089 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:43.315107 kubelet[2581]: W0424 23:35:43.315101 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:43.315175 kubelet[2581]: E0424 23:35:43.315110 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:35:43.316260 kubelet[2581]: E0424 23:35:43.316237 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:43.316353 kubelet[2581]: W0424 23:35:43.316267 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:43.316353 kubelet[2581]: E0424 23:35:43.316281 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:35:43.316689 kubelet[2581]: E0424 23:35:43.316670 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:43.316755 kubelet[2581]: W0424 23:35:43.316718 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:43.316755 kubelet[2581]: E0424 23:35:43.316732 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:35:43.317054 kubelet[2581]: E0424 23:35:43.317027 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:43.317054 kubelet[2581]: W0424 23:35:43.317048 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:43.317185 kubelet[2581]: E0424 23:35:43.317059 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:35:43.318081 kubelet[2581]: E0424 23:35:43.318057 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:43.318081 kubelet[2581]: W0424 23:35:43.318075 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:43.318230 kubelet[2581]: E0424 23:35:43.318087 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:35:43.319010 kubelet[2581]: E0424 23:35:43.318968 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:43.319010 kubelet[2581]: W0424 23:35:43.318996 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:43.319010 kubelet[2581]: E0424 23:35:43.319011 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:35:43.319559 kubelet[2581]: E0424 23:35:43.319258 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:43.319559 kubelet[2581]: W0424 23:35:43.319273 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:43.319559 kubelet[2581]: E0424 23:35:43.319283 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:35:43.392724 kubelet[2581]: E0424 23:35:43.392688 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:43.392724 kubelet[2581]: W0424 23:35:43.392713 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:43.392724 kubelet[2581]: E0424 23:35:43.392733 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:35:43.393661 kubelet[2581]: E0424 23:35:43.393445 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:43.393661 kubelet[2581]: W0424 23:35:43.393466 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:43.393661 kubelet[2581]: E0424 23:35:43.393481 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:35:43.393804 kubelet[2581]: E0424 23:35:43.393785 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:43.393804 kubelet[2581]: W0424 23:35:43.393797 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:43.395306 kubelet[2581]: E0424 23:35:43.393813 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:35:43.395491 kubelet[2581]: E0424 23:35:43.395472 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:43.395491 kubelet[2581]: W0424 23:35:43.395489 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:43.395551 kubelet[2581]: E0424 23:35:43.395504 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:35:43.396365 kubelet[2581]: E0424 23:35:43.396342 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:43.396365 kubelet[2581]: W0424 23:35:43.396363 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:43.396514 kubelet[2581]: E0424 23:35:43.396376 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:35:43.396720 kubelet[2581]: E0424 23:35:43.396606 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:43.396720 kubelet[2581]: W0424 23:35:43.396615 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:43.396720 kubelet[2581]: E0424 23:35:43.396625 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:35:43.396872 kubelet[2581]: E0424 23:35:43.396858 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:43.396872 kubelet[2581]: W0424 23:35:43.396869 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:43.396936 kubelet[2581]: E0424 23:35:43.396878 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:35:43.397042 kubelet[2581]: E0424 23:35:43.397030 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:43.397042 kubelet[2581]: W0424 23:35:43.397040 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:43.397104 kubelet[2581]: E0424 23:35:43.397048 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:35:43.397466 kubelet[2581]: E0424 23:35:43.397445 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:43.397466 kubelet[2581]: W0424 23:35:43.397458 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:43.402149 kubelet[2581]: E0424 23:35:43.398383 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:35:43.402661 kubelet[2581]: E0424 23:35:43.402641 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:43.402661 kubelet[2581]: W0424 23:35:43.402660 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:43.402744 kubelet[2581]: E0424 23:35:43.402678 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:35:43.402965 kubelet[2581]: E0424 23:35:43.402952 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:43.403002 kubelet[2581]: W0424 23:35:43.402965 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:43.403002 kubelet[2581]: E0424 23:35:43.402976 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:35:43.403175 kubelet[2581]: E0424 23:35:43.403161 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:43.403175 kubelet[2581]: W0424 23:35:43.403173 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:43.403276 kubelet[2581]: E0424 23:35:43.403183 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:35:43.403349 kubelet[2581]: E0424 23:35:43.403334 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:43.403349 kubelet[2581]: W0424 23:35:43.403345 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:43.403521 kubelet[2581]: E0424 23:35:43.403354 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:35:43.403740 kubelet[2581]: E0424 23:35:43.403605 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:43.403740 kubelet[2581]: W0424 23:35:43.403616 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:43.403740 kubelet[2581]: E0424 23:35:43.403625 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:35:43.403986 kubelet[2581]: E0424 23:35:43.403972 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:43.403986 kubelet[2581]: W0424 23:35:43.403984 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:43.404051 kubelet[2581]: E0424 23:35:43.403994 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:35:43.404645 kubelet[2581]: E0424 23:35:43.404627 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:43.404645 kubelet[2581]: W0424 23:35:43.404643 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:43.404708 kubelet[2581]: E0424 23:35:43.404655 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:35:43.405231 kubelet[2581]: E0424 23:35:43.405084 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:43.405231 kubelet[2581]: W0424 23:35:43.405100 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:43.405231 kubelet[2581]: E0424 23:35:43.405125 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:35:43.405497 kubelet[2581]: E0424 23:35:43.405391 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:43.405497 kubelet[2581]: W0424 23:35:43.405404 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:43.405497 kubelet[2581]: E0424 23:35:43.405468 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:35:44.113152 kubelet[2581]: E0424 23:35:44.111430 2581 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-drdkn" podUID="31ae9065-a5e7-49a3-95e7-0d8d7d4e7efa" Apr 24 23:35:44.225695 kubelet[2581]: I0424 23:35:44.225651 2581 prober_manager.go:356] "Failed to trigger a manual run" probe="Readiness" Apr 24 23:35:44.325751 kubelet[2581]: E0424 23:35:44.325697 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:44.325751 kubelet[2581]: W0424 23:35:44.325725 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:44.325751 kubelet[2581]: E0424 23:35:44.325748 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:35:44.326339 kubelet[2581]: E0424 23:35:44.325985 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:44.326339 kubelet[2581]: W0424 23:35:44.325995 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:44.326339 kubelet[2581]: E0424 23:35:44.326006 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:35:44.326339 kubelet[2581]: E0424 23:35:44.326200 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:44.326339 kubelet[2581]: W0424 23:35:44.326209 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:44.326339 kubelet[2581]: E0424 23:35:44.326219 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:35:44.326621 kubelet[2581]: E0424 23:35:44.326374 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:44.326621 kubelet[2581]: W0424 23:35:44.326386 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:44.326621 kubelet[2581]: E0424 23:35:44.326395 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:35:44.326621 kubelet[2581]: E0424 23:35:44.326558 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:44.326621 kubelet[2581]: W0424 23:35:44.326565 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:44.326621 kubelet[2581]: E0424 23:35:44.326574 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:35:44.326882 kubelet[2581]: E0424 23:35:44.326730 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:44.326882 kubelet[2581]: W0424 23:35:44.326739 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:44.326882 kubelet[2581]: E0424 23:35:44.326747 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:35:44.326977 kubelet[2581]: E0424 23:35:44.326890 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:44.326977 kubelet[2581]: W0424 23:35:44.326898 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:44.326977 kubelet[2581]: E0424 23:35:44.326906 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:35:44.327389 kubelet[2581]: E0424 23:35:44.327095 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:44.327389 kubelet[2581]: W0424 23:35:44.327110 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:44.327389 kubelet[2581]: E0424 23:35:44.327141 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:35:44.327389 kubelet[2581]: E0424 23:35:44.327369 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:44.327389 kubelet[2581]: W0424 23:35:44.327379 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:44.327389 kubelet[2581]: E0424 23:35:44.327388 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:35:44.327631 kubelet[2581]: E0424 23:35:44.327537 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:44.327631 kubelet[2581]: W0424 23:35:44.327544 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:44.327631 kubelet[2581]: E0424 23:35:44.327552 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:35:44.327934 kubelet[2581]: E0424 23:35:44.327734 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:44.327934 kubelet[2581]: W0424 23:35:44.327744 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:44.327934 kubelet[2581]: E0424 23:35:44.327753 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:35:44.328037 kubelet[2581]: E0424 23:35:44.327995 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:44.328037 kubelet[2581]: W0424 23:35:44.328004 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:44.328037 kubelet[2581]: E0424 23:35:44.328013 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:35:44.328688 kubelet[2581]: E0424 23:35:44.328249 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:44.328688 kubelet[2581]: W0424 23:35:44.328259 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:44.328688 kubelet[2581]: E0424 23:35:44.328269 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:35:44.328688 kubelet[2581]: E0424 23:35:44.328431 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:44.328688 kubelet[2581]: W0424 23:35:44.328438 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:44.328688 kubelet[2581]: E0424 23:35:44.328446 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:35:44.328688 kubelet[2581]: E0424 23:35:44.328575 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:44.328688 kubelet[2581]: W0424 23:35:44.328584 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:44.328688 kubelet[2581]: E0424 23:35:44.328592 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:35:44.405824 kubelet[2581]: E0424 23:35:44.405717 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:44.405824 kubelet[2581]: W0424 23:35:44.405744 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:44.405824 kubelet[2581]: E0424 23:35:44.405767 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:35:44.408719 kubelet[2581]: E0424 23:35:44.408332 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:44.408719 kubelet[2581]: W0424 23:35:44.408356 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:44.408719 kubelet[2581]: E0424 23:35:44.408374 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:35:44.409191 kubelet[2581]: E0424 23:35:44.409060 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:44.409191 kubelet[2581]: W0424 23:35:44.409079 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:44.409191 kubelet[2581]: E0424 23:35:44.409094 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:35:44.409898 kubelet[2581]: E0424 23:35:44.409748 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:44.409898 kubelet[2581]: W0424 23:35:44.409764 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:44.409898 kubelet[2581]: E0424 23:35:44.409777 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:35:44.410318 kubelet[2581]: E0424 23:35:44.410234 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:44.410318 kubelet[2581]: W0424 23:35:44.410256 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:44.410318 kubelet[2581]: E0424 23:35:44.410269 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:35:44.411519 kubelet[2581]: E0424 23:35:44.411195 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:44.411519 kubelet[2581]: W0424 23:35:44.411213 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:44.411519 kubelet[2581]: E0424 23:35:44.411228 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:35:44.412241 kubelet[2581]: E0424 23:35:44.412220 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:44.412341 kubelet[2581]: W0424 23:35:44.412311 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:44.412341 kubelet[2581]: E0424 23:35:44.412329 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:35:44.413378 kubelet[2581]: E0424 23:35:44.413212 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:44.413378 kubelet[2581]: W0424 23:35:44.413231 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:44.413378 kubelet[2581]: E0424 23:35:44.413243 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:35:44.414080 kubelet[2581]: E0424 23:35:44.414043 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:44.414080 kubelet[2581]: W0424 23:35:44.414056 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:44.414080 kubelet[2581]: E0424 23:35:44.414067 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:35:44.415420 kubelet[2581]: E0424 23:35:44.415228 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:44.415420 kubelet[2581]: W0424 23:35:44.415242 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:44.415420 kubelet[2581]: E0424 23:35:44.415255 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:35:44.416066 kubelet[2581]: E0424 23:35:44.415935 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:44.416066 kubelet[2581]: W0424 23:35:44.415949 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:44.416066 kubelet[2581]: E0424 23:35:44.415960 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:35:44.416607 kubelet[2581]: E0424 23:35:44.416507 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:44.416607 kubelet[2581]: W0424 23:35:44.416519 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:44.416607 kubelet[2581]: E0424 23:35:44.416545 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:35:44.417219 kubelet[2581]: E0424 23:35:44.417205 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:44.417303 kubelet[2581]: W0424 23:35:44.417278 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:44.417303 kubelet[2581]: E0424 23:35:44.417293 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:35:44.418332 kubelet[2581]: E0424 23:35:44.417961 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:44.418332 kubelet[2581]: W0424 23:35:44.417974 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:44.418332 kubelet[2581]: E0424 23:35:44.417985 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:35:44.418626 kubelet[2581]: E0424 23:35:44.418491 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:44.418626 kubelet[2581]: W0424 23:35:44.418503 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:44.418626 kubelet[2581]: E0424 23:35:44.418513 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:35:44.420164 kubelet[2581]: E0424 23:35:44.419990 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:44.420164 kubelet[2581]: W0424 23:35:44.420004 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:44.420164 kubelet[2581]: E0424 23:35:44.420014 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:35:44.420459 kubelet[2581]: E0424 23:35:44.420413 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:44.420459 kubelet[2581]: W0424 23:35:44.420425 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:44.420459 kubelet[2581]: E0424 23:35:44.420459 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Apr 24 23:35:44.420870 kubelet[2581]: E0424 23:35:44.420775 2581 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Apr 24 23:35:44.420870 kubelet[2581]: W0424 23:35:44.420788 2581 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Apr 24 23:35:44.420870 kubelet[2581]: E0424 23:35:44.420798 2581 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Apr 24 23:35:44.471482 containerd[1462]: time="2026-04-24T23:35:44.471411547Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 24 23:35:44.473148 containerd[1462]: time="2026-04-24T23:35:44.472988719Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4: active requests=0, bytes read=4457682" Apr 24 23:35:44.474439 containerd[1462]: time="2026-04-24T23:35:44.474360678Z" level=info msg="ImageCreate event name:\"sha256:449a6463eaa02e13b190ef7c4057191febcc65ab9418bae3bc0995f5bce65798\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 24 23:35:44.478804 containerd[1462]: time="2026-04-24T23:35:44.477880839Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:5fa3492ac4dfef9cc34fe70a51289118e1f715a89133ea730eef81ad789dadbc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 24 23:35:44.478804 containerd[1462]: time="2026-04-24T23:35:44.478672806Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\" with image id \"sha256:449a6463eaa02e13b190ef7c4057191febcc65ab9418bae3bc0995f5bce65798\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:5fa3492ac4dfef9cc34fe70a51289118e1f715a89133ea730eef81ad789dadbc\", size \"5855167\" in 1.476287941s" Apr 24 23:35:44.478804 containerd[1462]: time="2026-04-24T23:35:44.478715818Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\" returns image reference \"sha256:449a6463eaa02e13b190ef7c4057191febcc65ab9418bae3bc0995f5bce65798\"" Apr 24 23:35:44.485619 containerd[1462]: time="2026-04-24T23:35:44.485582534Z" level=info msg="CreateContainer within sandbox \"fca809d1a85db678180d95d0f5c3ef2fc923086d38ac81e6fb3ac689266ee6e0\" for container 
&ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Apr 24 23:35:44.501592 containerd[1462]: time="2026-04-24T23:35:44.501546110Z" level=info msg="CreateContainer within sandbox \"fca809d1a85db678180d95d0f5c3ef2fc923086d38ac81e6fb3ac689266ee6e0\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"0b9d0e14b60eb5a70200398dd6347012fd64bd316f5bac1d7ef286f6064fa23d\"" Apr 24 23:35:44.502773 containerd[1462]: time="2026-04-24T23:35:44.502706134Z" level=info msg="StartContainer for \"0b9d0e14b60eb5a70200398dd6347012fd64bd316f5bac1d7ef286f6064fa23d\"" Apr 24 23:35:44.537334 systemd[1]: Started cri-containerd-0b9d0e14b60eb5a70200398dd6347012fd64bd316f5bac1d7ef286f6064fa23d.scope - libcontainer container 0b9d0e14b60eb5a70200398dd6347012fd64bd316f5bac1d7ef286f6064fa23d. Apr 24 23:35:44.573837 containerd[1462]: time="2026-04-24T23:35:44.573796412Z" level=info msg="StartContainer for \"0b9d0e14b60eb5a70200398dd6347012fd64bd316f5bac1d7ef286f6064fa23d\" returns successfully" Apr 24 23:35:44.593289 systemd[1]: cri-containerd-0b9d0e14b60eb5a70200398dd6347012fd64bd316f5bac1d7ef286f6064fa23d.scope: Deactivated successfully. Apr 24 23:35:44.621462 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-0b9d0e14b60eb5a70200398dd6347012fd64bd316f5bac1d7ef286f6064fa23d-rootfs.mount: Deactivated successfully. 
Apr 24 23:35:44.777656 containerd[1462]: time="2026-04-24T23:35:44.777580725Z" level=info msg="shim disconnected" id=0b9d0e14b60eb5a70200398dd6347012fd64bd316f5bac1d7ef286f6064fa23d namespace=k8s.io Apr 24 23:35:44.777656 containerd[1462]: time="2026-04-24T23:35:44.777655464Z" level=warning msg="cleaning up after shim disconnected" id=0b9d0e14b60eb5a70200398dd6347012fd64bd316f5bac1d7ef286f6064fa23d namespace=k8s.io Apr 24 23:35:44.777965 containerd[1462]: time="2026-04-24T23:35:44.777669028Z" level=info msg="cleaning up dead shim" namespace=k8s.io Apr 24 23:35:45.233165 containerd[1462]: time="2026-04-24T23:35:45.233102269Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.31.4\"" Apr 24 23:35:45.262936 kubelet[2581]: I0424 23:35:45.262477 2581 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="calico-system/calico-typha-d98747f4b-nlzzk" podStartSLOduration=2.768883635 podStartE2EDuration="5.262462687s" podCreationTimestamp="2026-04-24 23:35:40 +0000 UTC" firstStartedPulling="2026-04-24 23:35:40.507789176 +0000 UTC m=+20.551347492" lastFinishedPulling="2026-04-24 23:35:43.001368188 +0000 UTC m=+23.044926544" observedRunningTime="2026-04-24 23:35:43.249336949 +0000 UTC m=+23.292895265" watchObservedRunningTime="2026-04-24 23:35:45.262462687 +0000 UTC m=+25.306021043" Apr 24 23:35:46.116478 kubelet[2581]: E0424 23:35:46.115854 2581 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-drdkn" podUID="31ae9065-a5e7-49a3-95e7-0d8d7d4e7efa" Apr 24 23:35:46.357783 kubelet[2581]: I0424 23:35:46.356967 2581 prober_manager.go:356] "Failed to trigger a manual run" probe="Readiness" Apr 24 23:35:48.112451 kubelet[2581]: E0424 23:35:48.112414 2581 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-drdkn" podUID="31ae9065-a5e7-49a3-95e7-0d8d7d4e7efa" Apr 24 23:35:50.111808 kubelet[2581]: E0424 23:35:50.111304 2581 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-drdkn" podUID="31ae9065-a5e7-49a3-95e7-0d8d7d4e7efa" Apr 24 23:35:50.368490 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2409690236.mount: Deactivated successfully. Apr 24 23:35:50.400253 containerd[1462]: time="2026-04-24T23:35:50.400165504Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 24 23:35:50.401964 containerd[1462]: time="2026-04-24T23:35:50.401912136Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.31.4: active requests=0, bytes read=153921674" Apr 24 23:35:50.403251 containerd[1462]: time="2026-04-24T23:35:50.403193634Z" level=info msg="ImageCreate event name:\"sha256:27be54f2b9e47d96c7e9e5ad16e26ec298c1829f31885c81a622d50472c8ac97\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 24 23:35:50.407242 containerd[1462]: time="2026-04-24T23:35:50.406409202Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:22b9d32dc7480c96272121d5682d53424c6e58653c60fa869b61a1758a11d77f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 24 23:35:50.407242 containerd[1462]: time="2026-04-24T23:35:50.407088259Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.31.4\" with image id \"sha256:27be54f2b9e47d96c7e9e5ad16e26ec298c1829f31885c81a622d50472c8ac97\", repo tag \"ghcr.io/flatcar/calico/node:v3.31.4\", repo digest 
\"ghcr.io/flatcar/calico/node@sha256:22b9d32dc7480c96272121d5682d53424c6e58653c60fa869b61a1758a11d77f\", size \"153921536\" in 5.173919135s" Apr 24 23:35:50.407242 containerd[1462]: time="2026-04-24T23:35:50.407131828Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.31.4\" returns image reference \"sha256:27be54f2b9e47d96c7e9e5ad16e26ec298c1829f31885c81a622d50472c8ac97\"" Apr 24 23:35:50.416047 containerd[1462]: time="2026-04-24T23:35:50.415963048Z" level=info msg="CreateContainer within sandbox \"fca809d1a85db678180d95d0f5c3ef2fc923086d38ac81e6fb3ac689266ee6e0\" for container &ContainerMetadata{Name:ebpf-bootstrap,Attempt:0,}" Apr 24 23:35:50.436387 containerd[1462]: time="2026-04-24T23:35:50.436253019Z" level=info msg="CreateContainer within sandbox \"fca809d1a85db678180d95d0f5c3ef2fc923086d38ac81e6fb3ac689266ee6e0\" for &ContainerMetadata{Name:ebpf-bootstrap,Attempt:0,} returns container id \"c7decc79e305fcfd7035f4c2dc78c41dd228d0192f2e061a4b28eacdcacddf29\"" Apr 24 23:35:50.439259 containerd[1462]: time="2026-04-24T23:35:50.437098469Z" level=info msg="StartContainer for \"c7decc79e305fcfd7035f4c2dc78c41dd228d0192f2e061a4b28eacdcacddf29\"" Apr 24 23:35:50.491523 systemd[1]: Started cri-containerd-c7decc79e305fcfd7035f4c2dc78c41dd228d0192f2e061a4b28eacdcacddf29.scope - libcontainer container c7decc79e305fcfd7035f4c2dc78c41dd228d0192f2e061a4b28eacdcacddf29. Apr 24 23:35:50.531165 containerd[1462]: time="2026-04-24T23:35:50.528727502Z" level=info msg="StartContainer for \"c7decc79e305fcfd7035f4c2dc78c41dd228d0192f2e061a4b28eacdcacddf29\" returns successfully" Apr 24 23:35:50.646723 systemd[1]: cri-containerd-c7decc79e305fcfd7035f4c2dc78c41dd228d0192f2e061a4b28eacdcacddf29.scope: Deactivated successfully. 
Apr 24 23:35:50.885827 containerd[1462]: time="2026-04-24T23:35:50.885737997Z" level=info msg="shim disconnected" id=c7decc79e305fcfd7035f4c2dc78c41dd228d0192f2e061a4b28eacdcacddf29 namespace=k8s.io Apr 24 23:35:50.885827 containerd[1462]: time="2026-04-24T23:35:50.885799849Z" level=warning msg="cleaning up after shim disconnected" id=c7decc79e305fcfd7035f4c2dc78c41dd228d0192f2e061a4b28eacdcacddf29 namespace=k8s.io Apr 24 23:35:50.885827 containerd[1462]: time="2026-04-24T23:35:50.885809331Z" level=info msg="cleaning up dead shim" namespace=k8s.io Apr 24 23:35:51.254419 containerd[1462]: time="2026-04-24T23:35:51.253779360Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.31.4\"" Apr 24 23:35:51.369705 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-c7decc79e305fcfd7035f4c2dc78c41dd228d0192f2e061a4b28eacdcacddf29-rootfs.mount: Deactivated successfully. Apr 24 23:35:52.111760 kubelet[2581]: E0424 23:35:52.111694 2581 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-drdkn" podUID="31ae9065-a5e7-49a3-95e7-0d8d7d4e7efa" Apr 24 23:35:53.971990 containerd[1462]: time="2026-04-24T23:35:53.971920522Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 24 23:35:53.973686 containerd[1462]: time="2026-04-24T23:35:53.973597303Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.31.4: active requests=0, bytes read=66009216" Apr 24 23:35:53.974916 containerd[1462]: time="2026-04-24T23:35:53.974614125Z" level=info msg="ImageCreate event name:\"sha256:c10bed152367fad8c19e9400f12b748d6fbc20498086983df13e70e36f24511b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 24 23:35:53.979164 containerd[1462]: 
time="2026-04-24T23:35:53.977987450Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:f1c5d9a6df01061c5faec4c4b59fb9ba69f8f5164b51e01ea8daa8e373111a04\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 24 23:35:53.981099 containerd[1462]: time="2026-04-24T23:35:53.981039557Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.31.4\" with image id \"sha256:c10bed152367fad8c19e9400f12b748d6fbc20498086983df13e70e36f24511b\", repo tag \"ghcr.io/flatcar/calico/cni:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:f1c5d9a6df01061c5faec4c4b59fb9ba69f8f5164b51e01ea8daa8e373111a04\", size \"67406741\" in 2.727213829s" Apr 24 23:35:53.981099 containerd[1462]: time="2026-04-24T23:35:53.981097647Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.31.4\" returns image reference \"sha256:c10bed152367fad8c19e9400f12b748d6fbc20498086983df13e70e36f24511b\"" Apr 24 23:35:53.997370 containerd[1462]: time="2026-04-24T23:35:53.997163007Z" level=info msg="CreateContainer within sandbox \"fca809d1a85db678180d95d0f5c3ef2fc923086d38ac81e6fb3ac689266ee6e0\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Apr 24 23:35:54.014650 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3298306820.mount: Deactivated successfully. 
Apr 24 23:35:54.016641 containerd[1462]: time="2026-04-24T23:35:54.016560183Z" level=info msg="CreateContainer within sandbox \"fca809d1a85db678180d95d0f5c3ef2fc923086d38ac81e6fb3ac689266ee6e0\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"ebf7ffed2bcf21c733e45cf0f79fe0fbd529dde4fff85bf3e8675223779f8762\"" Apr 24 23:35:54.018213 containerd[1462]: time="2026-04-24T23:35:54.017537632Z" level=info msg="StartContainer for \"ebf7ffed2bcf21c733e45cf0f79fe0fbd529dde4fff85bf3e8675223779f8762\"" Apr 24 23:35:54.059584 systemd[1]: Started cri-containerd-ebf7ffed2bcf21c733e45cf0f79fe0fbd529dde4fff85bf3e8675223779f8762.scope - libcontainer container ebf7ffed2bcf21c733e45cf0f79fe0fbd529dde4fff85bf3e8675223779f8762. Apr 24 23:35:54.111006 kubelet[2581]: E0424 23:35:54.110948 2581 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-drdkn" podUID="31ae9065-a5e7-49a3-95e7-0d8d7d4e7efa" Apr 24 23:35:54.131951 containerd[1462]: time="2026-04-24T23:35:54.131878582Z" level=info msg="StartContainer for \"ebf7ffed2bcf21c733e45cf0f79fe0fbd529dde4fff85bf3e8675223779f8762\" returns successfully" Apr 24 23:35:54.748093 containerd[1462]: time="2026-04-24T23:35:54.748001485Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Apr 24 23:35:54.753247 systemd[1]: cri-containerd-ebf7ffed2bcf21c733e45cf0f79fe0fbd529dde4fff85bf3e8675223779f8762.scope: Deactivated successfully. 
Apr 24 23:35:54.781786 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-ebf7ffed2bcf21c733e45cf0f79fe0fbd529dde4fff85bf3e8675223779f8762-rootfs.mount: Deactivated successfully. Apr 24 23:35:54.796009 kubelet[2581]: I0424 23:35:54.795975 2581 kubelet_node_status.go:427] "Fast updating node status as it just became ready" Apr 24 23:35:54.901004 containerd[1462]: time="2026-04-24T23:35:54.900924579Z" level=info msg="shim disconnected" id=ebf7ffed2bcf21c733e45cf0f79fe0fbd529dde4fff85bf3e8675223779f8762 namespace=k8s.io Apr 24 23:35:54.901004 containerd[1462]: time="2026-04-24T23:35:54.900997471Z" level=warning msg="cleaning up after shim disconnected" id=ebf7ffed2bcf21c733e45cf0f79fe0fbd529dde4fff85bf3e8675223779f8762 namespace=k8s.io Apr 24 23:35:54.901004 containerd[1462]: time="2026-04-24T23:35:54.901007033Z" level=info msg="cleaning up dead shim" namespace=k8s.io Apr 24 23:35:54.906181 systemd[1]: Created slice kubepods-burstable-pod54520bb9_f644_497d_8cfe_b9f4367c3def.slice - libcontainer container kubepods-burstable-pod54520bb9_f644_497d_8cfe_b9f4367c3def.slice. Apr 24 23:35:54.920359 systemd[1]: Created slice kubepods-besteffort-pod6dca57dd_795d_4761_9e86_5a637e03b75c.slice - libcontainer container kubepods-besteffort-pod6dca57dd_795d_4761_9e86_5a637e03b75c.slice. Apr 24 23:35:54.935791 systemd[1]: Created slice kubepods-besteffort-poda67a5a23_28c6_44c6_8e44_b199e7391183.slice - libcontainer container kubepods-besteffort-poda67a5a23_28c6_44c6_8e44_b199e7391183.slice. 
Apr 24 23:35:54.945462 containerd[1462]: time="2026-04-24T23:35:54.945402942Z" level=warning msg="cleanup warnings time=\"2026-04-24T23:35:54Z\" level=warning msg=\"failed to remove runc container\" error=\"runc did not terminate successfully: exit status 255: \" runtime=io.containerd.runc.v2\n" namespace=k8s.io
Apr 24 23:35:54.950509 systemd[1]: Created slice kubepods-besteffort-pod318078f6_c2fc_4219_8c64_82b649d6416b.slice - libcontainer container kubepods-besteffort-pod318078f6_c2fc_4219_8c64_82b649d6416b.slice.
Apr 24 23:35:54.959769 systemd[1]: Created slice kubepods-burstable-pod7e35c05e_9f62_47d8_9c32_7f4c38496c65.slice - libcontainer container kubepods-burstable-pod7e35c05e_9f62_47d8_9c32_7f4c38496c65.slice.
Apr 24 23:35:54.968842 systemd[1]: Created slice kubepods-besteffort-podad82956a_a9d5_482d_9feb_ead05df18123.slice - libcontainer container kubepods-besteffort-podad82956a_a9d5_482d_9feb_ead05df18123.slice.
Apr 24 23:35:54.978644 systemd[1]: Created slice kubepods-besteffort-pod97bad9ab_01a6_4cd6_89bf_ca7dc9310125.slice - libcontainer container kubepods-besteffort-pod97bad9ab_01a6_4cd6_89bf_ca7dc9310125.slice.
Apr 24 23:35:54.990330 kubelet[2581]: I0424 23:35:54.990216 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q98rz\" (UniqueName: \"kubernetes.io/projected/6dca57dd-795d-4761-9e86-5a637e03b75c-kube-api-access-q98rz\") pod \"calico-apiserver-ccf7f6874-6drmn\" (UID: \"6dca57dd-795d-4761-9e86-5a637e03b75c\") " pod="calico-system/calico-apiserver-ccf7f6874-6drmn"
Apr 24 23:35:54.990625 kubelet[2581]: I0424 23:35:54.990332 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/a67a5a23-28c6-44c6-8e44-b199e7391183-goldmane-key-pair\") pod \"goldmane-9f7667bb8-9xhf6\" (UID: \"a67a5a23-28c6-44c6-8e44-b199e7391183\") " pod="calico-system/goldmane-9f7667bb8-9xhf6"
Apr 24 23:35:54.990625 kubelet[2581]: I0424 23:35:54.990401 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhlzz\" (UniqueName: \"kubernetes.io/projected/54520bb9-f644-497d-8cfe-b9f4367c3def-kube-api-access-nhlzz\") pod \"coredns-7d764666f9-lnj9t\" (UID: \"54520bb9-f644-497d-8cfe-b9f4367c3def\") " pod="kube-system/coredns-7d764666f9-lnj9t"
Apr 24 23:35:54.991000 kubelet[2581]: I0424 23:35:54.990763 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a67a5a23-28c6-44c6-8e44-b199e7391183-config\") pod \"goldmane-9f7667bb8-9xhf6\" (UID: \"a67a5a23-28c6-44c6-8e44-b199e7391183\") " pod="calico-system/goldmane-9f7667bb8-9xhf6"
Apr 24 23:35:54.991000 kubelet[2581]: I0424 23:35:54.990885 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/6dca57dd-795d-4761-9e86-5a637e03b75c-calico-apiserver-certs\") pod \"calico-apiserver-ccf7f6874-6drmn\" (UID: \"6dca57dd-795d-4761-9e86-5a637e03b75c\") " pod="calico-system/calico-apiserver-ccf7f6874-6drmn"
Apr 24 23:35:54.991000 kubelet[2581]: I0424 23:35:54.990970 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a67a5a23-28c6-44c6-8e44-b199e7391183-goldmane-ca-bundle\") pod \"goldmane-9f7667bb8-9xhf6\" (UID: \"a67a5a23-28c6-44c6-8e44-b199e7391183\") " pod="calico-system/goldmane-9f7667bb8-9xhf6"
Apr 24 23:35:54.991407 kubelet[2581]: I0424 23:35:54.991256 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctmmv\" (UniqueName: \"kubernetes.io/projected/a67a5a23-28c6-44c6-8e44-b199e7391183-kube-api-access-ctmmv\") pod \"goldmane-9f7667bb8-9xhf6\" (UID: \"a67a5a23-28c6-44c6-8e44-b199e7391183\") " pod="calico-system/goldmane-9f7667bb8-9xhf6"
Apr 24 23:35:54.991407 kubelet[2581]: I0424 23:35:54.991336 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/54520bb9-f644-497d-8cfe-b9f4367c3def-config-volume\") pod \"coredns-7d764666f9-lnj9t\" (UID: \"54520bb9-f644-497d-8cfe-b9f4367c3def\") " pod="kube-system/coredns-7d764666f9-lnj9t"
Apr 24 23:35:55.094513 kubelet[2581]: I0424 23:35:55.092510 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/318078f6-c2fc-4219-8c64-82b649d6416b-nginx-config\") pod \"whisker-7d4c795b57-vskxf\" (UID: \"318078f6-c2fc-4219-8c64-82b649d6416b\") " pod="calico-system/whisker-7d4c795b57-vskxf"
Apr 24 23:35:55.094513 kubelet[2581]: I0424 23:35:55.092587 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/318078f6-c2fc-4219-8c64-82b649d6416b-whisker-backend-key-pair\") pod \"whisker-7d4c795b57-vskxf\" (UID: \"318078f6-c2fc-4219-8c64-82b649d6416b\") " pod="calico-system/whisker-7d4c795b57-vskxf"
Apr 24 23:35:55.094513 kubelet[2581]: I0424 23:35:55.092630 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ad82956a-a9d5-482d-9feb-ead05df18123-tigera-ca-bundle\") pod \"calico-kube-controllers-54d98b476c-cxbz2\" (UID: \"ad82956a-a9d5-482d-9feb-ead05df18123\") " pod="calico-system/calico-kube-controllers-54d98b476c-cxbz2"
Apr 24 23:35:55.094513 kubelet[2581]: I0424 23:35:55.092665 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-557r6\" (UniqueName: \"kubernetes.io/projected/7e35c05e-9f62-47d8-9c32-7f4c38496c65-kube-api-access-557r6\") pod \"coredns-7d764666f9-l2x8h\" (UID: \"7e35c05e-9f62-47d8-9c32-7f4c38496c65\") " pod="kube-system/coredns-7d764666f9-l2x8h"
Apr 24 23:35:55.094513 kubelet[2581]: I0424 23:35:55.092763 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/318078f6-c2fc-4219-8c64-82b649d6416b-whisker-ca-bundle\") pod \"whisker-7d4c795b57-vskxf\" (UID: \"318078f6-c2fc-4219-8c64-82b649d6416b\") " pod="calico-system/whisker-7d4c795b57-vskxf"
Apr 24 23:35:55.095030 kubelet[2581]: I0424 23:35:55.092793 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mg657\" (UniqueName: \"kubernetes.io/projected/ad82956a-a9d5-482d-9feb-ead05df18123-kube-api-access-mg657\") pod \"calico-kube-controllers-54d98b476c-cxbz2\" (UID: \"ad82956a-a9d5-482d-9feb-ead05df18123\") " pod="calico-system/calico-kube-controllers-54d98b476c-cxbz2"
Apr 24 23:35:55.095030 kubelet[2581]: I0424 23:35:55.092876 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqrz8\" (UniqueName: \"kubernetes.io/projected/97bad9ab-01a6-4cd6-89bf-ca7dc9310125-kube-api-access-lqrz8\") pod \"calico-apiserver-ccf7f6874-fxzbx\" (UID: \"97bad9ab-01a6-4cd6-89bf-ca7dc9310125\") " pod="calico-system/calico-apiserver-ccf7f6874-fxzbx"
Apr 24 23:35:55.095030 kubelet[2581]: I0424 23:35:55.093739 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9lkt\" (UniqueName: \"kubernetes.io/projected/318078f6-c2fc-4219-8c64-82b649d6416b-kube-api-access-g9lkt\") pod \"whisker-7d4c795b57-vskxf\" (UID: \"318078f6-c2fc-4219-8c64-82b649d6416b\") " pod="calico-system/whisker-7d4c795b57-vskxf"
Apr 24 23:35:55.095030 kubelet[2581]: I0424 23:35:55.093783 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/97bad9ab-01a6-4cd6-89bf-ca7dc9310125-calico-apiserver-certs\") pod \"calico-apiserver-ccf7f6874-fxzbx\" (UID: \"97bad9ab-01a6-4cd6-89bf-ca7dc9310125\") " pod="calico-system/calico-apiserver-ccf7f6874-fxzbx"
Apr 24 23:35:55.095030 kubelet[2581]: I0424 23:35:55.093967 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7e35c05e-9f62-47d8-9c32-7f4c38496c65-config-volume\") pod \"coredns-7d764666f9-l2x8h\" (UID: \"7e35c05e-9f62-47d8-9c32-7f4c38496c65\") " pod="kube-system/coredns-7d764666f9-l2x8h"
Apr 24 23:35:55.217908 containerd[1462]: time="2026-04-24T23:35:55.217412080Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7d764666f9-lnj9t,Uid:54520bb9-f644-497d-8cfe-b9f4367c3def,Namespace:kube-system,Attempt:0,}"
Apr 24 23:35:55.245552 containerd[1462]: time="2026-04-24T23:35:55.245479076Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-ccf7f6874-6drmn,Uid:6dca57dd-795d-4761-9e86-5a637e03b75c,Namespace:calico-system,Attempt:0,}"
Apr 24 23:35:55.246362 containerd[1462]: time="2026-04-24T23:35:55.246314255Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-9f7667bb8-9xhf6,Uid:a67a5a23-28c6-44c6-8e44-b199e7391183,Namespace:calico-system,Attempt:0,}"
Apr 24 23:35:55.258016 containerd[1462]: time="2026-04-24T23:35:55.257972277Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7d4c795b57-vskxf,Uid:318078f6-c2fc-4219-8c64-82b649d6416b,Namespace:calico-system,Attempt:0,}"
Apr 24 23:35:55.269318 containerd[1462]: time="2026-04-24T23:35:55.269271800Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7d764666f9-l2x8h,Uid:7e35c05e-9f62-47d8-9c32-7f4c38496c65,Namespace:kube-system,Attempt:0,}"
Apr 24 23:35:55.279253 containerd[1462]: time="2026-04-24T23:35:55.278799147Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-54d98b476c-cxbz2,Uid:ad82956a-a9d5-482d-9feb-ead05df18123,Namespace:calico-system,Attempt:0,}"
Apr 24 23:35:55.295005 containerd[1462]: time="2026-04-24T23:35:55.294955999Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-ccf7f6874-fxzbx,Uid:97bad9ab-01a6-4cd6-89bf-ca7dc9310125,Namespace:calico-system,Attempt:0,}"
Apr 24 23:35:55.305893 containerd[1462]: time="2026-04-24T23:35:55.305543763Z" level=info msg="CreateContainer within sandbox \"fca809d1a85db678180d95d0f5c3ef2fc923086d38ac81e6fb3ac689266ee6e0\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}"
Apr 24 23:35:55.390619 containerd[1462]: time="2026-04-24T23:35:55.389964588Z" level=info msg="CreateContainer within sandbox \"fca809d1a85db678180d95d0f5c3ef2fc923086d38ac81e6fb3ac689266ee6e0\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"64365ef6c453321921fee54c1badc759e4738c4a10e94e6e0b6985e1b09095c9\""
Apr 24 23:35:55.396977 containerd[1462]: time="2026-04-24T23:35:55.396846774Z" level=info msg="StartContainer for \"64365ef6c453321921fee54c1badc759e4738c4a10e94e6e0b6985e1b09095c9\""
Apr 24 23:35:55.451512 containerd[1462]: time="2026-04-24T23:35:55.451346934Z" level=error msg="Failed to destroy network for sandbox \"de9656bfe97f83441058c328d0d39fed1ee166d4fbc015e3ce2c151595256b72\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Apr 24 23:35:55.453350 containerd[1462]: time="2026-04-24T23:35:55.453209324Z" level=error msg="encountered an error cleaning up failed sandbox \"de9656bfe97f83441058c328d0d39fed1ee166d4fbc015e3ce2c151595256b72\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Apr 24 23:35:55.453350 containerd[1462]: time="2026-04-24T23:35:55.453314462Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7d764666f9-lnj9t,Uid:54520bb9-f644-497d-8cfe-b9f4367c3def,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"de9656bfe97f83441058c328d0d39fed1ee166d4fbc015e3ce2c151595256b72\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Apr 24 23:35:55.453759 kubelet[2581]: E0424 23:35:55.453716 2581 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"de9656bfe97f83441058c328d0d39fed1ee166d4fbc015e3ce2c151595256b72\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Apr 24 23:35:55.454491 kubelet[2581]: E0424 23:35:55.453785 2581 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"de9656bfe97f83441058c328d0d39fed1ee166d4fbc015e3ce2c151595256b72\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7d764666f9-lnj9t"
Apr 24 23:35:55.454491 kubelet[2581]: E0424 23:35:55.453810 2581 kuberuntime_manager.go:1558] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"de9656bfe97f83441058c328d0d39fed1ee166d4fbc015e3ce2c151595256b72\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7d764666f9-lnj9t"
Apr 24 23:35:55.454491 kubelet[2581]: E0424 23:35:55.453868 2581 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7d764666f9-lnj9t_kube-system(54520bb9-f644-497d-8cfe-b9f4367c3def)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7d764666f9-lnj9t_kube-system(54520bb9-f644-497d-8cfe-b9f4367c3def)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"de9656bfe97f83441058c328d0d39fed1ee166d4fbc015e3ce2c151595256b72\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7d764666f9-lnj9t" podUID="54520bb9-f644-497d-8cfe-b9f4367c3def"
Apr 24 23:35:55.465246 containerd[1462]: time="2026-04-24T23:35:55.465185280Z" level=error msg="Failed to destroy network for sandbox \"b6969e3808d65953a8a46a1c82f16b6fe7a046267abad766b3fa13d9d36f41ef\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Apr 24 23:35:55.465626 containerd[1462]: time="2026-04-24T23:35:55.465590187Z" level=error msg="encountered an error cleaning up failed sandbox \"b6969e3808d65953a8a46a1c82f16b6fe7a046267abad766b3fa13d9d36f41ef\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Apr 24 23:35:55.465688 containerd[1462]: time="2026-04-24T23:35:55.465649957Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-ccf7f6874-6drmn,Uid:6dca57dd-795d-4761-9e86-5a637e03b75c,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"b6969e3808d65953a8a46a1c82f16b6fe7a046267abad766b3fa13d9d36f41ef\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Apr 24 23:35:55.466061 kubelet[2581]: E0424 23:35:55.465996 2581 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b6969e3808d65953a8a46a1c82f16b6fe7a046267abad766b3fa13d9d36f41ef\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Apr 24 23:35:55.466613 kubelet[2581]: E0424 23:35:55.466070 2581 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b6969e3808d65953a8a46a1c82f16b6fe7a046267abad766b3fa13d9d36f41ef\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-ccf7f6874-6drmn"
Apr 24 23:35:55.466613 kubelet[2581]: E0424 23:35:55.466088 2581 kuberuntime_manager.go:1558] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b6969e3808d65953a8a46a1c82f16b6fe7a046267abad766b3fa13d9d36f41ef\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-ccf7f6874-6drmn"
Apr 24 23:35:55.466613 kubelet[2581]: E0424 23:35:55.466189 2581 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-ccf7f6874-6drmn_calico-system(6dca57dd-795d-4761-9e86-5a637e03b75c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-ccf7f6874-6drmn_calico-system(6dca57dd-795d-4761-9e86-5a637e03b75c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"b6969e3808d65953a8a46a1c82f16b6fe7a046267abad766b3fa13d9d36f41ef\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-apiserver-ccf7f6874-6drmn" podUID="6dca57dd-795d-4761-9e86-5a637e03b75c"
Apr 24 23:35:55.476095 containerd[1462]: time="2026-04-24T23:35:55.475953193Z" level=error msg="Failed to destroy network for sandbox \"9b6dd7052ebeffaaba1376421670ace1fee06ecc6705ec848ee5684f1ca0855d\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Apr 24 23:35:55.477189 containerd[1462]: time="2026-04-24T23:35:55.477098104Z" level=error msg="encountered an error cleaning up failed sandbox \"9b6dd7052ebeffaaba1376421670ace1fee06ecc6705ec848ee5684f1ca0855d\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Apr 24 23:35:55.478034 containerd[1462]: time="2026-04-24T23:35:55.477909199Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-9f7667bb8-9xhf6,Uid:a67a5a23-28c6-44c6-8e44-b199e7391183,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"9b6dd7052ebeffaaba1376421670ace1fee06ecc6705ec848ee5684f1ca0855d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Apr 24 23:35:55.478752 kubelet[2581]: E0424 23:35:55.478166 2581 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9b6dd7052ebeffaaba1376421670ace1fee06ecc6705ec848ee5684f1ca0855d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Apr 24 23:35:55.478752 kubelet[2581]: E0424 23:35:55.478219 2581 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9b6dd7052ebeffaaba1376421670ace1fee06ecc6705ec848ee5684f1ca0855d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-9f7667bb8-9xhf6"
Apr 24 23:35:55.478752 kubelet[2581]: E0424 23:35:55.478238 2581 kuberuntime_manager.go:1558] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9b6dd7052ebeffaaba1376421670ace1fee06ecc6705ec848ee5684f1ca0855d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-9f7667bb8-9xhf6"
Apr 24 23:35:55.478902 kubelet[2581]: E0424 23:35:55.478286 2581 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-9f7667bb8-9xhf6_calico-system(a67a5a23-28c6-44c6-8e44-b199e7391183)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-9f7667bb8-9xhf6_calico-system(a67a5a23-28c6-44c6-8e44-b199e7391183)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"9b6dd7052ebeffaaba1376421670ace1fee06ecc6705ec848ee5684f1ca0855d\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-9f7667bb8-9xhf6" podUID="a67a5a23-28c6-44c6-8e44-b199e7391183"
Apr 24 23:35:55.528356 systemd[1]: Started cri-containerd-64365ef6c453321921fee54c1badc759e4738c4a10e94e6e0b6985e1b09095c9.scope - libcontainer container 64365ef6c453321921fee54c1badc759e4738c4a10e94e6e0b6985e1b09095c9.
Apr 24 23:35:55.574273 containerd[1462]: time="2026-04-24T23:35:55.574137151Z" level=error msg="Failed to destroy network for sandbox \"a90bb52d53ba8c8ddab1d42c96b03c66cacab89a862d8f7f06d55932dea64798\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Apr 24 23:35:55.576621 containerd[1462]: time="2026-04-24T23:35:55.576411290Z" level=error msg="encountered an error cleaning up failed sandbox \"a90bb52d53ba8c8ddab1d42c96b03c66cacab89a862d8f7f06d55932dea64798\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Apr 24 23:35:55.576621 containerd[1462]: time="2026-04-24T23:35:55.576485342Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7d4c795b57-vskxf,Uid:318078f6-c2fc-4219-8c64-82b649d6416b,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"a90bb52d53ba8c8ddab1d42c96b03c66cacab89a862d8f7f06d55932dea64798\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Apr 24 23:35:55.576817 kubelet[2581]: E0424 23:35:55.576702 2581 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a90bb52d53ba8c8ddab1d42c96b03c66cacab89a862d8f7f06d55932dea64798\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Apr 24 23:35:55.576817 kubelet[2581]: E0424 23:35:55.576752 2581 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a90bb52d53ba8c8ddab1d42c96b03c66cacab89a862d8f7f06d55932dea64798\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-7d4c795b57-vskxf"
Apr 24 23:35:55.576817 kubelet[2581]: E0424 23:35:55.576772 2581 kuberuntime_manager.go:1558] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a90bb52d53ba8c8ddab1d42c96b03c66cacab89a862d8f7f06d55932dea64798\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-7d4c795b57-vskxf"
Apr 24 23:35:55.576957 kubelet[2581]: E0424 23:35:55.576821 2581 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-7d4c795b57-vskxf_calico-system(318078f6-c2fc-4219-8c64-82b649d6416b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-7d4c795b57-vskxf_calico-system(318078f6-c2fc-4219-8c64-82b649d6416b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"a90bb52d53ba8c8ddab1d42c96b03c66cacab89a862d8f7f06d55932dea64798\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-7d4c795b57-vskxf" podUID="318078f6-c2fc-4219-8c64-82b649d6416b"
Apr 24 23:35:55.586808 containerd[1462]: time="2026-04-24T23:35:55.586671599Z" level=error msg="Failed to destroy network for sandbox \"7b47374f6436f5ea771b90d2ddfeb211e720a1d905e1bc8f12ca59bf7d9d97e7\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Apr 24 23:35:55.587371 containerd[1462]: time="2026-04-24T23:35:55.587246175Z" level=error msg="encountered an error cleaning up failed sandbox \"7b47374f6436f5ea771b90d2ddfeb211e720a1d905e1bc8f12ca59bf7d9d97e7\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Apr 24 23:35:55.587371 containerd[1462]: time="2026-04-24T23:35:55.587309426Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7d764666f9-l2x8h,Uid:7e35c05e-9f62-47d8-9c32-7f4c38496c65,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"7b47374f6436f5ea771b90d2ddfeb211e720a1d905e1bc8f12ca59bf7d9d97e7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Apr 24 23:35:55.587920 kubelet[2581]: E0424 23:35:55.587502 2581 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7b47374f6436f5ea771b90d2ddfeb211e720a1d905e1bc8f12ca59bf7d9d97e7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Apr 24 23:35:55.587920 kubelet[2581]: E0424 23:35:55.587562 2581 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7b47374f6436f5ea771b90d2ddfeb211e720a1d905e1bc8f12ca59bf7d9d97e7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7d764666f9-l2x8h"
Apr 24 23:35:55.587920 kubelet[2581]: E0424 23:35:55.587580 2581 kuberuntime_manager.go:1558] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7b47374f6436f5ea771b90d2ddfeb211e720a1d905e1bc8f12ca59bf7d9d97e7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7d764666f9-l2x8h"
Apr 24 23:35:55.588172 kubelet[2581]: E0424 23:35:55.587628 2581 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7d764666f9-l2x8h_kube-system(7e35c05e-9f62-47d8-9c32-7f4c38496c65)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7d764666f9-l2x8h_kube-system(7e35c05e-9f62-47d8-9c32-7f4c38496c65)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"7b47374f6436f5ea771b90d2ddfeb211e720a1d905e1bc8f12ca59bf7d9d97e7\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7d764666f9-l2x8h" podUID="7e35c05e-9f62-47d8-9c32-7f4c38496c65"
Apr 24 23:35:55.593931 containerd[1462]: time="2026-04-24T23:35:55.593867358Z" level=info msg="StartContainer for \"64365ef6c453321921fee54c1badc759e4738c4a10e94e6e0b6985e1b09095c9\" returns successfully"
Apr 24 23:35:55.623145 containerd[1462]: time="2026-04-24T23:35:55.623037778Z" level=error msg="Failed to destroy network for sandbox \"1554c7889caa25fe939ea72e64cd14f57148d1aafeb01bde86f7a6e9f10b0b23\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Apr 24 23:35:55.624776 containerd[1462]: time="2026-04-24T23:35:55.623387876Z" level=error msg="encountered an error cleaning up failed sandbox \"1554c7889caa25fe939ea72e64cd14f57148d1aafeb01bde86f7a6e9f10b0b23\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Apr 24 23:35:55.624776 containerd[1462]: time="2026-04-24T23:35:55.623448446Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-54d98b476c-cxbz2,Uid:ad82956a-a9d5-482d-9feb-ead05df18123,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"1554c7889caa25fe939ea72e64cd14f57148d1aafeb01bde86f7a6e9f10b0b23\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Apr 24 23:35:55.624977 kubelet[2581]: E0424 23:35:55.624327 2581 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1554c7889caa25fe939ea72e64cd14f57148d1aafeb01bde86f7a6e9f10b0b23\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Apr 24 23:35:55.624977 kubelet[2581]: E0424 23:35:55.624397 2581 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1554c7889caa25fe939ea72e64cd14f57148d1aafeb01bde86f7a6e9f10b0b23\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-54d98b476c-cxbz2"
Apr 24 23:35:55.624977 kubelet[2581]: E0424 23:35:55.624416 2581 kuberuntime_manager.go:1558] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1554c7889caa25fe939ea72e64cd14f57148d1aafeb01bde86f7a6e9f10b0b23\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-54d98b476c-cxbz2"
Apr 24 23:35:55.625071 kubelet[2581]: E0424 23:35:55.624496 2581 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-54d98b476c-cxbz2_calico-system(ad82956a-a9d5-482d-9feb-ead05df18123)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-54d98b476c-cxbz2_calico-system(ad82956a-a9d5-482d-9feb-ead05df18123)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"1554c7889caa25fe939ea72e64cd14f57148d1aafeb01bde86f7a6e9f10b0b23\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-54d98b476c-cxbz2" podUID="ad82956a-a9d5-482d-9feb-ead05df18123"
Apr 24 23:35:55.631497 containerd[1462]: time="2026-04-24T23:35:55.631398931Z" level=error msg="Failed to destroy network for sandbox \"aed60164462ff4aad3d1ccc2bb37b5c27a9225c8b42a7b6bae829862ba0d12bd\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Apr 24 23:35:55.632681 containerd[1462]: time="2026-04-24T23:35:55.632565925Z" level=error msg="encountered an error cleaning up failed sandbox \"aed60164462ff4aad3d1ccc2bb37b5c27a9225c8b42a7b6bae829862ba0d12bd\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Apr 24 23:35:55.632681 containerd[1462]: time="2026-04-24T23:35:55.632636977Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-ccf7f6874-fxzbx,Uid:97bad9ab-01a6-4cd6-89bf-ca7dc9310125,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"aed60164462ff4aad3d1ccc2bb37b5c27a9225c8b42a7b6bae829862ba0d12bd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Apr 24 23:35:55.633528 kubelet[2581]: E0424 23:35:55.633026 2581 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"aed60164462ff4aad3d1ccc2bb37b5c27a9225c8b42a7b6bae829862ba0d12bd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Apr 24 23:35:55.633528 kubelet[2581]: E0424 23:35:55.633073 2581 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"aed60164462ff4aad3d1ccc2bb37b5c27a9225c8b42a7b6bae829862ba0d12bd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-ccf7f6874-fxzbx"
Apr 24 23:35:55.633528 kubelet[2581]: E0424 23:35:55.633097 2581 kuberuntime_manager.go:1558] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"aed60164462ff4aad3d1ccc2bb37b5c27a9225c8b42a7b6bae829862ba0d12bd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
pod="calico-system/calico-apiserver-ccf7f6874-fxzbx" Apr 24 23:35:55.633626 kubelet[2581]: E0424 23:35:55.633160 2581 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-ccf7f6874-fxzbx_calico-system(97bad9ab-01a6-4cd6-89bf-ca7dc9310125)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-ccf7f6874-fxzbx_calico-system(97bad9ab-01a6-4cd6-89bf-ca7dc9310125)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"aed60164462ff4aad3d1ccc2bb37b5c27a9225c8b42a7b6bae829862ba0d12bd\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-apiserver-ccf7f6874-fxzbx" podUID="97bad9ab-01a6-4cd6-89bf-ca7dc9310125" Apr 24 23:35:56.125646 systemd[1]: Created slice kubepods-besteffort-pod31ae9065_a5e7_49a3_95e7_0d8d7d4e7efa.slice - libcontainer container kubepods-besteffort-pod31ae9065_a5e7_49a3_95e7_0d8d7d4e7efa.slice. 
Apr 24 23:35:56.132360 containerd[1462]: time="2026-04-24T23:35:56.132300464Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-drdkn,Uid:31ae9065-a5e7-49a3-95e7-0d8d7d4e7efa,Namespace:calico-system,Attempt:0,}" Apr 24 23:35:56.276958 kubelet[2581]: I0424 23:35:56.276793 2581 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aed60164462ff4aad3d1ccc2bb37b5c27a9225c8b42a7b6bae829862ba0d12bd" Apr 24 23:35:56.278255 containerd[1462]: time="2026-04-24T23:35:56.278186811Z" level=info msg="StopPodSandbox for \"aed60164462ff4aad3d1ccc2bb37b5c27a9225c8b42a7b6bae829862ba0d12bd\"" Apr 24 23:35:56.281789 containerd[1462]: time="2026-04-24T23:35:56.280845959Z" level=info msg="Ensure that sandbox aed60164462ff4aad3d1ccc2bb37b5c27a9225c8b42a7b6bae829862ba0d12bd in task-service has been cleanup successfully" Apr 24 23:35:56.282360 kubelet[2581]: I0424 23:35:56.281238 2581 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7b47374f6436f5ea771b90d2ddfeb211e720a1d905e1bc8f12ca59bf7d9d97e7" Apr 24 23:35:56.282418 containerd[1462]: time="2026-04-24T23:35:56.282191255Z" level=info msg="StopPodSandbox for \"7b47374f6436f5ea771b90d2ddfeb211e720a1d905e1bc8f12ca59bf7d9d97e7\"" Apr 24 23:35:56.282602 containerd[1462]: time="2026-04-24T23:35:56.282561115Z" level=info msg="Ensure that sandbox 7b47374f6436f5ea771b90d2ddfeb211e720a1d905e1bc8f12ca59bf7d9d97e7 in task-service has been cleanup successfully" Apr 24 23:35:56.288154 kubelet[2581]: I0424 23:35:56.287517 2581 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b6dd7052ebeffaaba1376421670ace1fee06ecc6705ec848ee5684f1ca0855d" Apr 24 23:35:56.293039 containerd[1462]: time="2026-04-24T23:35:56.292724630Z" level=info msg="StopPodSandbox for \"9b6dd7052ebeffaaba1376421670ace1fee06ecc6705ec848ee5684f1ca0855d\"" Apr 24 23:35:56.295465 containerd[1462]: time="2026-04-24T23:35:56.295413622Z" level=info msg="Ensure that 
sandbox 9b6dd7052ebeffaaba1376421670ace1fee06ecc6705ec848ee5684f1ca0855d in task-service has been cleanup successfully" Apr 24 23:35:56.335699 kubelet[2581]: I0424 23:35:56.335652 2581 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1554c7889caa25fe939ea72e64cd14f57148d1aafeb01bde86f7a6e9f10b0b23" Apr 24 23:35:56.339205 containerd[1462]: time="2026-04-24T23:35:56.339157619Z" level=info msg="StopPodSandbox for \"1554c7889caa25fe939ea72e64cd14f57148d1aafeb01bde86f7a6e9f10b0b23\"" Apr 24 23:35:56.339753 containerd[1462]: time="2026-04-24T23:35:56.339715748Z" level=info msg="Ensure that sandbox 1554c7889caa25fe939ea72e64cd14f57148d1aafeb01bde86f7a6e9f10b0b23 in task-service has been cleanup successfully" Apr 24 23:35:56.340754 kubelet[2581]: I0424 23:35:56.340730 2581 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a90bb52d53ba8c8ddab1d42c96b03c66cacab89a862d8f7f06d55932dea64798" Apr 24 23:35:56.341138 containerd[1462]: time="2026-04-24T23:35:56.341096411Z" level=info msg="StopPodSandbox for \"a90bb52d53ba8c8ddab1d42c96b03c66cacab89a862d8f7f06d55932dea64798\"" Apr 24 23:35:56.341391 containerd[1462]: time="2026-04-24T23:35:56.341370495Z" level=info msg="Ensure that sandbox a90bb52d53ba8c8ddab1d42c96b03c66cacab89a862d8f7f06d55932dea64798 in task-service has been cleanup successfully" Apr 24 23:35:56.345153 kubelet[2581]: I0424 23:35:56.345111 2581 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b6969e3808d65953a8a46a1c82f16b6fe7a046267abad766b3fa13d9d36f41ef" Apr 24 23:35:56.347349 containerd[1462]: time="2026-04-24T23:35:56.347312210Z" level=info msg="StopPodSandbox for \"b6969e3808d65953a8a46a1c82f16b6fe7a046267abad766b3fa13d9d36f41ef\"" Apr 24 23:35:56.347738 containerd[1462]: time="2026-04-24T23:35:56.347718396Z" level=info msg="Ensure that sandbox b6969e3808d65953a8a46a1c82f16b6fe7a046267abad766b3fa13d9d36f41ef in task-service has been cleanup successfully" 
Apr 24 23:35:56.359576 kubelet[2581]: I0424 23:35:56.358785 2581 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="de9656bfe97f83441058c328d0d39fed1ee166d4fbc015e3ce2c151595256b72" Apr 24 23:35:56.359944 containerd[1462]: time="2026-04-24T23:35:56.359914398Z" level=info msg="StopPodSandbox for \"de9656bfe97f83441058c328d0d39fed1ee166d4fbc015e3ce2c151595256b72\"" Apr 24 23:35:56.360428 containerd[1462]: time="2026-04-24T23:35:56.360403236Z" level=info msg="Ensure that sandbox de9656bfe97f83441058c328d0d39fed1ee166d4fbc015e3ce2c151595256b72 in task-service has been cleanup successfully" Apr 24 23:35:56.413751 kubelet[2581]: I0424 23:35:56.412942 2581 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="calico-system/calico-node-ldsmt" podStartSLOduration=1.734445369 podStartE2EDuration="16.412926605s" podCreationTimestamp="2026-04-24 23:35:40 +0000 UTC" firstStartedPulling="2026-04-24 23:35:40.594960259 +0000 UTC m=+20.638518575" lastFinishedPulling="2026-04-24 23:35:55.273441495 +0000 UTC m=+35.316999811" observedRunningTime="2026-04-24 23:35:56.381451782 +0000 UTC m=+36.425010098" watchObservedRunningTime="2026-04-24 23:35:56.412926605 +0000 UTC m=+36.456484881" Apr 24 23:35:56.426209 systemd-networkd[1386]: cali3696774a504: Link UP Apr 24 23:35:56.426500 systemd-networkd[1386]: cali3696774a504: Gained carrier Apr 24 23:35:56.491402 containerd[1462]: 2026-04-24 23:35:56.183 [ERROR][3744] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Apr 24 23:35:56.491402 containerd[1462]: 2026-04-24 23:35:56.208 [INFO][3744] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081--3--6--n--0494d1f24d-k8s-csi--node--driver--drdkn-eth0 csi-node-driver- calico-system 31ae9065-a5e7-49a3-95e7-0d8d7d4e7efa 731 0 2026-04-24 23:35:40 +0000 UTC 
map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:589b8b8d94 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4081-3-6-n-0494d1f24d csi-node-driver-drdkn eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali3696774a504 [] [] }} ContainerID="4180f624404d2f5709af7d3e0abb4ff432671809d911d57eb7f919c6d52a2a29" Namespace="calico-system" Pod="csi-node-driver-drdkn" WorkloadEndpoint="ci--4081--3--6--n--0494d1f24d-k8s-csi--node--driver--drdkn-" Apr 24 23:35:56.491402 containerd[1462]: 2026-04-24 23:35:56.208 [INFO][3744] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="4180f624404d2f5709af7d3e0abb4ff432671809d911d57eb7f919c6d52a2a29" Namespace="calico-system" Pod="csi-node-driver-drdkn" WorkloadEndpoint="ci--4081--3--6--n--0494d1f24d-k8s-csi--node--driver--drdkn-eth0" Apr 24 23:35:56.491402 containerd[1462]: 2026-04-24 23:35:56.255 [INFO][3752] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="4180f624404d2f5709af7d3e0abb4ff432671809d911d57eb7f919c6d52a2a29" HandleID="k8s-pod-network.4180f624404d2f5709af7d3e0abb4ff432671809d911d57eb7f919c6d52a2a29" Workload="ci--4081--3--6--n--0494d1f24d-k8s-csi--node--driver--drdkn-eth0" Apr 24 23:35:56.491402 containerd[1462]: 2026-04-24 23:35:56.270 [INFO][3752] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="4180f624404d2f5709af7d3e0abb4ff432671809d911d57eb7f919c6d52a2a29" HandleID="k8s-pod-network.4180f624404d2f5709af7d3e0abb4ff432671809d911d57eb7f919c6d52a2a29" Workload="ci--4081--3--6--n--0494d1f24d-k8s-csi--node--driver--drdkn-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002734a0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4081-3-6-n-0494d1f24d", "pod":"csi-node-driver-drdkn", "timestamp":"2026-04-24 
23:35:56.25567427 +0000 UTC"}, Hostname:"ci-4081-3-6-n-0494d1f24d", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x400038f080)} Apr 24 23:35:56.491402 containerd[1462]: 2026-04-24 23:35:56.270 [INFO][3752] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 24 23:35:56.491402 containerd[1462]: 2026-04-24 23:35:56.270 [INFO][3752] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 24 23:35:56.491402 containerd[1462]: 2026-04-24 23:35:56.270 [INFO][3752] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081-3-6-n-0494d1f24d' Apr 24 23:35:56.491402 containerd[1462]: 2026-04-24 23:35:56.274 [INFO][3752] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.4180f624404d2f5709af7d3e0abb4ff432671809d911d57eb7f919c6d52a2a29" host="ci-4081-3-6-n-0494d1f24d" Apr 24 23:35:56.491402 containerd[1462]: 2026-04-24 23:35:56.284 [INFO][3752] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4081-3-6-n-0494d1f24d" Apr 24 23:35:56.491402 containerd[1462]: 2026-04-24 23:35:56.302 [INFO][3752] ipam/ipam.go 526: Trying affinity for 192.168.74.64/26 host="ci-4081-3-6-n-0494d1f24d" Apr 24 23:35:56.491402 containerd[1462]: 2026-04-24 23:35:56.306 [INFO][3752] ipam/ipam.go 160: Attempting to load block cidr=192.168.74.64/26 host="ci-4081-3-6-n-0494d1f24d" Apr 24 23:35:56.491402 containerd[1462]: 2026-04-24 23:35:56.311 [INFO][3752] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.74.64/26 host="ci-4081-3-6-n-0494d1f24d" Apr 24 23:35:56.491402 containerd[1462]: 2026-04-24 23:35:56.311 [INFO][3752] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.74.64/26 handle="k8s-pod-network.4180f624404d2f5709af7d3e0abb4ff432671809d911d57eb7f919c6d52a2a29" host="ci-4081-3-6-n-0494d1f24d" 
Apr 24 23:35:56.491402 containerd[1462]: 2026-04-24 23:35:56.314 [INFO][3752] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.4180f624404d2f5709af7d3e0abb4ff432671809d911d57eb7f919c6d52a2a29 Apr 24 23:35:56.491402 containerd[1462]: 2026-04-24 23:35:56.328 [INFO][3752] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.74.64/26 handle="k8s-pod-network.4180f624404d2f5709af7d3e0abb4ff432671809d911d57eb7f919c6d52a2a29" host="ci-4081-3-6-n-0494d1f24d" Apr 24 23:35:56.491402 containerd[1462]: 2026-04-24 23:35:56.358 [INFO][3752] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.74.65/26] block=192.168.74.64/26 handle="k8s-pod-network.4180f624404d2f5709af7d3e0abb4ff432671809d911d57eb7f919c6d52a2a29" host="ci-4081-3-6-n-0494d1f24d" Apr 24 23:35:56.491402 containerd[1462]: 2026-04-24 23:35:56.358 [INFO][3752] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.74.65/26] handle="k8s-pod-network.4180f624404d2f5709af7d3e0abb4ff432671809d911d57eb7f919c6d52a2a29" host="ci-4081-3-6-n-0494d1f24d" Apr 24 23:35:56.491402 containerd[1462]: 2026-04-24 23:35:56.359 [INFO][3752] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. 
Apr 24 23:35:56.491402 containerd[1462]: 2026-04-24 23:35:56.359 [INFO][3752] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.74.65/26] IPv6=[] ContainerID="4180f624404d2f5709af7d3e0abb4ff432671809d911d57eb7f919c6d52a2a29" HandleID="k8s-pod-network.4180f624404d2f5709af7d3e0abb4ff432671809d911d57eb7f919c6d52a2a29" Workload="ci--4081--3--6--n--0494d1f24d-k8s-csi--node--driver--drdkn-eth0" Apr 24 23:35:56.492263 containerd[1462]: 2026-04-24 23:35:56.397 [INFO][3744] cni-plugin/k8s.go 418: Populated endpoint ContainerID="4180f624404d2f5709af7d3e0abb4ff432671809d911d57eb7f919c6d52a2a29" Namespace="calico-system" Pod="csi-node-driver-drdkn" WorkloadEndpoint="ci--4081--3--6--n--0494d1f24d-k8s-csi--node--driver--drdkn-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--0494d1f24d-k8s-csi--node--driver--drdkn-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"31ae9065-a5e7-49a3-95e7-0d8d7d4e7efa", ResourceVersion:"731", Generation:0, CreationTimestamp:time.Date(2026, time.April, 24, 23, 35, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"589b8b8d94", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-0494d1f24d", ContainerID:"", Pod:"csi-node-driver-drdkn", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.74.65/32"}, 
IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali3696774a504", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 24 23:35:56.492263 containerd[1462]: 2026-04-24 23:35:56.397 [INFO][3744] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.74.65/32] ContainerID="4180f624404d2f5709af7d3e0abb4ff432671809d911d57eb7f919c6d52a2a29" Namespace="calico-system" Pod="csi-node-driver-drdkn" WorkloadEndpoint="ci--4081--3--6--n--0494d1f24d-k8s-csi--node--driver--drdkn-eth0" Apr 24 23:35:56.492263 containerd[1462]: 2026-04-24 23:35:56.397 [INFO][3744] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali3696774a504 ContainerID="4180f624404d2f5709af7d3e0abb4ff432671809d911d57eb7f919c6d52a2a29" Namespace="calico-system" Pod="csi-node-driver-drdkn" WorkloadEndpoint="ci--4081--3--6--n--0494d1f24d-k8s-csi--node--driver--drdkn-eth0" Apr 24 23:35:56.492263 containerd[1462]: 2026-04-24 23:35:56.435 [INFO][3744] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="4180f624404d2f5709af7d3e0abb4ff432671809d911d57eb7f919c6d52a2a29" Namespace="calico-system" Pod="csi-node-driver-drdkn" WorkloadEndpoint="ci--4081--3--6--n--0494d1f24d-k8s-csi--node--driver--drdkn-eth0" Apr 24 23:35:56.492263 containerd[1462]: 2026-04-24 23:35:56.439 [INFO][3744] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="4180f624404d2f5709af7d3e0abb4ff432671809d911d57eb7f919c6d52a2a29" Namespace="calico-system" Pod="csi-node-driver-drdkn" WorkloadEndpoint="ci--4081--3--6--n--0494d1f24d-k8s-csi--node--driver--drdkn-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--0494d1f24d-k8s-csi--node--driver--drdkn-eth0", 
GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"31ae9065-a5e7-49a3-95e7-0d8d7d4e7efa", ResourceVersion:"731", Generation:0, CreationTimestamp:time.Date(2026, time.April, 24, 23, 35, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"589b8b8d94", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-0494d1f24d", ContainerID:"4180f624404d2f5709af7d3e0abb4ff432671809d911d57eb7f919c6d52a2a29", Pod:"csi-node-driver-drdkn", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.74.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali3696774a504", MAC:"4a:31:de:3f:0f:79", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 24 23:35:56.492263 containerd[1462]: 2026-04-24 23:35:56.476 [INFO][3744] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="4180f624404d2f5709af7d3e0abb4ff432671809d911d57eb7f919c6d52a2a29" Namespace="calico-system" Pod="csi-node-driver-drdkn" WorkloadEndpoint="ci--4081--3--6--n--0494d1f24d-k8s-csi--node--driver--drdkn-eth0" Apr 24 23:35:56.630645 containerd[1462]: time="2026-04-24T23:35:56.627968596Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Apr 24 23:35:56.630645 containerd[1462]: time="2026-04-24T23:35:56.628026085Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Apr 24 23:35:56.630645 containerd[1462]: time="2026-04-24T23:35:56.628049449Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 24 23:35:56.630645 containerd[1462]: time="2026-04-24T23:35:56.628143624Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 24 23:35:56.696343 systemd[1]: Started cri-containerd-4180f624404d2f5709af7d3e0abb4ff432671809d911d57eb7f919c6d52a2a29.scope - libcontainer container 4180f624404d2f5709af7d3e0abb4ff432671809d911d57eb7f919c6d52a2a29. Apr 24 23:35:56.722445 containerd[1462]: 2026-04-24 23:35:56.464 [INFO][3779] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="7b47374f6436f5ea771b90d2ddfeb211e720a1d905e1bc8f12ca59bf7d9d97e7" Apr 24 23:35:56.722445 containerd[1462]: 2026-04-24 23:35:56.464 [INFO][3779] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="7b47374f6436f5ea771b90d2ddfeb211e720a1d905e1bc8f12ca59bf7d9d97e7" iface="eth0" netns="/var/run/netns/cni-16febbd9-309e-eb67-ac93-d2e63782c33e" Apr 24 23:35:56.722445 containerd[1462]: 2026-04-24 23:35:56.465 [INFO][3779] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="7b47374f6436f5ea771b90d2ddfeb211e720a1d905e1bc8f12ca59bf7d9d97e7" iface="eth0" netns="/var/run/netns/cni-16febbd9-309e-eb67-ac93-d2e63782c33e" Apr 24 23:35:56.722445 containerd[1462]: 2026-04-24 23:35:56.465 [INFO][3779] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="7b47374f6436f5ea771b90d2ddfeb211e720a1d905e1bc8f12ca59bf7d9d97e7" iface="eth0" netns="/var/run/netns/cni-16febbd9-309e-eb67-ac93-d2e63782c33e" Apr 24 23:35:56.722445 containerd[1462]: 2026-04-24 23:35:56.465 [INFO][3779] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="7b47374f6436f5ea771b90d2ddfeb211e720a1d905e1bc8f12ca59bf7d9d97e7" Apr 24 23:35:56.722445 containerd[1462]: 2026-04-24 23:35:56.465 [INFO][3779] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="7b47374f6436f5ea771b90d2ddfeb211e720a1d905e1bc8f12ca59bf7d9d97e7" Apr 24 23:35:56.722445 containerd[1462]: 2026-04-24 23:35:56.612 [INFO][3861] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="7b47374f6436f5ea771b90d2ddfeb211e720a1d905e1bc8f12ca59bf7d9d97e7" HandleID="k8s-pod-network.7b47374f6436f5ea771b90d2ddfeb211e720a1d905e1bc8f12ca59bf7d9d97e7" Workload="ci--4081--3--6--n--0494d1f24d-k8s-coredns--7d764666f9--l2x8h-eth0" Apr 24 23:35:56.722445 containerd[1462]: 2026-04-24 23:35:56.612 [INFO][3861] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 24 23:35:56.722445 containerd[1462]: 2026-04-24 23:35:56.612 [INFO][3861] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 24 23:35:56.722445 containerd[1462]: 2026-04-24 23:35:56.693 [WARNING][3861] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="7b47374f6436f5ea771b90d2ddfeb211e720a1d905e1bc8f12ca59bf7d9d97e7" HandleID="k8s-pod-network.7b47374f6436f5ea771b90d2ddfeb211e720a1d905e1bc8f12ca59bf7d9d97e7" Workload="ci--4081--3--6--n--0494d1f24d-k8s-coredns--7d764666f9--l2x8h-eth0" Apr 24 23:35:56.722445 containerd[1462]: 2026-04-24 23:35:56.693 [INFO][3861] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="7b47374f6436f5ea771b90d2ddfeb211e720a1d905e1bc8f12ca59bf7d9d97e7" HandleID="k8s-pod-network.7b47374f6436f5ea771b90d2ddfeb211e720a1d905e1bc8f12ca59bf7d9d97e7" Workload="ci--4081--3--6--n--0494d1f24d-k8s-coredns--7d764666f9--l2x8h-eth0" Apr 24 23:35:56.722445 containerd[1462]: 2026-04-24 23:35:56.700 [INFO][3861] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 24 23:35:56.722445 containerd[1462]: 2026-04-24 23:35:56.715 [INFO][3779] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="7b47374f6436f5ea771b90d2ddfeb211e720a1d905e1bc8f12ca59bf7d9d97e7" Apr 24 23:35:56.730495 containerd[1462]: time="2026-04-24T23:35:56.728454160Z" level=info msg="TearDown network for sandbox \"7b47374f6436f5ea771b90d2ddfeb211e720a1d905e1bc8f12ca59bf7d9d97e7\" successfully" Apr 24 23:35:56.730495 containerd[1462]: time="2026-04-24T23:35:56.728502168Z" level=info msg="StopPodSandbox for \"7b47374f6436f5ea771b90d2ddfeb211e720a1d905e1bc8f12ca59bf7d9d97e7\" returns successfully" Apr 24 23:35:56.729379 systemd[1]: run-netns-cni\x2d16febbd9\x2d309e\x2deb67\x2dac93\x2dd2e63782c33e.mount: Deactivated successfully. 
Apr 24 23:35:56.743098 containerd[1462]: time="2026-04-24T23:35:56.742679528Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7d764666f9-l2x8h,Uid:7e35c05e-9f62-47d8-9c32-7f4c38496c65,Namespace:kube-system,Attempt:1,}" Apr 24 23:35:56.786219 containerd[1462]: 2026-04-24 23:35:56.539 [INFO][3778] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="aed60164462ff4aad3d1ccc2bb37b5c27a9225c8b42a7b6bae829862ba0d12bd" Apr 24 23:35:56.786219 containerd[1462]: 2026-04-24 23:35:56.540 [INFO][3778] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="aed60164462ff4aad3d1ccc2bb37b5c27a9225c8b42a7b6bae829862ba0d12bd" iface="eth0" netns="/var/run/netns/cni-a444dbaf-1d7a-63cb-37fa-901ebd0c7235" Apr 24 23:35:56.786219 containerd[1462]: 2026-04-24 23:35:56.540 [INFO][3778] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="aed60164462ff4aad3d1ccc2bb37b5c27a9225c8b42a7b6bae829862ba0d12bd" iface="eth0" netns="/var/run/netns/cni-a444dbaf-1d7a-63cb-37fa-901ebd0c7235" Apr 24 23:35:56.786219 containerd[1462]: 2026-04-24 23:35:56.541 [INFO][3778] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="aed60164462ff4aad3d1ccc2bb37b5c27a9225c8b42a7b6bae829862ba0d12bd" iface="eth0" netns="/var/run/netns/cni-a444dbaf-1d7a-63cb-37fa-901ebd0c7235" Apr 24 23:35:56.786219 containerd[1462]: 2026-04-24 23:35:56.541 [INFO][3778] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="aed60164462ff4aad3d1ccc2bb37b5c27a9225c8b42a7b6bae829862ba0d12bd" Apr 24 23:35:56.786219 containerd[1462]: 2026-04-24 23:35:56.541 [INFO][3778] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="aed60164462ff4aad3d1ccc2bb37b5c27a9225c8b42a7b6bae829862ba0d12bd" Apr 24 23:35:56.786219 containerd[1462]: 2026-04-24 23:35:56.734 [INFO][3877] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="aed60164462ff4aad3d1ccc2bb37b5c27a9225c8b42a7b6bae829862ba0d12bd" HandleID="k8s-pod-network.aed60164462ff4aad3d1ccc2bb37b5c27a9225c8b42a7b6bae829862ba0d12bd" Workload="ci--4081--3--6--n--0494d1f24d-k8s-calico--apiserver--ccf7f6874--fxzbx-eth0" Apr 24 23:35:56.786219 containerd[1462]: 2026-04-24 23:35:56.740 [INFO][3877] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 24 23:35:56.786219 containerd[1462]: 2026-04-24 23:35:56.740 [INFO][3877] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 24 23:35:56.786219 containerd[1462]: 2026-04-24 23:35:56.765 [WARNING][3877] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="aed60164462ff4aad3d1ccc2bb37b5c27a9225c8b42a7b6bae829862ba0d12bd" HandleID="k8s-pod-network.aed60164462ff4aad3d1ccc2bb37b5c27a9225c8b42a7b6bae829862ba0d12bd" Workload="ci--4081--3--6--n--0494d1f24d-k8s-calico--apiserver--ccf7f6874--fxzbx-eth0" Apr 24 23:35:56.786219 containerd[1462]: 2026-04-24 23:35:56.765 [INFO][3877] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="aed60164462ff4aad3d1ccc2bb37b5c27a9225c8b42a7b6bae829862ba0d12bd" HandleID="k8s-pod-network.aed60164462ff4aad3d1ccc2bb37b5c27a9225c8b42a7b6bae829862ba0d12bd" Workload="ci--4081--3--6--n--0494d1f24d-k8s-calico--apiserver--ccf7f6874--fxzbx-eth0" Apr 24 23:35:56.786219 containerd[1462]: 2026-04-24 23:35:56.773 [INFO][3877] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 24 23:35:56.786219 containerd[1462]: 2026-04-24 23:35:56.782 [INFO][3778] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="aed60164462ff4aad3d1ccc2bb37b5c27a9225c8b42a7b6bae829862ba0d12bd" Apr 24 23:35:56.787425 containerd[1462]: time="2026-04-24T23:35:56.787393961Z" level=info msg="TearDown network for sandbox \"aed60164462ff4aad3d1ccc2bb37b5c27a9225c8b42a7b6bae829862ba0d12bd\" successfully" Apr 24 23:35:56.790738 containerd[1462]: time="2026-04-24T23:35:56.790676049Z" level=info msg="StopPodSandbox for \"aed60164462ff4aad3d1ccc2bb37b5c27a9225c8b42a7b6bae829862ba0d12bd\" returns successfully" Apr 24 23:35:56.795900 containerd[1462]: time="2026-04-24T23:35:56.795848681Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-ccf7f6874-fxzbx,Uid:97bad9ab-01a6-4cd6-89bf-ca7dc9310125,Namespace:calico-system,Attempt:1,}" Apr 24 23:35:56.815472 containerd[1462]: time="2026-04-24T23:35:56.815324614Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-drdkn,Uid:31ae9065-a5e7-49a3-95e7-0d8d7d4e7efa,Namespace:calico-system,Attempt:0,} returns sandbox id \"4180f624404d2f5709af7d3e0abb4ff432671809d911d57eb7f919c6d52a2a29\"" Apr 24 
23:35:56.847848 containerd[1462]: time="2026-04-24T23:35:56.846900053Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.31.4\"" Apr 24 23:35:56.872823 containerd[1462]: 2026-04-24 23:35:56.630 [INFO][3833] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="b6969e3808d65953a8a46a1c82f16b6fe7a046267abad766b3fa13d9d36f41ef" Apr 24 23:35:56.872823 containerd[1462]: 2026-04-24 23:35:56.635 [INFO][3833] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="b6969e3808d65953a8a46a1c82f16b6fe7a046267abad766b3fa13d9d36f41ef" iface="eth0" netns="/var/run/netns/cni-cbfdd923-d3d1-acb9-f811-dc598c322e00" Apr 24 23:35:56.872823 containerd[1462]: 2026-04-24 23:35:56.636 [INFO][3833] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="b6969e3808d65953a8a46a1c82f16b6fe7a046267abad766b3fa13d9d36f41ef" iface="eth0" netns="/var/run/netns/cni-cbfdd923-d3d1-acb9-f811-dc598c322e00" Apr 24 23:35:56.872823 containerd[1462]: 2026-04-24 23:35:56.642 [INFO][3833] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="b6969e3808d65953a8a46a1c82f16b6fe7a046267abad766b3fa13d9d36f41ef" iface="eth0" netns="/var/run/netns/cni-cbfdd923-d3d1-acb9-f811-dc598c322e00" Apr 24 23:35:56.872823 containerd[1462]: 2026-04-24 23:35:56.642 [INFO][3833] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="b6969e3808d65953a8a46a1c82f16b6fe7a046267abad766b3fa13d9d36f41ef" Apr 24 23:35:56.872823 containerd[1462]: 2026-04-24 23:35:56.642 [INFO][3833] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="b6969e3808d65953a8a46a1c82f16b6fe7a046267abad766b3fa13d9d36f41ef" Apr 24 23:35:56.872823 containerd[1462]: 2026-04-24 23:35:56.819 [INFO][3921] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="b6969e3808d65953a8a46a1c82f16b6fe7a046267abad766b3fa13d9d36f41ef" HandleID="k8s-pod-network.b6969e3808d65953a8a46a1c82f16b6fe7a046267abad766b3fa13d9d36f41ef" Workload="ci--4081--3--6--n--0494d1f24d-k8s-calico--apiserver--ccf7f6874--6drmn-eth0" Apr 24 23:35:56.872823 containerd[1462]: 2026-04-24 23:35:56.820 [INFO][3921] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 24 23:35:56.872823 containerd[1462]: 2026-04-24 23:35:56.821 [INFO][3921] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 24 23:35:56.872823 containerd[1462]: 2026-04-24 23:35:56.849 [WARNING][3921] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="b6969e3808d65953a8a46a1c82f16b6fe7a046267abad766b3fa13d9d36f41ef" HandleID="k8s-pod-network.b6969e3808d65953a8a46a1c82f16b6fe7a046267abad766b3fa13d9d36f41ef" Workload="ci--4081--3--6--n--0494d1f24d-k8s-calico--apiserver--ccf7f6874--6drmn-eth0" Apr 24 23:35:56.872823 containerd[1462]: 2026-04-24 23:35:56.850 [INFO][3921] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="b6969e3808d65953a8a46a1c82f16b6fe7a046267abad766b3fa13d9d36f41ef" HandleID="k8s-pod-network.b6969e3808d65953a8a46a1c82f16b6fe7a046267abad766b3fa13d9d36f41ef" Workload="ci--4081--3--6--n--0494d1f24d-k8s-calico--apiserver--ccf7f6874--6drmn-eth0" Apr 24 23:35:56.872823 containerd[1462]: 2026-04-24 23:35:56.857 [INFO][3921] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 24 23:35:56.872823 containerd[1462]: 2026-04-24 23:35:56.866 [INFO][3833] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="b6969e3808d65953a8a46a1c82f16b6fe7a046267abad766b3fa13d9d36f41ef" Apr 24 23:35:56.873480 containerd[1462]: time="2026-04-24T23:35:56.872964685Z" level=info msg="TearDown network for sandbox \"b6969e3808d65953a8a46a1c82f16b6fe7a046267abad766b3fa13d9d36f41ef\" successfully" Apr 24 23:35:56.873480 containerd[1462]: time="2026-04-24T23:35:56.872999171Z" level=info msg="StopPodSandbox for \"b6969e3808d65953a8a46a1c82f16b6fe7a046267abad766b3fa13d9d36f41ef\" returns successfully" Apr 24 23:35:56.877730 containerd[1462]: time="2026-04-24T23:35:56.877438325Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-ccf7f6874-6drmn,Uid:6dca57dd-795d-4761-9e86-5a637e03b75c,Namespace:calico-system,Attempt:1,}" Apr 24 23:35:56.898542 containerd[1462]: 2026-04-24 23:35:56.733 [INFO][3824] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="a90bb52d53ba8c8ddab1d42c96b03c66cacab89a862d8f7f06d55932dea64798" Apr 24 23:35:56.898542 containerd[1462]: 2026-04-24 23:35:56.735 [INFO][3824] cni-plugin/dataplane_linux.go 559: Deleting workload's 
device in netns. ContainerID="a90bb52d53ba8c8ddab1d42c96b03c66cacab89a862d8f7f06d55932dea64798" iface="eth0" netns="/var/run/netns/cni-53e081dc-202c-8f4d-ea18-7e53549c1507" Apr 24 23:35:56.898542 containerd[1462]: 2026-04-24 23:35:56.735 [INFO][3824] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="a90bb52d53ba8c8ddab1d42c96b03c66cacab89a862d8f7f06d55932dea64798" iface="eth0" netns="/var/run/netns/cni-53e081dc-202c-8f4d-ea18-7e53549c1507" Apr 24 23:35:56.898542 containerd[1462]: 2026-04-24 23:35:56.735 [INFO][3824] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="a90bb52d53ba8c8ddab1d42c96b03c66cacab89a862d8f7f06d55932dea64798" iface="eth0" netns="/var/run/netns/cni-53e081dc-202c-8f4d-ea18-7e53549c1507" Apr 24 23:35:56.898542 containerd[1462]: 2026-04-24 23:35:56.735 [INFO][3824] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="a90bb52d53ba8c8ddab1d42c96b03c66cacab89a862d8f7f06d55932dea64798" Apr 24 23:35:56.898542 containerd[1462]: 2026-04-24 23:35:56.735 [INFO][3824] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="a90bb52d53ba8c8ddab1d42c96b03c66cacab89a862d8f7f06d55932dea64798" Apr 24 23:35:56.898542 containerd[1462]: 2026-04-24 23:35:56.821 [INFO][3943] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="a90bb52d53ba8c8ddab1d42c96b03c66cacab89a862d8f7f06d55932dea64798" HandleID="k8s-pod-network.a90bb52d53ba8c8ddab1d42c96b03c66cacab89a862d8f7f06d55932dea64798" Workload="ci--4081--3--6--n--0494d1f24d-k8s-whisker--7d4c795b57--vskxf-eth0" Apr 24 23:35:56.898542 containerd[1462]: 2026-04-24 23:35:56.822 [INFO][3943] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 24 23:35:56.898542 containerd[1462]: 2026-04-24 23:35:56.857 [INFO][3943] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Apr 24 23:35:56.898542 containerd[1462]: 2026-04-24 23:35:56.874 [WARNING][3943] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. Ignoring ContainerID="a90bb52d53ba8c8ddab1d42c96b03c66cacab89a862d8f7f06d55932dea64798" HandleID="k8s-pod-network.a90bb52d53ba8c8ddab1d42c96b03c66cacab89a862d8f7f06d55932dea64798" Workload="ci--4081--3--6--n--0494d1f24d-k8s-whisker--7d4c795b57--vskxf-eth0" Apr 24 23:35:56.898542 containerd[1462]: 2026-04-24 23:35:56.874 [INFO][3943] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="a90bb52d53ba8c8ddab1d42c96b03c66cacab89a862d8f7f06d55932dea64798" HandleID="k8s-pod-network.a90bb52d53ba8c8ddab1d42c96b03c66cacab89a862d8f7f06d55932dea64798" Workload="ci--4081--3--6--n--0494d1f24d-k8s-whisker--7d4c795b57--vskxf-eth0" Apr 24 23:35:56.898542 containerd[1462]: 2026-04-24 23:35:56.883 [INFO][3943] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 24 23:35:56.898542 containerd[1462]: 2026-04-24 23:35:56.893 [INFO][3824] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="a90bb52d53ba8c8ddab1d42c96b03c66cacab89a862d8f7f06d55932dea64798" Apr 24 23:35:56.900879 containerd[1462]: time="2026-04-24T23:35:56.900674783Z" level=info msg="TearDown network for sandbox \"a90bb52d53ba8c8ddab1d42c96b03c66cacab89a862d8f7f06d55932dea64798\" successfully" Apr 24 23:35:56.900879 containerd[1462]: time="2026-04-24T23:35:56.900746554Z" level=info msg="StopPodSandbox for \"a90bb52d53ba8c8ddab1d42c96b03c66cacab89a862d8f7f06d55932dea64798\" returns successfully" Apr 24 23:35:56.928847 containerd[1462]: 2026-04-24 23:35:56.691 [INFO][3795] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="9b6dd7052ebeffaaba1376421670ace1fee06ecc6705ec848ee5684f1ca0855d" Apr 24 23:35:56.928847 containerd[1462]: 2026-04-24 23:35:56.692 [INFO][3795] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. 
ContainerID="9b6dd7052ebeffaaba1376421670ace1fee06ecc6705ec848ee5684f1ca0855d" iface="eth0" netns="/var/run/netns/cni-c0c7db74-b213-6dd4-dcf4-3a6fdd352e4c" Apr 24 23:35:56.928847 containerd[1462]: 2026-04-24 23:35:56.692 [INFO][3795] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="9b6dd7052ebeffaaba1376421670ace1fee06ecc6705ec848ee5684f1ca0855d" iface="eth0" netns="/var/run/netns/cni-c0c7db74-b213-6dd4-dcf4-3a6fdd352e4c" Apr 24 23:35:56.928847 containerd[1462]: 2026-04-24 23:35:56.696 [INFO][3795] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="9b6dd7052ebeffaaba1376421670ace1fee06ecc6705ec848ee5684f1ca0855d" iface="eth0" netns="/var/run/netns/cni-c0c7db74-b213-6dd4-dcf4-3a6fdd352e4c" Apr 24 23:35:56.928847 containerd[1462]: 2026-04-24 23:35:56.696 [INFO][3795] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="9b6dd7052ebeffaaba1376421670ace1fee06ecc6705ec848ee5684f1ca0855d" Apr 24 23:35:56.928847 containerd[1462]: 2026-04-24 23:35:56.697 [INFO][3795] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="9b6dd7052ebeffaaba1376421670ace1fee06ecc6705ec848ee5684f1ca0855d" Apr 24 23:35:56.928847 containerd[1462]: 2026-04-24 23:35:56.832 [INFO][3936] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="9b6dd7052ebeffaaba1376421670ace1fee06ecc6705ec848ee5684f1ca0855d" HandleID="k8s-pod-network.9b6dd7052ebeffaaba1376421670ace1fee06ecc6705ec848ee5684f1ca0855d" Workload="ci--4081--3--6--n--0494d1f24d-k8s-goldmane--9f7667bb8--9xhf6-eth0" Apr 24 23:35:56.928847 containerd[1462]: 2026-04-24 23:35:56.832 [INFO][3936] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 24 23:35:56.928847 containerd[1462]: 2026-04-24 23:35:56.884 [INFO][3936] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Apr 24 23:35:56.928847 containerd[1462]: 2026-04-24 23:35:56.902 [WARNING][3936] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. Ignoring ContainerID="9b6dd7052ebeffaaba1376421670ace1fee06ecc6705ec848ee5684f1ca0855d" HandleID="k8s-pod-network.9b6dd7052ebeffaaba1376421670ace1fee06ecc6705ec848ee5684f1ca0855d" Workload="ci--4081--3--6--n--0494d1f24d-k8s-goldmane--9f7667bb8--9xhf6-eth0" Apr 24 23:35:56.928847 containerd[1462]: 2026-04-24 23:35:56.902 [INFO][3936] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="9b6dd7052ebeffaaba1376421670ace1fee06ecc6705ec848ee5684f1ca0855d" HandleID="k8s-pod-network.9b6dd7052ebeffaaba1376421670ace1fee06ecc6705ec848ee5684f1ca0855d" Workload="ci--4081--3--6--n--0494d1f24d-k8s-goldmane--9f7667bb8--9xhf6-eth0" Apr 24 23:35:56.928847 containerd[1462]: 2026-04-24 23:35:56.905 [INFO][3936] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 24 23:35:56.928847 containerd[1462]: 2026-04-24 23:35:56.911 [INFO][3795] cni-plugin/k8s.go 665: Teardown processing complete. 
ContainerID="9b6dd7052ebeffaaba1376421670ace1fee06ecc6705ec848ee5684f1ca0855d" Apr 24 23:35:56.929963 containerd[1462]: time="2026-04-24T23:35:56.929435929Z" level=info msg="TearDown network for sandbox \"9b6dd7052ebeffaaba1376421670ace1fee06ecc6705ec848ee5684f1ca0855d\" successfully" Apr 24 23:35:56.929963 containerd[1462]: time="2026-04-24T23:35:56.929478296Z" level=info msg="StopPodSandbox for \"9b6dd7052ebeffaaba1376421670ace1fee06ecc6705ec848ee5684f1ca0855d\" returns successfully" Apr 24 23:35:56.937487 containerd[1462]: time="2026-04-24T23:35:56.937355163Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-9f7667bb8-9xhf6,Uid:a67a5a23-28c6-44c6-8e44-b199e7391183,Namespace:calico-system,Attempt:1,}" Apr 24 23:35:56.948614 containerd[1462]: 2026-04-24 23:35:56.637 [INFO][3853] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="de9656bfe97f83441058c328d0d39fed1ee166d4fbc015e3ce2c151595256b72" Apr 24 23:35:56.948614 containerd[1462]: 2026-04-24 23:35:56.637 [INFO][3853] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="de9656bfe97f83441058c328d0d39fed1ee166d4fbc015e3ce2c151595256b72" iface="eth0" netns="/var/run/netns/cni-61f4a56c-323f-8342-ab24-3aa08f090bd8" Apr 24 23:35:56.948614 containerd[1462]: 2026-04-24 23:35:56.642 [INFO][3853] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="de9656bfe97f83441058c328d0d39fed1ee166d4fbc015e3ce2c151595256b72" iface="eth0" netns="/var/run/netns/cni-61f4a56c-323f-8342-ab24-3aa08f090bd8" Apr 24 23:35:56.948614 containerd[1462]: 2026-04-24 23:35:56.642 [INFO][3853] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="de9656bfe97f83441058c328d0d39fed1ee166d4fbc015e3ce2c151595256b72" iface="eth0" netns="/var/run/netns/cni-61f4a56c-323f-8342-ab24-3aa08f090bd8" Apr 24 23:35:56.948614 containerd[1462]: 2026-04-24 23:35:56.642 [INFO][3853] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="de9656bfe97f83441058c328d0d39fed1ee166d4fbc015e3ce2c151595256b72" Apr 24 23:35:56.948614 containerd[1462]: 2026-04-24 23:35:56.642 [INFO][3853] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="de9656bfe97f83441058c328d0d39fed1ee166d4fbc015e3ce2c151595256b72" Apr 24 23:35:56.948614 containerd[1462]: 2026-04-24 23:35:56.863 [INFO][3911] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="de9656bfe97f83441058c328d0d39fed1ee166d4fbc015e3ce2c151595256b72" HandleID="k8s-pod-network.de9656bfe97f83441058c328d0d39fed1ee166d4fbc015e3ce2c151595256b72" Workload="ci--4081--3--6--n--0494d1f24d-k8s-coredns--7d764666f9--lnj9t-eth0" Apr 24 23:35:56.948614 containerd[1462]: 2026-04-24 23:35:56.863 [INFO][3911] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 24 23:35:56.948614 containerd[1462]: 2026-04-24 23:35:56.906 [INFO][3911] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 24 23:35:56.948614 containerd[1462]: 2026-04-24 23:35:56.925 [WARNING][3911] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="de9656bfe97f83441058c328d0d39fed1ee166d4fbc015e3ce2c151595256b72" HandleID="k8s-pod-network.de9656bfe97f83441058c328d0d39fed1ee166d4fbc015e3ce2c151595256b72" Workload="ci--4081--3--6--n--0494d1f24d-k8s-coredns--7d764666f9--lnj9t-eth0" Apr 24 23:35:56.948614 containerd[1462]: 2026-04-24 23:35:56.925 [INFO][3911] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="de9656bfe97f83441058c328d0d39fed1ee166d4fbc015e3ce2c151595256b72" HandleID="k8s-pod-network.de9656bfe97f83441058c328d0d39fed1ee166d4fbc015e3ce2c151595256b72" Workload="ci--4081--3--6--n--0494d1f24d-k8s-coredns--7d764666f9--lnj9t-eth0" Apr 24 23:35:56.948614 containerd[1462]: 2026-04-24 23:35:56.929 [INFO][3911] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 24 23:35:56.948614 containerd[1462]: 2026-04-24 23:35:56.935 [INFO][3853] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="de9656bfe97f83441058c328d0d39fed1ee166d4fbc015e3ce2c151595256b72" Apr 24 23:35:56.949572 containerd[1462]: time="2026-04-24T23:35:56.949109974Z" level=info msg="TearDown network for sandbox \"de9656bfe97f83441058c328d0d39fed1ee166d4fbc015e3ce2c151595256b72\" successfully" Apr 24 23:35:56.949572 containerd[1462]: time="2026-04-24T23:35:56.949155981Z" level=info msg="StopPodSandbox for \"de9656bfe97f83441058c328d0d39fed1ee166d4fbc015e3ce2c151595256b72\" returns successfully" Apr 24 23:35:56.954411 containerd[1462]: time="2026-04-24T23:35:56.953157425Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7d764666f9-lnj9t,Uid:54520bb9-f644-497d-8cfe-b9f4367c3def,Namespace:kube-system,Attempt:1,}" Apr 24 23:35:56.987181 containerd[1462]: 2026-04-24 23:35:56.588 [INFO][3834] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="1554c7889caa25fe939ea72e64cd14f57148d1aafeb01bde86f7a6e9f10b0b23" Apr 24 23:35:56.987181 containerd[1462]: 2026-04-24 23:35:56.590 [INFO][3834] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. 
ContainerID="1554c7889caa25fe939ea72e64cd14f57148d1aafeb01bde86f7a6e9f10b0b23" iface="eth0" netns="/var/run/netns/cni-cedbc0e2-3f44-dc2f-290c-1a91696b6e51" Apr 24 23:35:56.987181 containerd[1462]: 2026-04-24 23:35:56.591 [INFO][3834] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="1554c7889caa25fe939ea72e64cd14f57148d1aafeb01bde86f7a6e9f10b0b23" iface="eth0" netns="/var/run/netns/cni-cedbc0e2-3f44-dc2f-290c-1a91696b6e51" Apr 24 23:35:56.987181 containerd[1462]: 2026-04-24 23:35:56.593 [INFO][3834] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="1554c7889caa25fe939ea72e64cd14f57148d1aafeb01bde86f7a6e9f10b0b23" iface="eth0" netns="/var/run/netns/cni-cedbc0e2-3f44-dc2f-290c-1a91696b6e51" Apr 24 23:35:56.987181 containerd[1462]: 2026-04-24 23:35:56.593 [INFO][3834] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="1554c7889caa25fe939ea72e64cd14f57148d1aafeb01bde86f7a6e9f10b0b23" Apr 24 23:35:56.987181 containerd[1462]: 2026-04-24 23:35:56.593 [INFO][3834] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="1554c7889caa25fe939ea72e64cd14f57148d1aafeb01bde86f7a6e9f10b0b23" Apr 24 23:35:56.987181 containerd[1462]: 2026-04-24 23:35:56.871 [INFO][3898] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="1554c7889caa25fe939ea72e64cd14f57148d1aafeb01bde86f7a6e9f10b0b23" HandleID="k8s-pod-network.1554c7889caa25fe939ea72e64cd14f57148d1aafeb01bde86f7a6e9f10b0b23" Workload="ci--4081--3--6--n--0494d1f24d-k8s-calico--kube--controllers--54d98b476c--cxbz2-eth0" Apr 24 23:35:56.987181 containerd[1462]: 2026-04-24 23:35:56.871 [INFO][3898] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 24 23:35:56.987181 containerd[1462]: 2026-04-24 23:35:56.932 [INFO][3898] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Apr 24 23:35:56.987181 containerd[1462]: 2026-04-24 23:35:56.963 [WARNING][3898] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. Ignoring ContainerID="1554c7889caa25fe939ea72e64cd14f57148d1aafeb01bde86f7a6e9f10b0b23" HandleID="k8s-pod-network.1554c7889caa25fe939ea72e64cd14f57148d1aafeb01bde86f7a6e9f10b0b23" Workload="ci--4081--3--6--n--0494d1f24d-k8s-calico--kube--controllers--54d98b476c--cxbz2-eth0" Apr 24 23:35:56.987181 containerd[1462]: 2026-04-24 23:35:56.964 [INFO][3898] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="1554c7889caa25fe939ea72e64cd14f57148d1aafeb01bde86f7a6e9f10b0b23" HandleID="k8s-pod-network.1554c7889caa25fe939ea72e64cd14f57148d1aafeb01bde86f7a6e9f10b0b23" Workload="ci--4081--3--6--n--0494d1f24d-k8s-calico--kube--controllers--54d98b476c--cxbz2-eth0" Apr 24 23:35:56.987181 containerd[1462]: 2026-04-24 23:35:56.977 [INFO][3898] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 24 23:35:56.987181 containerd[1462]: 2026-04-24 23:35:56.984 [INFO][3834] cni-plugin/k8s.go 665: Teardown processing complete. 
ContainerID="1554c7889caa25fe939ea72e64cd14f57148d1aafeb01bde86f7a6e9f10b0b23" Apr 24 23:35:56.988653 containerd[1462]: time="2026-04-24T23:35:56.988608928Z" level=info msg="TearDown network for sandbox \"1554c7889caa25fe939ea72e64cd14f57148d1aafeb01bde86f7a6e9f10b0b23\" successfully" Apr 24 23:35:56.988653 containerd[1462]: time="2026-04-24T23:35:56.988644933Z" level=info msg="StopPodSandbox for \"1554c7889caa25fe939ea72e64cd14f57148d1aafeb01bde86f7a6e9f10b0b23\" returns successfully" Apr 24 23:35:56.991804 containerd[1462]: time="2026-04-24T23:35:56.991750113Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-54d98b476c-cxbz2,Uid:ad82956a-a9d5-482d-9feb-ead05df18123,Namespace:calico-system,Attempt:1,}" Apr 24 23:35:57.123103 kubelet[2581]: I0424 23:35:57.115107 2581 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"kubernetes.io/configmap/318078f6-c2fc-4219-8c64-82b649d6416b-whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/318078f6-c2fc-4219-8c64-82b649d6416b-whisker-ca-bundle\") pod \"318078f6-c2fc-4219-8c64-82b649d6416b\" (UID: \"318078f6-c2fc-4219-8c64-82b649d6416b\") " Apr 24 23:35:57.123103 kubelet[2581]: I0424 23:35:57.115227 2581 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"kubernetes.io/configmap/318078f6-c2fc-4219-8c64-82b649d6416b-nginx-config\" (UniqueName: \"kubernetes.io/configmap/318078f6-c2fc-4219-8c64-82b649d6416b-nginx-config\") pod \"318078f6-c2fc-4219-8c64-82b649d6416b\" (UID: \"318078f6-c2fc-4219-8c64-82b649d6416b\") " Apr 24 23:35:57.123103 kubelet[2581]: I0424 23:35:57.115255 2581 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"kubernetes.io/secret/318078f6-c2fc-4219-8c64-82b649d6416b-whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/318078f6-c2fc-4219-8c64-82b649d6416b-whisker-backend-key-pair\") pod \"318078f6-c2fc-4219-8c64-82b649d6416b\" (UID: \"318078f6-c2fc-4219-8c64-82b649d6416b\") " 
Apr 24 23:35:57.123103 kubelet[2581]: I0424 23:35:57.115300 2581 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"kubernetes.io/projected/318078f6-c2fc-4219-8c64-82b649d6416b-kube-api-access-g9lkt\" (UniqueName: \"kubernetes.io/projected/318078f6-c2fc-4219-8c64-82b649d6416b-kube-api-access-g9lkt\") pod \"318078f6-c2fc-4219-8c64-82b649d6416b\" (UID: \"318078f6-c2fc-4219-8c64-82b649d6416b\") " Apr 24 23:35:57.123103 kubelet[2581]: I0424 23:35:57.122936 2581 operation_generator.go:779] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/318078f6-c2fc-4219-8c64-82b649d6416b-nginx-config" pod "318078f6-c2fc-4219-8c64-82b649d6416b" (UID: "318078f6-c2fc-4219-8c64-82b649d6416b"). InnerVolumeSpecName "nginx-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 23:35:57.125615 kubelet[2581]: I0424 23:35:57.123799 2581 operation_generator.go:779] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/318078f6-c2fc-4219-8c64-82b649d6416b-whisker-ca-bundle" pod "318078f6-c2fc-4219-8c64-82b649d6416b" (UID: "318078f6-c2fc-4219-8c64-82b649d6416b"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 23:35:57.130411 systemd[1]: run-netns-cni\x2da444dbaf\x2d1d7a\x2d63cb\x2d37fa\x2d901ebd0c7235.mount: Deactivated successfully. Apr 24 23:35:57.130493 systemd[1]: run-netns-cni\x2dcedbc0e2\x2d3f44\x2ddc2f\x2d290c\x2d1a91696b6e51.mount: Deactivated successfully. Apr 24 23:35:57.130541 systemd[1]: run-netns-cni\x2d53e081dc\x2d202c\x2d8f4d\x2dea18\x2d7e53549c1507.mount: Deactivated successfully. Apr 24 23:35:57.130586 systemd[1]: run-netns-cni\x2dcbfdd923\x2dd3d1\x2dacb9\x2df811\x2ddc598c322e00.mount: Deactivated successfully. Apr 24 23:35:57.130630 systemd[1]: run-netns-cni\x2dc0c7db74\x2db213\x2d6dd4\x2ddcf4\x2d3a6fdd352e4c.mount: Deactivated successfully. 
Apr 24 23:35:57.130670 systemd[1]: run-netns-cni\x2d61f4a56c\x2d323f\x2d8342\x2dab24\x2d3aa08f090bd8.mount: Deactivated successfully. Apr 24 23:35:57.144682 systemd[1]: var-lib-kubelet-pods-318078f6\x2dc2fc\x2d4219\x2d8c64\x2d82b649d6416b-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dg9lkt.mount: Deactivated successfully. Apr 24 23:35:57.155021 systemd[1]: var-lib-kubelet-pods-318078f6\x2dc2fc\x2d4219\x2d8c64\x2d82b649d6416b-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Apr 24 23:35:57.181464 kubelet[2581]: I0424 23:35:57.181413 2581 operation_generator.go:779] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/318078f6-c2fc-4219-8c64-82b649d6416b-kube-api-access-g9lkt" pod "318078f6-c2fc-4219-8c64-82b649d6416b" (UID: "318078f6-c2fc-4219-8c64-82b649d6416b"). InnerVolumeSpecName "kube-api-access-g9lkt". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 23:35:57.217031 kubelet[2581]: I0424 23:35:57.216878 2581 reconciler_common.go:299] "Volume detached for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/318078f6-c2fc-4219-8c64-82b649d6416b-nginx-config\") on node \"ci-4081-3-6-n-0494d1f24d\" DevicePath \"\"" Apr 24 23:35:57.217031 kubelet[2581]: I0424 23:35:57.216919 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-g9lkt\" (UniqueName: \"kubernetes.io/projected/318078f6-c2fc-4219-8c64-82b649d6416b-kube-api-access-g9lkt\") on node \"ci-4081-3-6-n-0494d1f24d\" DevicePath \"\"" Apr 24 23:35:57.217031 kubelet[2581]: I0424 23:35:57.216951 2581 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/318078f6-c2fc-4219-8c64-82b649d6416b-whisker-ca-bundle\") on node \"ci-4081-3-6-n-0494d1f24d\" DevicePath \"\"" Apr 24 23:35:57.262753 kubelet[2581]: I0424 23:35:57.262670 2581 operation_generator.go:779] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/318078f6-c2fc-4219-8c64-82b649d6416b-whisker-backend-key-pair" pod "318078f6-c2fc-4219-8c64-82b649d6416b" (UID: "318078f6-c2fc-4219-8c64-82b649d6416b"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 23:35:57.314564 systemd-networkd[1386]: cali1595325ce6f: Link UP Apr 24 23:35:57.316446 systemd-networkd[1386]: cali1595325ce6f: Gained carrier Apr 24 23:35:57.321606 kubelet[2581]: I0424 23:35:57.321556 2581 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/318078f6-c2fc-4219-8c64-82b649d6416b-whisker-backend-key-pair\") on node \"ci-4081-3-6-n-0494d1f24d\" DevicePath \"\"" Apr 24 23:35:57.373009 kubelet[2581]: I0424 23:35:57.371524 2581 prober_manager.go:356] "Failed to trigger a manual run" probe="Readiness" Apr 24 23:35:57.379735 containerd[1462]: 2026-04-24 23:35:56.933 [ERROR][3972] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Apr 24 23:35:57.379735 containerd[1462]: 2026-04-24 23:35:56.987 [INFO][3972] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081--3--6--n--0494d1f24d-k8s-calico--apiserver--ccf7f6874--fxzbx-eth0 calico-apiserver-ccf7f6874- calico-system 97bad9ab-01a6-4cd6-89bf-ca7dc9310125 911 0 2026-04-24 23:35:38 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:ccf7f6874 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4081-3-6-n-0494d1f24d calico-apiserver-ccf7f6874-fxzbx eth0 calico-apiserver [] [] [kns.calico-system ksa.calico-system.calico-apiserver] cali1595325ce6f [] [] }} 
ContainerID="dc00f0a26827cfe71c3f0ffffe67d49ca3cc816a023700ee6573bc9e96ca342a" Namespace="calico-system" Pod="calico-apiserver-ccf7f6874-fxzbx" WorkloadEndpoint="ci--4081--3--6--n--0494d1f24d-k8s-calico--apiserver--ccf7f6874--fxzbx-" Apr 24 23:35:57.379735 containerd[1462]: 2026-04-24 23:35:56.988 [INFO][3972] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="dc00f0a26827cfe71c3f0ffffe67d49ca3cc816a023700ee6573bc9e96ca342a" Namespace="calico-system" Pod="calico-apiserver-ccf7f6874-fxzbx" WorkloadEndpoint="ci--4081--3--6--n--0494d1f24d-k8s-calico--apiserver--ccf7f6874--fxzbx-eth0" Apr 24 23:35:57.379735 containerd[1462]: 2026-04-24 23:35:57.111 [INFO][4015] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="dc00f0a26827cfe71c3f0ffffe67d49ca3cc816a023700ee6573bc9e96ca342a" HandleID="k8s-pod-network.dc00f0a26827cfe71c3f0ffffe67d49ca3cc816a023700ee6573bc9e96ca342a" Workload="ci--4081--3--6--n--0494d1f24d-k8s-calico--apiserver--ccf7f6874--fxzbx-eth0" Apr 24 23:35:57.379735 containerd[1462]: 2026-04-24 23:35:57.168 [INFO][4015] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="dc00f0a26827cfe71c3f0ffffe67d49ca3cc816a023700ee6573bc9e96ca342a" HandleID="k8s-pod-network.dc00f0a26827cfe71c3f0ffffe67d49ca3cc816a023700ee6573bc9e96ca342a" Workload="ci--4081--3--6--n--0494d1f24d-k8s-calico--apiserver--ccf7f6874--fxzbx-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002f8550), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4081-3-6-n-0494d1f24d", "pod":"calico-apiserver-ccf7f6874-fxzbx", "timestamp":"2026-04-24 23:35:57.111346957 +0000 UTC"}, Hostname:"ci-4081-3-6-n-0494d1f24d", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x40005126e0)} Apr 24 23:35:57.379735 containerd[1462]: 2026-04-24 23:35:57.168 
[INFO][4015] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 24 23:35:57.379735 containerd[1462]: 2026-04-24 23:35:57.168 [INFO][4015] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 24 23:35:57.379735 containerd[1462]: 2026-04-24 23:35:57.168 [INFO][4015] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081-3-6-n-0494d1f24d' Apr 24 23:35:57.379735 containerd[1462]: 2026-04-24 23:35:57.174 [INFO][4015] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.dc00f0a26827cfe71c3f0ffffe67d49ca3cc816a023700ee6573bc9e96ca342a" host="ci-4081-3-6-n-0494d1f24d" Apr 24 23:35:57.379735 containerd[1462]: 2026-04-24 23:35:57.208 [INFO][4015] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4081-3-6-n-0494d1f24d" Apr 24 23:35:57.379735 containerd[1462]: 2026-04-24 23:35:57.223 [INFO][4015] ipam/ipam.go 526: Trying affinity for 192.168.74.64/26 host="ci-4081-3-6-n-0494d1f24d" Apr 24 23:35:57.379735 containerd[1462]: 2026-04-24 23:35:57.227 [INFO][4015] ipam/ipam.go 160: Attempting to load block cidr=192.168.74.64/26 host="ci-4081-3-6-n-0494d1f24d" Apr 24 23:35:57.379735 containerd[1462]: 2026-04-24 23:35:57.234 [INFO][4015] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.74.64/26 host="ci-4081-3-6-n-0494d1f24d" Apr 24 23:35:57.379735 containerd[1462]: 2026-04-24 23:35:57.234 [INFO][4015] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.74.64/26 handle="k8s-pod-network.dc00f0a26827cfe71c3f0ffffe67d49ca3cc816a023700ee6573bc9e96ca342a" host="ci-4081-3-6-n-0494d1f24d" Apr 24 23:35:57.379735 containerd[1462]: 2026-04-24 23:35:57.239 [INFO][4015] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.dc00f0a26827cfe71c3f0ffffe67d49ca3cc816a023700ee6573bc9e96ca342a Apr 24 23:35:57.379735 containerd[1462]: 2026-04-24 23:35:57.258 [INFO][4015] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.74.64/26 
handle="k8s-pod-network.dc00f0a26827cfe71c3f0ffffe67d49ca3cc816a023700ee6573bc9e96ca342a" host="ci-4081-3-6-n-0494d1f24d" Apr 24 23:35:57.379735 containerd[1462]: 2026-04-24 23:35:57.268 [INFO][4015] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.74.66/26] block=192.168.74.64/26 handle="k8s-pod-network.dc00f0a26827cfe71c3f0ffffe67d49ca3cc816a023700ee6573bc9e96ca342a" host="ci-4081-3-6-n-0494d1f24d" Apr 24 23:35:57.379735 containerd[1462]: 2026-04-24 23:35:57.268 [INFO][4015] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.74.66/26] handle="k8s-pod-network.dc00f0a26827cfe71c3f0ffffe67d49ca3cc816a023700ee6573bc9e96ca342a" host="ci-4081-3-6-n-0494d1f24d" Apr 24 23:35:57.379735 containerd[1462]: 2026-04-24 23:35:57.268 [INFO][4015] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 24 23:35:57.379735 containerd[1462]: 2026-04-24 23:35:57.268 [INFO][4015] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.74.66/26] IPv6=[] ContainerID="dc00f0a26827cfe71c3f0ffffe67d49ca3cc816a023700ee6573bc9e96ca342a" HandleID="k8s-pod-network.dc00f0a26827cfe71c3f0ffffe67d49ca3cc816a023700ee6573bc9e96ca342a" Workload="ci--4081--3--6--n--0494d1f24d-k8s-calico--apiserver--ccf7f6874--fxzbx-eth0" Apr 24 23:35:57.382806 containerd[1462]: 2026-04-24 23:35:57.280 [INFO][3972] cni-plugin/k8s.go 418: Populated endpoint ContainerID="dc00f0a26827cfe71c3f0ffffe67d49ca3cc816a023700ee6573bc9e96ca342a" Namespace="calico-system" Pod="calico-apiserver-ccf7f6874-fxzbx" WorkloadEndpoint="ci--4081--3--6--n--0494d1f24d-k8s-calico--apiserver--ccf7f6874--fxzbx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--0494d1f24d-k8s-calico--apiserver--ccf7f6874--fxzbx-eth0", GenerateName:"calico-apiserver-ccf7f6874-", Namespace:"calico-system", SelfLink:"", UID:"97bad9ab-01a6-4cd6-89bf-ca7dc9310125", ResourceVersion:"911", Generation:0, 
CreationTimestamp:time.Date(2026, time.April, 24, 23, 35, 38, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"ccf7f6874", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-0494d1f24d", ContainerID:"", Pod:"calico-apiserver-ccf7f6874-fxzbx", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.74.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali1595325ce6f", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 24 23:35:57.382806 containerd[1462]: 2026-04-24 23:35:57.280 [INFO][3972] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.74.66/32] ContainerID="dc00f0a26827cfe71c3f0ffffe67d49ca3cc816a023700ee6573bc9e96ca342a" Namespace="calico-system" Pod="calico-apiserver-ccf7f6874-fxzbx" WorkloadEndpoint="ci--4081--3--6--n--0494d1f24d-k8s-calico--apiserver--ccf7f6874--fxzbx-eth0" Apr 24 23:35:57.382806 containerd[1462]: 2026-04-24 23:35:57.280 [INFO][3972] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali1595325ce6f ContainerID="dc00f0a26827cfe71c3f0ffffe67d49ca3cc816a023700ee6573bc9e96ca342a" Namespace="calico-system" Pod="calico-apiserver-ccf7f6874-fxzbx" WorkloadEndpoint="ci--4081--3--6--n--0494d1f24d-k8s-calico--apiserver--ccf7f6874--fxzbx-eth0" Apr 24 23:35:57.382806 containerd[1462]: 2026-04-24 23:35:57.316 [INFO][3972] 
cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="dc00f0a26827cfe71c3f0ffffe67d49ca3cc816a023700ee6573bc9e96ca342a" Namespace="calico-system" Pod="calico-apiserver-ccf7f6874-fxzbx" WorkloadEndpoint="ci--4081--3--6--n--0494d1f24d-k8s-calico--apiserver--ccf7f6874--fxzbx-eth0" Apr 24 23:35:57.382806 containerd[1462]: 2026-04-24 23:35:57.330 [INFO][3972] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="dc00f0a26827cfe71c3f0ffffe67d49ca3cc816a023700ee6573bc9e96ca342a" Namespace="calico-system" Pod="calico-apiserver-ccf7f6874-fxzbx" WorkloadEndpoint="ci--4081--3--6--n--0494d1f24d-k8s-calico--apiserver--ccf7f6874--fxzbx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--0494d1f24d-k8s-calico--apiserver--ccf7f6874--fxzbx-eth0", GenerateName:"calico-apiserver-ccf7f6874-", Namespace:"calico-system", SelfLink:"", UID:"97bad9ab-01a6-4cd6-89bf-ca7dc9310125", ResourceVersion:"911", Generation:0, CreationTimestamp:time.Date(2026, time.April, 24, 23, 35, 38, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"ccf7f6874", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-0494d1f24d", ContainerID:"dc00f0a26827cfe71c3f0ffffe67d49ca3cc816a023700ee6573bc9e96ca342a", Pod:"calico-apiserver-ccf7f6874-fxzbx", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.74.66/32"}, 
IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali1595325ce6f", MAC:"3e:91:e0:8c:d5:9e", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 24 23:35:57.382806 containerd[1462]: 2026-04-24 23:35:57.367 [INFO][3972] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="dc00f0a26827cfe71c3f0ffffe67d49ca3cc816a023700ee6573bc9e96ca342a" Namespace="calico-system" Pod="calico-apiserver-ccf7f6874-fxzbx" WorkloadEndpoint="ci--4081--3--6--n--0494d1f24d-k8s-calico--apiserver--ccf7f6874--fxzbx-eth0" Apr 24 23:35:57.382243 systemd[1]: Removed slice kubepods-besteffort-pod318078f6_c2fc_4219_8c64_82b649d6416b.slice - libcontainer container kubepods-besteffort-pod318078f6_c2fc_4219_8c64_82b649d6416b.slice. Apr 24 23:35:57.435263 systemd-networkd[1386]: cali104746d9e98: Link UP Apr 24 23:35:57.439849 systemd-networkd[1386]: cali104746d9e98: Gained carrier Apr 24 23:35:57.452307 containerd[1462]: time="2026-04-24T23:35:57.452178907Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Apr 24 23:35:57.452307 containerd[1462]: time="2026-04-24T23:35:57.452249358Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Apr 24 23:35:57.452307 containerd[1462]: time="2026-04-24T23:35:57.452276842Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 24 23:35:57.453633 containerd[1462]: time="2026-04-24T23:35:57.452372297Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 24 23:35:57.493305 systemd[1]: Started cri-containerd-dc00f0a26827cfe71c3f0ffffe67d49ca3cc816a023700ee6573bc9e96ca342a.scope - libcontainer container dc00f0a26827cfe71c3f0ffffe67d49ca3cc816a023700ee6573bc9e96ca342a. Apr 24 23:35:57.526592 containerd[1462]: 2026-04-24 23:35:56.958 [ERROR][3985] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Apr 24 23:35:57.526592 containerd[1462]: 2026-04-24 23:35:56.996 [INFO][3985] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081--3--6--n--0494d1f24d-k8s-calico--apiserver--ccf7f6874--6drmn-eth0 calico-apiserver-ccf7f6874- calico-system 6dca57dd-795d-4761-9e86-5a637e03b75c 913 0 2026-04-24 23:35:38 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:ccf7f6874 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4081-3-6-n-0494d1f24d calico-apiserver-ccf7f6874-6drmn eth0 calico-apiserver [] [] [kns.calico-system ksa.calico-system.calico-apiserver] cali104746d9e98 [] [] }} ContainerID="eac367f59f2b92f92f741540bd5453ffad23e2c97f71da941b239662f1b93292" Namespace="calico-system" Pod="calico-apiserver-ccf7f6874-6drmn" WorkloadEndpoint="ci--4081--3--6--n--0494d1f24d-k8s-calico--apiserver--ccf7f6874--6drmn-" Apr 24 23:35:57.526592 containerd[1462]: 2026-04-24 23:35:56.996 [INFO][3985] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="eac367f59f2b92f92f741540bd5453ffad23e2c97f71da941b239662f1b93292" Namespace="calico-system" Pod="calico-apiserver-ccf7f6874-6drmn" WorkloadEndpoint="ci--4081--3--6--n--0494d1f24d-k8s-calico--apiserver--ccf7f6874--6drmn-eth0" Apr 24 23:35:57.526592 containerd[1462]: 
2026-04-24 23:35:57.132 [INFO][4028] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="eac367f59f2b92f92f741540bd5453ffad23e2c97f71da941b239662f1b93292" HandleID="k8s-pod-network.eac367f59f2b92f92f741540bd5453ffad23e2c97f71da941b239662f1b93292" Workload="ci--4081--3--6--n--0494d1f24d-k8s-calico--apiserver--ccf7f6874--6drmn-eth0" Apr 24 23:35:57.526592 containerd[1462]: 2026-04-24 23:35:57.179 [INFO][4028] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="eac367f59f2b92f92f741540bd5453ffad23e2c97f71da941b239662f1b93292" HandleID="k8s-pod-network.eac367f59f2b92f92f741540bd5453ffad23e2c97f71da941b239662f1b93292" Workload="ci--4081--3--6--n--0494d1f24d-k8s-calico--apiserver--ccf7f6874--6drmn-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40004ffc50), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4081-3-6-n-0494d1f24d", "pod":"calico-apiserver-ccf7f6874-6drmn", "timestamp":"2026-04-24 23:35:57.132453239 +0000 UTC"}, Hostname:"ci-4081-3-6-n-0494d1f24d", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x400047e420)} Apr 24 23:35:57.526592 containerd[1462]: 2026-04-24 23:35:57.180 [INFO][4028] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 24 23:35:57.526592 containerd[1462]: 2026-04-24 23:35:57.270 [INFO][4028] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Apr 24 23:35:57.526592 containerd[1462]: 2026-04-24 23:35:57.270 [INFO][4028] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081-3-6-n-0494d1f24d' Apr 24 23:35:57.526592 containerd[1462]: 2026-04-24 23:35:57.279 [INFO][4028] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.eac367f59f2b92f92f741540bd5453ffad23e2c97f71da941b239662f1b93292" host="ci-4081-3-6-n-0494d1f24d" Apr 24 23:35:57.526592 containerd[1462]: 2026-04-24 23:35:57.316 [INFO][4028] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4081-3-6-n-0494d1f24d" Apr 24 23:35:57.526592 containerd[1462]: 2026-04-24 23:35:57.334 [INFO][4028] ipam/ipam.go 526: Trying affinity for 192.168.74.64/26 host="ci-4081-3-6-n-0494d1f24d" Apr 24 23:35:57.526592 containerd[1462]: 2026-04-24 23:35:57.342 [INFO][4028] ipam/ipam.go 160: Attempting to load block cidr=192.168.74.64/26 host="ci-4081-3-6-n-0494d1f24d" Apr 24 23:35:57.526592 containerd[1462]: 2026-04-24 23:35:57.351 [INFO][4028] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.74.64/26 host="ci-4081-3-6-n-0494d1f24d" Apr 24 23:35:57.526592 containerd[1462]: 2026-04-24 23:35:57.352 [INFO][4028] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.74.64/26 handle="k8s-pod-network.eac367f59f2b92f92f741540bd5453ffad23e2c97f71da941b239662f1b93292" host="ci-4081-3-6-n-0494d1f24d" Apr 24 23:35:57.526592 containerd[1462]: 2026-04-24 23:35:57.355 [INFO][4028] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.eac367f59f2b92f92f741540bd5453ffad23e2c97f71da941b239662f1b93292 Apr 24 23:35:57.526592 containerd[1462]: 2026-04-24 23:35:57.383 [INFO][4028] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.74.64/26 handle="k8s-pod-network.eac367f59f2b92f92f741540bd5453ffad23e2c97f71da941b239662f1b93292" host="ci-4081-3-6-n-0494d1f24d" Apr 24 23:35:57.526592 containerd[1462]: 2026-04-24 23:35:57.409 [INFO][4028] ipam/ipam.go 1288: 
Successfully claimed IPs: [192.168.74.67/26] block=192.168.74.64/26 handle="k8s-pod-network.eac367f59f2b92f92f741540bd5453ffad23e2c97f71da941b239662f1b93292" host="ci-4081-3-6-n-0494d1f24d" Apr 24 23:35:57.526592 containerd[1462]: 2026-04-24 23:35:57.409 [INFO][4028] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.74.67/26] handle="k8s-pod-network.eac367f59f2b92f92f741540bd5453ffad23e2c97f71da941b239662f1b93292" host="ci-4081-3-6-n-0494d1f24d" Apr 24 23:35:57.526592 containerd[1462]: 2026-04-24 23:35:57.409 [INFO][4028] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 24 23:35:57.526592 containerd[1462]: 2026-04-24 23:35:57.409 [INFO][4028] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.74.67/26] IPv6=[] ContainerID="eac367f59f2b92f92f741540bd5453ffad23e2c97f71da941b239662f1b93292" HandleID="k8s-pod-network.eac367f59f2b92f92f741540bd5453ffad23e2c97f71da941b239662f1b93292" Workload="ci--4081--3--6--n--0494d1f24d-k8s-calico--apiserver--ccf7f6874--6drmn-eth0" Apr 24 23:35:57.532371 containerd[1462]: 2026-04-24 23:35:57.416 [INFO][3985] cni-plugin/k8s.go 418: Populated endpoint ContainerID="eac367f59f2b92f92f741540bd5453ffad23e2c97f71da941b239662f1b93292" Namespace="calico-system" Pod="calico-apiserver-ccf7f6874-6drmn" WorkloadEndpoint="ci--4081--3--6--n--0494d1f24d-k8s-calico--apiserver--ccf7f6874--6drmn-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--0494d1f24d-k8s-calico--apiserver--ccf7f6874--6drmn-eth0", GenerateName:"calico-apiserver-ccf7f6874-", Namespace:"calico-system", SelfLink:"", UID:"6dca57dd-795d-4761-9e86-5a637e03b75c", ResourceVersion:"913", Generation:0, CreationTimestamp:time.Date(2026, time.April, 24, 23, 35, 38, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", 
"k8s-app":"calico-apiserver", "pod-template-hash":"ccf7f6874", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-0494d1f24d", ContainerID:"", Pod:"calico-apiserver-ccf7f6874-6drmn", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.74.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali104746d9e98", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 24 23:35:57.532371 containerd[1462]: 2026-04-24 23:35:57.416 [INFO][3985] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.74.67/32] ContainerID="eac367f59f2b92f92f741540bd5453ffad23e2c97f71da941b239662f1b93292" Namespace="calico-system" Pod="calico-apiserver-ccf7f6874-6drmn" WorkloadEndpoint="ci--4081--3--6--n--0494d1f24d-k8s-calico--apiserver--ccf7f6874--6drmn-eth0" Apr 24 23:35:57.532371 containerd[1462]: 2026-04-24 23:35:57.416 [INFO][3985] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali104746d9e98 ContainerID="eac367f59f2b92f92f741540bd5453ffad23e2c97f71da941b239662f1b93292" Namespace="calico-system" Pod="calico-apiserver-ccf7f6874-6drmn" WorkloadEndpoint="ci--4081--3--6--n--0494d1f24d-k8s-calico--apiserver--ccf7f6874--6drmn-eth0" Apr 24 23:35:57.532371 containerd[1462]: 2026-04-24 23:35:57.453 [INFO][3985] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="eac367f59f2b92f92f741540bd5453ffad23e2c97f71da941b239662f1b93292" Namespace="calico-system" Pod="calico-apiserver-ccf7f6874-6drmn" 
WorkloadEndpoint="ci--4081--3--6--n--0494d1f24d-k8s-calico--apiserver--ccf7f6874--6drmn-eth0" Apr 24 23:35:57.532371 containerd[1462]: 2026-04-24 23:35:57.468 [INFO][3985] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="eac367f59f2b92f92f741540bd5453ffad23e2c97f71da941b239662f1b93292" Namespace="calico-system" Pod="calico-apiserver-ccf7f6874-6drmn" WorkloadEndpoint="ci--4081--3--6--n--0494d1f24d-k8s-calico--apiserver--ccf7f6874--6drmn-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--0494d1f24d-k8s-calico--apiserver--ccf7f6874--6drmn-eth0", GenerateName:"calico-apiserver-ccf7f6874-", Namespace:"calico-system", SelfLink:"", UID:"6dca57dd-795d-4761-9e86-5a637e03b75c", ResourceVersion:"913", Generation:0, CreationTimestamp:time.Date(2026, time.April, 24, 23, 35, 38, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"ccf7f6874", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-0494d1f24d", ContainerID:"eac367f59f2b92f92f741540bd5453ffad23e2c97f71da941b239662f1b93292", Pod:"calico-apiserver-ccf7f6874-6drmn", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.74.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali104746d9e98", MAC:"be:2c:fb:66:6c:ee", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 24 23:35:57.532371 containerd[1462]: 2026-04-24 23:35:57.521 [INFO][3985] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="eac367f59f2b92f92f741540bd5453ffad23e2c97f71da941b239662f1b93292" Namespace="calico-system" Pod="calico-apiserver-ccf7f6874-6drmn" WorkloadEndpoint="ci--4081--3--6--n--0494d1f24d-k8s-calico--apiserver--ccf7f6874--6drmn-eth0" Apr 24 23:35:57.530368 systemd-networkd[1386]: cali3696774a504: Gained IPv6LL Apr 24 23:35:57.573268 containerd[1462]: time="2026-04-24T23:35:57.572725009Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Apr 24 23:35:57.573948 containerd[1462]: time="2026-04-24T23:35:57.573561539Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Apr 24 23:35:57.573948 containerd[1462]: time="2026-04-24T23:35:57.573669155Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 24 23:35:57.575376 containerd[1462]: time="2026-04-24T23:35:57.574994921Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 24 23:35:57.580448 systemd[1]: Created slice kubepods-besteffort-pod910c4822_4e95_49a7_b2a7_624250e16bad.slice - libcontainer container kubepods-besteffort-pod910c4822_4e95_49a7_b2a7_624250e16bad.slice. Apr 24 23:35:57.635549 systemd[1]: Started cri-containerd-eac367f59f2b92f92f741540bd5453ffad23e2c97f71da941b239662f1b93292.scope - libcontainer container eac367f59f2b92f92f741540bd5453ffad23e2c97f71da941b239662f1b93292. 
Apr 24 23:35:57.692947 containerd[1462]: time="2026-04-24T23:35:57.691427863Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-ccf7f6874-fxzbx,Uid:97bad9ab-01a6-4cd6-89bf-ca7dc9310125,Namespace:calico-system,Attempt:1,} returns sandbox id \"dc00f0a26827cfe71c3f0ffffe67d49ca3cc816a023700ee6573bc9e96ca342a\"" Apr 24 23:35:57.725841 kubelet[2581]: I0424 23:35:57.725798 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhjjl\" (UniqueName: \"kubernetes.io/projected/910c4822-4e95-49a7-b2a7-624250e16bad-kube-api-access-fhjjl\") pod \"whisker-7bf446958b-2nzh8\" (UID: \"910c4822-4e95-49a7-b2a7-624250e16bad\") " pod="calico-system/whisker-7bf446958b-2nzh8" Apr 24 23:35:57.725841 kubelet[2581]: I0424 23:35:57.725845 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/910c4822-4e95-49a7-b2a7-624250e16bad-whisker-backend-key-pair\") pod \"whisker-7bf446958b-2nzh8\" (UID: \"910c4822-4e95-49a7-b2a7-624250e16bad\") " pod="calico-system/whisker-7bf446958b-2nzh8" Apr 24 23:35:57.726069 kubelet[2581]: I0424 23:35:57.725868 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/910c4822-4e95-49a7-b2a7-624250e16bad-nginx-config\") pod \"whisker-7bf446958b-2nzh8\" (UID: \"910c4822-4e95-49a7-b2a7-624250e16bad\") " pod="calico-system/whisker-7bf446958b-2nzh8" Apr 24 23:35:57.726069 kubelet[2581]: I0424 23:35:57.725893 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/910c4822-4e95-49a7-b2a7-624250e16bad-whisker-ca-bundle\") pod \"whisker-7bf446958b-2nzh8\" (UID: \"910c4822-4e95-49a7-b2a7-624250e16bad\") " pod="calico-system/whisker-7bf446958b-2nzh8" Apr 24 23:35:57.745610 
systemd-networkd[1386]: calid82795e9477: Link UP Apr 24 23:35:57.747588 systemd-networkd[1386]: calid82795e9477: Gained carrier Apr 24 23:35:57.807286 containerd[1462]: 2026-04-24 23:35:56.957 [ERROR][3956] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Apr 24 23:35:57.807286 containerd[1462]: 2026-04-24 23:35:56.987 [INFO][3956] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081--3--6--n--0494d1f24d-k8s-coredns--7d764666f9--l2x8h-eth0 coredns-7d764666f9- kube-system 7e35c05e-9f62-47d8-9c32-7f4c38496c65 909 0 2026-04-24 23:35:25 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7d764666f9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4081-3-6-n-0494d1f24d coredns-7d764666f9-l2x8h eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calid82795e9477 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 } {liveness-probe TCP 8080 0 } {readiness-probe TCP 8181 0 }] [] }} ContainerID="10e07a6b8d97ae98a8ef90a06e08705fb6c23025017443564e35296c6fe378f5" Namespace="kube-system" Pod="coredns-7d764666f9-l2x8h" WorkloadEndpoint="ci--4081--3--6--n--0494d1f24d-k8s-coredns--7d764666f9--l2x8h-" Apr 24 23:35:57.807286 containerd[1462]: 2026-04-24 23:35:56.989 [INFO][3956] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="10e07a6b8d97ae98a8ef90a06e08705fb6c23025017443564e35296c6fe378f5" Namespace="kube-system" Pod="coredns-7d764666f9-l2x8h" WorkloadEndpoint="ci--4081--3--6--n--0494d1f24d-k8s-coredns--7d764666f9--l2x8h-eth0" Apr 24 23:35:57.807286 containerd[1462]: 2026-04-24 23:35:57.160 [INFO][4017] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="10e07a6b8d97ae98a8ef90a06e08705fb6c23025017443564e35296c6fe378f5" 
HandleID="k8s-pod-network.10e07a6b8d97ae98a8ef90a06e08705fb6c23025017443564e35296c6fe378f5" Workload="ci--4081--3--6--n--0494d1f24d-k8s-coredns--7d764666f9--l2x8h-eth0" Apr 24 23:35:57.807286 containerd[1462]: 2026-04-24 23:35:57.210 [INFO][4017] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="10e07a6b8d97ae98a8ef90a06e08705fb6c23025017443564e35296c6fe378f5" HandleID="k8s-pod-network.10e07a6b8d97ae98a8ef90a06e08705fb6c23025017443564e35296c6fe378f5" Workload="ci--4081--3--6--n--0494d1f24d-k8s-coredns--7d764666f9--l2x8h-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40003d1470), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4081-3-6-n-0494d1f24d", "pod":"coredns-7d764666f9-l2x8h", "timestamp":"2026-04-24 23:35:57.160011723 +0000 UTC"}, Hostname:"ci-4081-3-6-n-0494d1f24d", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x40000d6420)} Apr 24 23:35:57.807286 containerd[1462]: 2026-04-24 23:35:57.213 [INFO][4017] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 24 23:35:57.807286 containerd[1462]: 2026-04-24 23:35:57.411 [INFO][4017] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Apr 24 23:35:57.807286 containerd[1462]: 2026-04-24 23:35:57.413 [INFO][4017] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081-3-6-n-0494d1f24d' Apr 24 23:35:57.807286 containerd[1462]: 2026-04-24 23:35:57.439 [INFO][4017] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.10e07a6b8d97ae98a8ef90a06e08705fb6c23025017443564e35296c6fe378f5" host="ci-4081-3-6-n-0494d1f24d" Apr 24 23:35:57.807286 containerd[1462]: 2026-04-24 23:35:57.521 [INFO][4017] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4081-3-6-n-0494d1f24d" Apr 24 23:35:57.807286 containerd[1462]: 2026-04-24 23:35:57.606 [INFO][4017] ipam/ipam.go 526: Trying affinity for 192.168.74.64/26 host="ci-4081-3-6-n-0494d1f24d" Apr 24 23:35:57.807286 containerd[1462]: 2026-04-24 23:35:57.654 [INFO][4017] ipam/ipam.go 160: Attempting to load block cidr=192.168.74.64/26 host="ci-4081-3-6-n-0494d1f24d" Apr 24 23:35:57.807286 containerd[1462]: 2026-04-24 23:35:57.662 [INFO][4017] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.74.64/26 host="ci-4081-3-6-n-0494d1f24d" Apr 24 23:35:57.807286 containerd[1462]: 2026-04-24 23:35:57.662 [INFO][4017] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.74.64/26 handle="k8s-pod-network.10e07a6b8d97ae98a8ef90a06e08705fb6c23025017443564e35296c6fe378f5" host="ci-4081-3-6-n-0494d1f24d" Apr 24 23:35:57.807286 containerd[1462]: 2026-04-24 23:35:57.669 [INFO][4017] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.10e07a6b8d97ae98a8ef90a06e08705fb6c23025017443564e35296c6fe378f5 Apr 24 23:35:57.807286 containerd[1462]: 2026-04-24 23:35:57.697 [INFO][4017] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.74.64/26 handle="k8s-pod-network.10e07a6b8d97ae98a8ef90a06e08705fb6c23025017443564e35296c6fe378f5" host="ci-4081-3-6-n-0494d1f24d" Apr 24 23:35:57.807286 containerd[1462]: 2026-04-24 23:35:57.729 [INFO][4017] ipam/ipam.go 1288: 
Successfully claimed IPs: [192.168.74.68/26] block=192.168.74.64/26 handle="k8s-pod-network.10e07a6b8d97ae98a8ef90a06e08705fb6c23025017443564e35296c6fe378f5" host="ci-4081-3-6-n-0494d1f24d" Apr 24 23:35:57.807286 containerd[1462]: 2026-04-24 23:35:57.729 [INFO][4017] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.74.68/26] handle="k8s-pod-network.10e07a6b8d97ae98a8ef90a06e08705fb6c23025017443564e35296c6fe378f5" host="ci-4081-3-6-n-0494d1f24d" Apr 24 23:35:57.807286 containerd[1462]: 2026-04-24 23:35:57.729 [INFO][4017] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 24 23:35:57.807286 containerd[1462]: 2026-04-24 23:35:57.729 [INFO][4017] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.74.68/26] IPv6=[] ContainerID="10e07a6b8d97ae98a8ef90a06e08705fb6c23025017443564e35296c6fe378f5" HandleID="k8s-pod-network.10e07a6b8d97ae98a8ef90a06e08705fb6c23025017443564e35296c6fe378f5" Workload="ci--4081--3--6--n--0494d1f24d-k8s-coredns--7d764666f9--l2x8h-eth0" Apr 24 23:35:57.808401 containerd[1462]: 2026-04-24 23:35:57.733 [INFO][3956] cni-plugin/k8s.go 418: Populated endpoint ContainerID="10e07a6b8d97ae98a8ef90a06e08705fb6c23025017443564e35296c6fe378f5" Namespace="kube-system" Pod="coredns-7d764666f9-l2x8h" WorkloadEndpoint="ci--4081--3--6--n--0494d1f24d-k8s-coredns--7d764666f9--l2x8h-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--0494d1f24d-k8s-coredns--7d764666f9--l2x8h-eth0", GenerateName:"coredns-7d764666f9-", Namespace:"kube-system", SelfLink:"", UID:"7e35c05e-9f62-47d8-9c32-7f4c38496c65", ResourceVersion:"909", Generation:0, CreationTimestamp:time.Date(2026, time.April, 24, 23, 35, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7d764666f9", "projectcalico.org/namespace":"kube-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-0494d1f24d", ContainerID:"", Pod:"coredns-7d764666f9-l2x8h", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.74.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calid82795e9477", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 24 23:35:57.808401 containerd[1462]: 2026-04-24 23:35:57.733 [INFO][3956] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.74.68/32] ContainerID="10e07a6b8d97ae98a8ef90a06e08705fb6c23025017443564e35296c6fe378f5" Namespace="kube-system" Pod="coredns-7d764666f9-l2x8h" WorkloadEndpoint="ci--4081--3--6--n--0494d1f24d-k8s-coredns--7d764666f9--l2x8h-eth0" Apr 24 23:35:57.808401 containerd[1462]: 2026-04-24 23:35:57.733 [INFO][3956] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calid82795e9477 
ContainerID="10e07a6b8d97ae98a8ef90a06e08705fb6c23025017443564e35296c6fe378f5" Namespace="kube-system" Pod="coredns-7d764666f9-l2x8h" WorkloadEndpoint="ci--4081--3--6--n--0494d1f24d-k8s-coredns--7d764666f9--l2x8h-eth0" Apr 24 23:35:57.808401 containerd[1462]: 2026-04-24 23:35:57.749 [INFO][3956] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="10e07a6b8d97ae98a8ef90a06e08705fb6c23025017443564e35296c6fe378f5" Namespace="kube-system" Pod="coredns-7d764666f9-l2x8h" WorkloadEndpoint="ci--4081--3--6--n--0494d1f24d-k8s-coredns--7d764666f9--l2x8h-eth0" Apr 24 23:35:57.808401 containerd[1462]: 2026-04-24 23:35:57.751 [INFO][3956] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="10e07a6b8d97ae98a8ef90a06e08705fb6c23025017443564e35296c6fe378f5" Namespace="kube-system" Pod="coredns-7d764666f9-l2x8h" WorkloadEndpoint="ci--4081--3--6--n--0494d1f24d-k8s-coredns--7d764666f9--l2x8h-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--0494d1f24d-k8s-coredns--7d764666f9--l2x8h-eth0", GenerateName:"coredns-7d764666f9-", Namespace:"kube-system", SelfLink:"", UID:"7e35c05e-9f62-47d8-9c32-7f4c38496c65", ResourceVersion:"909", Generation:0, CreationTimestamp:time.Date(2026, time.April, 24, 23, 35, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7d764666f9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-0494d1f24d", 
ContainerID:"10e07a6b8d97ae98a8ef90a06e08705fb6c23025017443564e35296c6fe378f5", Pod:"coredns-7d764666f9-l2x8h", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.74.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calid82795e9477", MAC:"5e:62:0b:52:eb:cd", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 24 23:35:57.808595 containerd[1462]: 2026-04-24 23:35:57.799 [INFO][3956] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="10e07a6b8d97ae98a8ef90a06e08705fb6c23025017443564e35296c6fe378f5" Namespace="kube-system" Pod="coredns-7d764666f9-l2x8h" WorkloadEndpoint="ci--4081--3--6--n--0494d1f24d-k8s-coredns--7d764666f9--l2x8h-eth0" Apr 24 23:35:57.840142 containerd[1462]: time="2026-04-24T23:35:57.840062172Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-ccf7f6874-6drmn,Uid:6dca57dd-795d-4761-9e86-5a637e03b75c,Namespace:calico-system,Attempt:1,} returns sandbox id \"eac367f59f2b92f92f741540bd5453ffad23e2c97f71da941b239662f1b93292\"" Apr 24 23:35:57.872356 containerd[1462]: time="2026-04-24T23:35:57.872197728Z" 
level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Apr 24 23:35:57.872356 containerd[1462]: time="2026-04-24T23:35:57.872291143Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Apr 24 23:35:57.872356 containerd[1462]: time="2026-04-24T23:35:57.872306745Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 24 23:35:57.872613 containerd[1462]: time="2026-04-24T23:35:57.872391398Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 24 23:35:57.892852 containerd[1462]: time="2026-04-24T23:35:57.892451837Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7bf446958b-2nzh8,Uid:910c4822-4e95-49a7-b2a7-624250e16bad,Namespace:calico-system,Attempt:0,}" Apr 24 23:35:57.904967 systemd-networkd[1386]: cali8d0e349e5bd: Link UP Apr 24 23:35:57.908369 systemd-networkd[1386]: cali8d0e349e5bd: Gained carrier Apr 24 23:35:57.961753 systemd[1]: Started cri-containerd-10e07a6b8d97ae98a8ef90a06e08705fb6c23025017443564e35296c6fe378f5.scope - libcontainer container 10e07a6b8d97ae98a8ef90a06e08705fb6c23025017443564e35296c6fe378f5. 
Apr 24 23:35:57.970067 containerd[1462]: 2026-04-24 23:35:57.164 [ERROR][4003] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Apr 24 23:35:57.970067 containerd[1462]: 2026-04-24 23:35:57.211 [INFO][4003] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081--3--6--n--0494d1f24d-k8s-goldmane--9f7667bb8--9xhf6-eth0 goldmane-9f7667bb8- calico-system a67a5a23-28c6-44c6-8e44-b199e7391183 915 0 2026-04-24 23:35:38 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:9f7667bb8 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4081-3-6-n-0494d1f24d goldmane-9f7667bb8-9xhf6 eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali8d0e349e5bd [] [] }} ContainerID="9936c272c3f0e2fad8c377b8268b236c67137f997d8c8fdaddfc948873f50a18" Namespace="calico-system" Pod="goldmane-9f7667bb8-9xhf6" WorkloadEndpoint="ci--4081--3--6--n--0494d1f24d-k8s-goldmane--9f7667bb8--9xhf6-" Apr 24 23:35:57.970067 containerd[1462]: 2026-04-24 23:35:57.214 [INFO][4003] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="9936c272c3f0e2fad8c377b8268b236c67137f997d8c8fdaddfc948873f50a18" Namespace="calico-system" Pod="goldmane-9f7667bb8-9xhf6" WorkloadEndpoint="ci--4081--3--6--n--0494d1f24d-k8s-goldmane--9f7667bb8--9xhf6-eth0" Apr 24 23:35:57.970067 containerd[1462]: 2026-04-24 23:35:57.348 [INFO][4106] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="9936c272c3f0e2fad8c377b8268b236c67137f997d8c8fdaddfc948873f50a18" HandleID="k8s-pod-network.9936c272c3f0e2fad8c377b8268b236c67137f997d8c8fdaddfc948873f50a18" Workload="ci--4081--3--6--n--0494d1f24d-k8s-goldmane--9f7667bb8--9xhf6-eth0" Apr 24 23:35:57.970067 containerd[1462]: 
2026-04-24 23:35:57.389 [INFO][4106] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="9936c272c3f0e2fad8c377b8268b236c67137f997d8c8fdaddfc948873f50a18" HandleID="k8s-pod-network.9936c272c3f0e2fad8c377b8268b236c67137f997d8c8fdaddfc948873f50a18" Workload="ci--4081--3--6--n--0494d1f24d-k8s-goldmane--9f7667bb8--9xhf6-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400061e120), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4081-3-6-n-0494d1f24d", "pod":"goldmane-9f7667bb8-9xhf6", "timestamp":"2026-04-24 23:35:57.348826679 +0000 UTC"}, Hostname:"ci-4081-3-6-n-0494d1f24d", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x400039adc0)} Apr 24 23:35:57.970067 containerd[1462]: 2026-04-24 23:35:57.389 [INFO][4106] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 24 23:35:57.970067 containerd[1462]: 2026-04-24 23:35:57.729 [INFO][4106] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Apr 24 23:35:57.970067 containerd[1462]: 2026-04-24 23:35:57.729 [INFO][4106] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081-3-6-n-0494d1f24d' Apr 24 23:35:57.970067 containerd[1462]: 2026-04-24 23:35:57.741 [INFO][4106] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.9936c272c3f0e2fad8c377b8268b236c67137f997d8c8fdaddfc948873f50a18" host="ci-4081-3-6-n-0494d1f24d" Apr 24 23:35:57.970067 containerd[1462]: 2026-04-24 23:35:57.772 [INFO][4106] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4081-3-6-n-0494d1f24d" Apr 24 23:35:57.970067 containerd[1462]: 2026-04-24 23:35:57.788 [INFO][4106] ipam/ipam.go 526: Trying affinity for 192.168.74.64/26 host="ci-4081-3-6-n-0494d1f24d" Apr 24 23:35:57.970067 containerd[1462]: 2026-04-24 23:35:57.793 [INFO][4106] ipam/ipam.go 160: Attempting to load block cidr=192.168.74.64/26 host="ci-4081-3-6-n-0494d1f24d" Apr 24 23:35:57.970067 containerd[1462]: 2026-04-24 23:35:57.813 [INFO][4106] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.74.64/26 host="ci-4081-3-6-n-0494d1f24d" Apr 24 23:35:57.970067 containerd[1462]: 2026-04-24 23:35:57.813 [INFO][4106] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.74.64/26 handle="k8s-pod-network.9936c272c3f0e2fad8c377b8268b236c67137f997d8c8fdaddfc948873f50a18" host="ci-4081-3-6-n-0494d1f24d" Apr 24 23:35:57.970067 containerd[1462]: 2026-04-24 23:35:57.818 [INFO][4106] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.9936c272c3f0e2fad8c377b8268b236c67137f997d8c8fdaddfc948873f50a18 Apr 24 23:35:57.970067 containerd[1462]: 2026-04-24 23:35:57.839 [INFO][4106] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.74.64/26 handle="k8s-pod-network.9936c272c3f0e2fad8c377b8268b236c67137f997d8c8fdaddfc948873f50a18" host="ci-4081-3-6-n-0494d1f24d" Apr 24 23:35:57.970067 containerd[1462]: 2026-04-24 23:35:57.887 [INFO][4106] ipam/ipam.go 1288: 
Successfully claimed IPs: [192.168.74.69/26] block=192.168.74.64/26 handle="k8s-pod-network.9936c272c3f0e2fad8c377b8268b236c67137f997d8c8fdaddfc948873f50a18" host="ci-4081-3-6-n-0494d1f24d" Apr 24 23:35:57.970067 containerd[1462]: 2026-04-24 23:35:57.887 [INFO][4106] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.74.69/26] handle="k8s-pod-network.9936c272c3f0e2fad8c377b8268b236c67137f997d8c8fdaddfc948873f50a18" host="ci-4081-3-6-n-0494d1f24d" Apr 24 23:35:57.970067 containerd[1462]: 2026-04-24 23:35:57.888 [INFO][4106] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 24 23:35:57.970067 containerd[1462]: 2026-04-24 23:35:57.890 [INFO][4106] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.74.69/26] IPv6=[] ContainerID="9936c272c3f0e2fad8c377b8268b236c67137f997d8c8fdaddfc948873f50a18" HandleID="k8s-pod-network.9936c272c3f0e2fad8c377b8268b236c67137f997d8c8fdaddfc948873f50a18" Workload="ci--4081--3--6--n--0494d1f24d-k8s-goldmane--9f7667bb8--9xhf6-eth0" Apr 24 23:35:57.970669 containerd[1462]: 2026-04-24 23:35:57.898 [INFO][4003] cni-plugin/k8s.go 418: Populated endpoint ContainerID="9936c272c3f0e2fad8c377b8268b236c67137f997d8c8fdaddfc948873f50a18" Namespace="calico-system" Pod="goldmane-9f7667bb8-9xhf6" WorkloadEndpoint="ci--4081--3--6--n--0494d1f24d-k8s-goldmane--9f7667bb8--9xhf6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--0494d1f24d-k8s-goldmane--9f7667bb8--9xhf6-eth0", GenerateName:"goldmane-9f7667bb8-", Namespace:"calico-system", SelfLink:"", UID:"a67a5a23-28c6-44c6-8e44-b199e7391183", ResourceVersion:"915", Generation:0, CreationTimestamp:time.Date(2026, time.April, 24, 23, 35, 38, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"9f7667bb8", 
"projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-0494d1f24d", ContainerID:"", Pod:"goldmane-9f7667bb8-9xhf6", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.74.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali8d0e349e5bd", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 24 23:35:57.970669 containerd[1462]: 2026-04-24 23:35:57.898 [INFO][4003] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.74.69/32] ContainerID="9936c272c3f0e2fad8c377b8268b236c67137f997d8c8fdaddfc948873f50a18" Namespace="calico-system" Pod="goldmane-9f7667bb8-9xhf6" WorkloadEndpoint="ci--4081--3--6--n--0494d1f24d-k8s-goldmane--9f7667bb8--9xhf6-eth0" Apr 24 23:35:57.970669 containerd[1462]: 2026-04-24 23:35:57.898 [INFO][4003] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali8d0e349e5bd ContainerID="9936c272c3f0e2fad8c377b8268b236c67137f997d8c8fdaddfc948873f50a18" Namespace="calico-system" Pod="goldmane-9f7667bb8-9xhf6" WorkloadEndpoint="ci--4081--3--6--n--0494d1f24d-k8s-goldmane--9f7667bb8--9xhf6-eth0" Apr 24 23:35:57.970669 containerd[1462]: 2026-04-24 23:35:57.930 [INFO][4003] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="9936c272c3f0e2fad8c377b8268b236c67137f997d8c8fdaddfc948873f50a18" Namespace="calico-system" Pod="goldmane-9f7667bb8-9xhf6" WorkloadEndpoint="ci--4081--3--6--n--0494d1f24d-k8s-goldmane--9f7667bb8--9xhf6-eth0" Apr 24 23:35:57.970669 containerd[1462]: 2026-04-24 23:35:57.934 [INFO][4003] 
cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="9936c272c3f0e2fad8c377b8268b236c67137f997d8c8fdaddfc948873f50a18" Namespace="calico-system" Pod="goldmane-9f7667bb8-9xhf6" WorkloadEndpoint="ci--4081--3--6--n--0494d1f24d-k8s-goldmane--9f7667bb8--9xhf6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--0494d1f24d-k8s-goldmane--9f7667bb8--9xhf6-eth0", GenerateName:"goldmane-9f7667bb8-", Namespace:"calico-system", SelfLink:"", UID:"a67a5a23-28c6-44c6-8e44-b199e7391183", ResourceVersion:"915", Generation:0, CreationTimestamp:time.Date(2026, time.April, 24, 23, 35, 38, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"9f7667bb8", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-0494d1f24d", ContainerID:"9936c272c3f0e2fad8c377b8268b236c67137f997d8c8fdaddfc948873f50a18", Pod:"goldmane-9f7667bb8-9xhf6", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.74.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali8d0e349e5bd", MAC:"46:db:b8:86:cd:d4", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 24 23:35:57.970669 containerd[1462]: 2026-04-24 23:35:57.968 [INFO][4003] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="9936c272c3f0e2fad8c377b8268b236c67137f997d8c8fdaddfc948873f50a18" Namespace="calico-system" Pod="goldmane-9f7667bb8-9xhf6" WorkloadEndpoint="ci--4081--3--6--n--0494d1f24d-k8s-goldmane--9f7667bb8--9xhf6-eth0" Apr 24 23:35:58.042506 containerd[1462]: time="2026-04-24T23:35:58.042163466Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Apr 24 23:35:58.042506 containerd[1462]: time="2026-04-24T23:35:58.042255120Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Apr 24 23:35:58.042506 containerd[1462]: time="2026-04-24T23:35:58.042268922Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 24 23:35:58.042506 containerd[1462]: time="2026-04-24T23:35:58.042449749Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 24 23:35:58.094404 systemd[1]: Started cri-containerd-9936c272c3f0e2fad8c377b8268b236c67137f997d8c8fdaddfc948873f50a18.scope - libcontainer container 9936c272c3f0e2fad8c377b8268b236c67137f997d8c8fdaddfc948873f50a18. 
Apr 24 23:35:58.100034 systemd-networkd[1386]: cali3bb547fd16f: Link UP Apr 24 23:35:58.102546 systemd-networkd[1386]: cali3bb547fd16f: Gained carrier Apr 24 23:35:58.118084 kubelet[2581]: I0424 23:35:58.118037 2581 kubelet_volumes.go:161] "Cleaned up orphaned pod volumes dir" podUID="318078f6-c2fc-4219-8c64-82b649d6416b" path="/var/lib/kubelet/pods/318078f6-c2fc-4219-8c64-82b649d6416b/volumes" Apr 24 23:35:58.168544 containerd[1462]: time="2026-04-24T23:35:58.168503230Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7d764666f9-l2x8h,Uid:7e35c05e-9f62-47d8-9c32-7f4c38496c65,Namespace:kube-system,Attempt:1,} returns sandbox id \"10e07a6b8d97ae98a8ef90a06e08705fb6c23025017443564e35296c6fe378f5\"" Apr 24 23:35:58.170187 containerd[1462]: 2026-04-24 23:35:57.145 [ERROR][4016] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Apr 24 23:35:58.170187 containerd[1462]: 2026-04-24 23:35:57.210 [INFO][4016] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081--3--6--n--0494d1f24d-k8s-coredns--7d764666f9--lnj9t-eth0 coredns-7d764666f9- kube-system 54520bb9-f644-497d-8cfe-b9f4367c3def 914 0 2026-04-24 23:35:25 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7d764666f9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4081-3-6-n-0494d1f24d coredns-7d764666f9-lnj9t eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali3bb547fd16f [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 } {liveness-probe TCP 8080 0 } {readiness-probe TCP 8181 0 }] [] }} ContainerID="4b9b595289a74ac5990c9aa1faf29292f26dad2500dad23cf82351cf00d92041" Namespace="kube-system" Pod="coredns-7d764666f9-lnj9t" WorkloadEndpoint="ci--4081--3--6--n--0494d1f24d-k8s-coredns--7d764666f9--lnj9t-" 
Apr 24 23:35:58.170187 containerd[1462]: 2026-04-24 23:35:57.210 [INFO][4016] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="4b9b595289a74ac5990c9aa1faf29292f26dad2500dad23cf82351cf00d92041" Namespace="kube-system" Pod="coredns-7d764666f9-lnj9t" WorkloadEndpoint="ci--4081--3--6--n--0494d1f24d-k8s-coredns--7d764666f9--lnj9t-eth0" Apr 24 23:35:58.170187 containerd[1462]: 2026-04-24 23:35:57.408 [INFO][4104] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="4b9b595289a74ac5990c9aa1faf29292f26dad2500dad23cf82351cf00d92041" HandleID="k8s-pod-network.4b9b595289a74ac5990c9aa1faf29292f26dad2500dad23cf82351cf00d92041" Workload="ci--4081--3--6--n--0494d1f24d-k8s-coredns--7d764666f9--lnj9t-eth0" Apr 24 23:35:58.170187 containerd[1462]: 2026-04-24 23:35:57.509 [INFO][4104] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="4b9b595289a74ac5990c9aa1faf29292f26dad2500dad23cf82351cf00d92041" HandleID="k8s-pod-network.4b9b595289a74ac5990c9aa1faf29292f26dad2500dad23cf82351cf00d92041" Workload="ci--4081--3--6--n--0494d1f24d-k8s-coredns--7d764666f9--lnj9t-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000369bd0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4081-3-6-n-0494d1f24d", "pod":"coredns-7d764666f9-lnj9t", "timestamp":"2026-04-24 23:35:57.408805284 +0000 UTC"}, Hostname:"ci-4081-3-6-n-0494d1f24d", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x4000413ce0)} Apr 24 23:35:58.170187 containerd[1462]: 2026-04-24 23:35:57.509 [INFO][4104] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 24 23:35:58.170187 containerd[1462]: 2026-04-24 23:35:57.888 [INFO][4104] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Apr 24 23:35:58.170187 containerd[1462]: 2026-04-24 23:35:57.888 [INFO][4104] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081-3-6-n-0494d1f24d' Apr 24 23:35:58.170187 containerd[1462]: 2026-04-24 23:35:57.906 [INFO][4104] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.4b9b595289a74ac5990c9aa1faf29292f26dad2500dad23cf82351cf00d92041" host="ci-4081-3-6-n-0494d1f24d" Apr 24 23:35:58.170187 containerd[1462]: 2026-04-24 23:35:57.945 [INFO][4104] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4081-3-6-n-0494d1f24d" Apr 24 23:35:58.170187 containerd[1462]: 2026-04-24 23:35:57.973 [INFO][4104] ipam/ipam.go 526: Trying affinity for 192.168.74.64/26 host="ci-4081-3-6-n-0494d1f24d" Apr 24 23:35:58.170187 containerd[1462]: 2026-04-24 23:35:57.978 [INFO][4104] ipam/ipam.go 160: Attempting to load block cidr=192.168.74.64/26 host="ci-4081-3-6-n-0494d1f24d" Apr 24 23:35:58.170187 containerd[1462]: 2026-04-24 23:35:57.989 [INFO][4104] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.74.64/26 host="ci-4081-3-6-n-0494d1f24d" Apr 24 23:35:58.170187 containerd[1462]: 2026-04-24 23:35:57.990 [INFO][4104] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.74.64/26 handle="k8s-pod-network.4b9b595289a74ac5990c9aa1faf29292f26dad2500dad23cf82351cf00d92041" host="ci-4081-3-6-n-0494d1f24d" Apr 24 23:35:58.170187 containerd[1462]: 2026-04-24 23:35:58.002 [INFO][4104] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.4b9b595289a74ac5990c9aa1faf29292f26dad2500dad23cf82351cf00d92041 Apr 24 23:35:58.170187 containerd[1462]: 2026-04-24 23:35:58.019 [INFO][4104] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.74.64/26 handle="k8s-pod-network.4b9b595289a74ac5990c9aa1faf29292f26dad2500dad23cf82351cf00d92041" host="ci-4081-3-6-n-0494d1f24d" Apr 24 23:35:58.170187 containerd[1462]: 2026-04-24 23:35:58.057 [INFO][4104] ipam/ipam.go 1288: 
Successfully claimed IPs: [192.168.74.70/26] block=192.168.74.64/26 handle="k8s-pod-network.4b9b595289a74ac5990c9aa1faf29292f26dad2500dad23cf82351cf00d92041" host="ci-4081-3-6-n-0494d1f24d" Apr 24 23:35:58.170187 containerd[1462]: 2026-04-24 23:35:58.057 [INFO][4104] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.74.70/26] handle="k8s-pod-network.4b9b595289a74ac5990c9aa1faf29292f26dad2500dad23cf82351cf00d92041" host="ci-4081-3-6-n-0494d1f24d" Apr 24 23:35:58.170187 containerd[1462]: 2026-04-24 23:35:58.057 [INFO][4104] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 24 23:35:58.170187 containerd[1462]: 2026-04-24 23:35:58.057 [INFO][4104] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.74.70/26] IPv6=[] ContainerID="4b9b595289a74ac5990c9aa1faf29292f26dad2500dad23cf82351cf00d92041" HandleID="k8s-pod-network.4b9b595289a74ac5990c9aa1faf29292f26dad2500dad23cf82351cf00d92041" Workload="ci--4081--3--6--n--0494d1f24d-k8s-coredns--7d764666f9--lnj9t-eth0" Apr 24 23:35:58.171978 containerd[1462]: 2026-04-24 23:35:58.089 [INFO][4016] cni-plugin/k8s.go 418: Populated endpoint ContainerID="4b9b595289a74ac5990c9aa1faf29292f26dad2500dad23cf82351cf00d92041" Namespace="kube-system" Pod="coredns-7d764666f9-lnj9t" WorkloadEndpoint="ci--4081--3--6--n--0494d1f24d-k8s-coredns--7d764666f9--lnj9t-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--0494d1f24d-k8s-coredns--7d764666f9--lnj9t-eth0", GenerateName:"coredns-7d764666f9-", Namespace:"kube-system", SelfLink:"", UID:"54520bb9-f644-497d-8cfe-b9f4367c3def", ResourceVersion:"914", Generation:0, CreationTimestamp:time.Date(2026, time.April, 24, 23, 35, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7d764666f9", "projectcalico.org/namespace":"kube-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-0494d1f24d", ContainerID:"", Pod:"coredns-7d764666f9-lnj9t", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.74.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali3bb547fd16f", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 24 23:35:58.171978 containerd[1462]: 2026-04-24 23:35:58.089 [INFO][4016] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.74.70/32] ContainerID="4b9b595289a74ac5990c9aa1faf29292f26dad2500dad23cf82351cf00d92041" Namespace="kube-system" Pod="coredns-7d764666f9-lnj9t" WorkloadEndpoint="ci--4081--3--6--n--0494d1f24d-k8s-coredns--7d764666f9--lnj9t-eth0" Apr 24 23:35:58.171978 containerd[1462]: 2026-04-24 23:35:58.089 [INFO][4016] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali3bb547fd16f 
ContainerID="4b9b595289a74ac5990c9aa1faf29292f26dad2500dad23cf82351cf00d92041" Namespace="kube-system" Pod="coredns-7d764666f9-lnj9t" WorkloadEndpoint="ci--4081--3--6--n--0494d1f24d-k8s-coredns--7d764666f9--lnj9t-eth0" Apr 24 23:35:58.171978 containerd[1462]: 2026-04-24 23:35:58.127 [INFO][4016] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="4b9b595289a74ac5990c9aa1faf29292f26dad2500dad23cf82351cf00d92041" Namespace="kube-system" Pod="coredns-7d764666f9-lnj9t" WorkloadEndpoint="ci--4081--3--6--n--0494d1f24d-k8s-coredns--7d764666f9--lnj9t-eth0" Apr 24 23:35:58.171978 containerd[1462]: 2026-04-24 23:35:58.135 [INFO][4016] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="4b9b595289a74ac5990c9aa1faf29292f26dad2500dad23cf82351cf00d92041" Namespace="kube-system" Pod="coredns-7d764666f9-lnj9t" WorkloadEndpoint="ci--4081--3--6--n--0494d1f24d-k8s-coredns--7d764666f9--lnj9t-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--0494d1f24d-k8s-coredns--7d764666f9--lnj9t-eth0", GenerateName:"coredns-7d764666f9-", Namespace:"kube-system", SelfLink:"", UID:"54520bb9-f644-497d-8cfe-b9f4367c3def", ResourceVersion:"914", Generation:0, CreationTimestamp:time.Date(2026, time.April, 24, 23, 35, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7d764666f9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-0494d1f24d", 
ContainerID:"4b9b595289a74ac5990c9aa1faf29292f26dad2500dad23cf82351cf00d92041", Pod:"coredns-7d764666f9-lnj9t", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.74.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali3bb547fd16f", MAC:"fe:6f:c4:bb:79:8e", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 24 23:35:58.172202 containerd[1462]: 2026-04-24 23:35:58.158 [INFO][4016] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="4b9b595289a74ac5990c9aa1faf29292f26dad2500dad23cf82351cf00d92041" Namespace="kube-system" Pod="coredns-7d764666f9-lnj9t" WorkloadEndpoint="ci--4081--3--6--n--0494d1f24d-k8s-coredns--7d764666f9--lnj9t-eth0" Apr 24 23:35:58.181454 containerd[1462]: time="2026-04-24T23:35:58.180997150Z" level=info msg="CreateContainer within sandbox \"10e07a6b8d97ae98a8ef90a06e08705fb6c23025017443564e35296c6fe378f5\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Apr 24 23:35:58.209371 systemd-networkd[1386]: calif2c4b987dac: Link UP Apr 24 23:35:58.217318 systemd-networkd[1386]: calif2c4b987dac: Gained carrier Apr 24 
23:35:58.245200 containerd[1462]: time="2026-04-24T23:35:58.244679089Z" level=info msg="CreateContainer within sandbox \"10e07a6b8d97ae98a8ef90a06e08705fb6c23025017443564e35296c6fe378f5\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"0b3587c42780ecfcb4f8f8ded0fc01ae870319c8434cf268607e54066eed50c2\"" Apr 24 23:35:58.246055 containerd[1462]: time="2026-04-24T23:35:58.245771573Z" level=info msg="StartContainer for \"0b3587c42780ecfcb4f8f8ded0fc01ae870319c8434cf268607e54066eed50c2\"" Apr 24 23:35:58.322360 systemd[1]: Started cri-containerd-0b3587c42780ecfcb4f8f8ded0fc01ae870319c8434cf268607e54066eed50c2.scope - libcontainer container 0b3587c42780ecfcb4f8f8ded0fc01ae870319c8434cf268607e54066eed50c2. Apr 24 23:35:58.336106 containerd[1462]: 2026-04-24 23:35:57.137 [ERROR][4038] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Apr 24 23:35:58.336106 containerd[1462]: 2026-04-24 23:35:57.206 [INFO][4038] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081--3--6--n--0494d1f24d-k8s-calico--kube--controllers--54d98b476c--cxbz2-eth0 calico-kube-controllers-54d98b476c- calico-system ad82956a-a9d5-482d-9feb-ead05df18123 912 0 2026-04-24 23:35:40 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:54d98b476c projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4081-3-6-n-0494d1f24d calico-kube-controllers-54d98b476c-cxbz2 eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] calif2c4b987dac [] [] }} ContainerID="11dc9d012984b00dedbdb8b8115c8a21435603cc32586e9bca4e7e6597ccc515" Namespace="calico-system" Pod="calico-kube-controllers-54d98b476c-cxbz2" 
WorkloadEndpoint="ci--4081--3--6--n--0494d1f24d-k8s-calico--kube--controllers--54d98b476c--cxbz2-" Apr 24 23:35:58.336106 containerd[1462]: 2026-04-24 23:35:57.206 [INFO][4038] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="11dc9d012984b00dedbdb8b8115c8a21435603cc32586e9bca4e7e6597ccc515" Namespace="calico-system" Pod="calico-kube-controllers-54d98b476c-cxbz2" WorkloadEndpoint="ci--4081--3--6--n--0494d1f24d-k8s-calico--kube--controllers--54d98b476c--cxbz2-eth0" Apr 24 23:35:58.336106 containerd[1462]: 2026-04-24 23:35:57.417 [INFO][4105] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="11dc9d012984b00dedbdb8b8115c8a21435603cc32586e9bca4e7e6597ccc515" HandleID="k8s-pod-network.11dc9d012984b00dedbdb8b8115c8a21435603cc32586e9bca4e7e6597ccc515" Workload="ci--4081--3--6--n--0494d1f24d-k8s-calico--kube--controllers--54d98b476c--cxbz2-eth0" Apr 24 23:35:58.336106 containerd[1462]: 2026-04-24 23:35:57.519 [INFO][4105] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="11dc9d012984b00dedbdb8b8115c8a21435603cc32586e9bca4e7e6597ccc515" HandleID="k8s-pod-network.11dc9d012984b00dedbdb8b8115c8a21435603cc32586e9bca4e7e6597ccc515" Workload="ci--4081--3--6--n--0494d1f24d-k8s-calico--kube--controllers--54d98b476c--cxbz2-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400048ea20), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4081-3-6-n-0494d1f24d", "pod":"calico-kube-controllers-54d98b476c-cxbz2", "timestamp":"2026-04-24 23:35:57.41778696 +0000 UTC"}, Hostname:"ci-4081-3-6-n-0494d1f24d", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x4000284580)} Apr 24 23:35:58.336106 containerd[1462]: 2026-04-24 23:35:57.519 [INFO][4105] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. 
Apr 24 23:35:58.336106 containerd[1462]: 2026-04-24 23:35:58.057 [INFO][4105] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 24 23:35:58.336106 containerd[1462]: 2026-04-24 23:35:58.058 [INFO][4105] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081-3-6-n-0494d1f24d' Apr 24 23:35:58.336106 containerd[1462]: 2026-04-24 23:35:58.071 [INFO][4105] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.11dc9d012984b00dedbdb8b8115c8a21435603cc32586e9bca4e7e6597ccc515" host="ci-4081-3-6-n-0494d1f24d" Apr 24 23:35:58.336106 containerd[1462]: 2026-04-24 23:35:58.121 [INFO][4105] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4081-3-6-n-0494d1f24d" Apr 24 23:35:58.336106 containerd[1462]: 2026-04-24 23:35:58.134 [INFO][4105] ipam/ipam.go 526: Trying affinity for 192.168.74.64/26 host="ci-4081-3-6-n-0494d1f24d" Apr 24 23:35:58.336106 containerd[1462]: 2026-04-24 23:35:58.140 [INFO][4105] ipam/ipam.go 160: Attempting to load block cidr=192.168.74.64/26 host="ci-4081-3-6-n-0494d1f24d" Apr 24 23:35:58.336106 containerd[1462]: 2026-04-24 23:35:58.145 [INFO][4105] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.74.64/26 host="ci-4081-3-6-n-0494d1f24d" Apr 24 23:35:58.336106 containerd[1462]: 2026-04-24 23:35:58.145 [INFO][4105] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.74.64/26 handle="k8s-pod-network.11dc9d012984b00dedbdb8b8115c8a21435603cc32586e9bca4e7e6597ccc515" host="ci-4081-3-6-n-0494d1f24d" Apr 24 23:35:58.336106 containerd[1462]: 2026-04-24 23:35:58.152 [INFO][4105] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.11dc9d012984b00dedbdb8b8115c8a21435603cc32586e9bca4e7e6597ccc515 Apr 24 23:35:58.336106 containerd[1462]: 2026-04-24 23:35:58.170 [INFO][4105] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.74.64/26 handle="k8s-pod-network.11dc9d012984b00dedbdb8b8115c8a21435603cc32586e9bca4e7e6597ccc515" 
host="ci-4081-3-6-n-0494d1f24d" Apr 24 23:35:58.336106 containerd[1462]: 2026-04-24 23:35:58.192 [INFO][4105] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.74.71/26] block=192.168.74.64/26 handle="k8s-pod-network.11dc9d012984b00dedbdb8b8115c8a21435603cc32586e9bca4e7e6597ccc515" host="ci-4081-3-6-n-0494d1f24d" Apr 24 23:35:58.336106 containerd[1462]: 2026-04-24 23:35:58.192 [INFO][4105] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.74.71/26] handle="k8s-pod-network.11dc9d012984b00dedbdb8b8115c8a21435603cc32586e9bca4e7e6597ccc515" host="ci-4081-3-6-n-0494d1f24d" Apr 24 23:35:58.336106 containerd[1462]: 2026-04-24 23:35:58.192 [INFO][4105] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 24 23:35:58.336106 containerd[1462]: 2026-04-24 23:35:58.192 [INFO][4105] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.74.71/26] IPv6=[] ContainerID="11dc9d012984b00dedbdb8b8115c8a21435603cc32586e9bca4e7e6597ccc515" HandleID="k8s-pod-network.11dc9d012984b00dedbdb8b8115c8a21435603cc32586e9bca4e7e6597ccc515" Workload="ci--4081--3--6--n--0494d1f24d-k8s-calico--kube--controllers--54d98b476c--cxbz2-eth0" Apr 24 23:35:58.337706 containerd[1462]: 2026-04-24 23:35:58.198 [INFO][4038] cni-plugin/k8s.go 418: Populated endpoint ContainerID="11dc9d012984b00dedbdb8b8115c8a21435603cc32586e9bca4e7e6597ccc515" Namespace="calico-system" Pod="calico-kube-controllers-54d98b476c-cxbz2" WorkloadEndpoint="ci--4081--3--6--n--0494d1f24d-k8s-calico--kube--controllers--54d98b476c--cxbz2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--0494d1f24d-k8s-calico--kube--controllers--54d98b476c--cxbz2-eth0", GenerateName:"calico-kube-controllers-54d98b476c-", Namespace:"calico-system", SelfLink:"", UID:"ad82956a-a9d5-482d-9feb-ead05df18123", ResourceVersion:"912", Generation:0, CreationTimestamp:time.Date(2026, time.April, 24, 23, 35, 40, 0, 
time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"54d98b476c", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-0494d1f24d", ContainerID:"", Pod:"calico-kube-controllers-54d98b476c-cxbz2", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.74.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calif2c4b987dac", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 24 23:35:58.337706 containerd[1462]: 2026-04-24 23:35:58.198 [INFO][4038] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.74.71/32] ContainerID="11dc9d012984b00dedbdb8b8115c8a21435603cc32586e9bca4e7e6597ccc515" Namespace="calico-system" Pod="calico-kube-controllers-54d98b476c-cxbz2" WorkloadEndpoint="ci--4081--3--6--n--0494d1f24d-k8s-calico--kube--controllers--54d98b476c--cxbz2-eth0" Apr 24 23:35:58.337706 containerd[1462]: 2026-04-24 23:35:58.198 [INFO][4038] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calif2c4b987dac ContainerID="11dc9d012984b00dedbdb8b8115c8a21435603cc32586e9bca4e7e6597ccc515" Namespace="calico-system" Pod="calico-kube-controllers-54d98b476c-cxbz2" WorkloadEndpoint="ci--4081--3--6--n--0494d1f24d-k8s-calico--kube--controllers--54d98b476c--cxbz2-eth0" Apr 24 23:35:58.337706 containerd[1462]: 2026-04-24 23:35:58.222 [INFO][4038] 
cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="11dc9d012984b00dedbdb8b8115c8a21435603cc32586e9bca4e7e6597ccc515" Namespace="calico-system" Pod="calico-kube-controllers-54d98b476c-cxbz2" WorkloadEndpoint="ci--4081--3--6--n--0494d1f24d-k8s-calico--kube--controllers--54d98b476c--cxbz2-eth0" Apr 24 23:35:58.337706 containerd[1462]: 2026-04-24 23:35:58.274 [INFO][4038] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="11dc9d012984b00dedbdb8b8115c8a21435603cc32586e9bca4e7e6597ccc515" Namespace="calico-system" Pod="calico-kube-controllers-54d98b476c-cxbz2" WorkloadEndpoint="ci--4081--3--6--n--0494d1f24d-k8s-calico--kube--controllers--54d98b476c--cxbz2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--0494d1f24d-k8s-calico--kube--controllers--54d98b476c--cxbz2-eth0", GenerateName:"calico-kube-controllers-54d98b476c-", Namespace:"calico-system", SelfLink:"", UID:"ad82956a-a9d5-482d-9feb-ead05df18123", ResourceVersion:"912", Generation:0, CreationTimestamp:time.Date(2026, time.April, 24, 23, 35, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"54d98b476c", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-0494d1f24d", ContainerID:"11dc9d012984b00dedbdb8b8115c8a21435603cc32586e9bca4e7e6597ccc515", Pod:"calico-kube-controllers-54d98b476c-cxbz2", Endpoint:"eth0", 
ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.74.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calif2c4b987dac", MAC:"0e:1c:2f:b6:06:19", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 24 23:35:58.337706 containerd[1462]: 2026-04-24 23:35:58.333 [INFO][4038] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="11dc9d012984b00dedbdb8b8115c8a21435603cc32586e9bca4e7e6597ccc515" Namespace="calico-system" Pod="calico-kube-controllers-54d98b476c-cxbz2" WorkloadEndpoint="ci--4081--3--6--n--0494d1f24d-k8s-calico--kube--controllers--54d98b476c--cxbz2-eth0" Apr 24 23:35:58.340816 containerd[1462]: time="2026-04-24T23:35:58.338885380Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Apr 24 23:35:58.342241 containerd[1462]: time="2026-04-24T23:35:58.341267858Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Apr 24 23:35:58.342241 containerd[1462]: time="2026-04-24T23:35:58.341365073Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 24 23:35:58.342241 containerd[1462]: time="2026-04-24T23:35:58.341499693Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 24 23:35:58.362334 systemd-networkd[1386]: cali1595325ce6f: Gained IPv6LL Apr 24 23:35:58.378620 containerd[1462]: time="2026-04-24T23:35:58.378567669Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-9f7667bb8-9xhf6,Uid:a67a5a23-28c6-44c6-8e44-b199e7391183,Namespace:calico-system,Attempt:1,} returns sandbox id \"9936c272c3f0e2fad8c377b8268b236c67137f997d8c8fdaddfc948873f50a18\"" Apr 24 23:35:58.400374 systemd-networkd[1386]: caliab1d0d7bf2f: Link UP Apr 24 23:35:58.407809 systemd-networkd[1386]: caliab1d0d7bf2f: Gained carrier Apr 24 23:35:58.459031 containerd[1462]: time="2026-04-24T23:35:58.455598496Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Apr 24 23:35:58.459031 containerd[1462]: time="2026-04-24T23:35:58.455720354Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Apr 24 23:35:58.459031 containerd[1462]: time="2026-04-24T23:35:58.455801406Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 24 23:35:58.459031 containerd[1462]: time="2026-04-24T23:35:58.455953629Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 24 23:35:58.470649 containerd[1462]: 2026-04-24 23:35:57.951 [ERROR][4305] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Apr 24 23:35:58.470649 containerd[1462]: 2026-04-24 23:35:57.991 [INFO][4305] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081--3--6--n--0494d1f24d-k8s-whisker--7bf446958b--2nzh8-eth0 whisker-7bf446958b- calico-system 910c4822-4e95-49a7-b2a7-624250e16bad 941 0 2026-04-24 23:35:57 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:7bf446958b projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4081-3-6-n-0494d1f24d whisker-7bf446958b-2nzh8 eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] caliab1d0d7bf2f [] [] }} ContainerID="3b52be1368a9fdc639dba861000ce5d6b45d9bf6f4905c7e2b7c89ff0e5dc9b5" Namespace="calico-system" Pod="whisker-7bf446958b-2nzh8" WorkloadEndpoint="ci--4081--3--6--n--0494d1f24d-k8s-whisker--7bf446958b--2nzh8-" Apr 24 23:35:58.470649 containerd[1462]: 2026-04-24 23:35:57.991 [INFO][4305] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="3b52be1368a9fdc639dba861000ce5d6b45d9bf6f4905c7e2b7c89ff0e5dc9b5" Namespace="calico-system" Pod="whisker-7bf446958b-2nzh8" WorkloadEndpoint="ci--4081--3--6--n--0494d1f24d-k8s-whisker--7bf446958b--2nzh8-eth0" Apr 24 23:35:58.470649 containerd[1462]: 2026-04-24 23:35:58.167 [INFO][4338] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="3b52be1368a9fdc639dba861000ce5d6b45d9bf6f4905c7e2b7c89ff0e5dc9b5" HandleID="k8s-pod-network.3b52be1368a9fdc639dba861000ce5d6b45d9bf6f4905c7e2b7c89ff0e5dc9b5" 
Workload="ci--4081--3--6--n--0494d1f24d-k8s-whisker--7bf446958b--2nzh8-eth0" Apr 24 23:35:58.470649 containerd[1462]: 2026-04-24 23:35:58.196 [INFO][4338] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="3b52be1368a9fdc639dba861000ce5d6b45d9bf6f4905c7e2b7c89ff0e5dc9b5" HandleID="k8s-pod-network.3b52be1368a9fdc639dba861000ce5d6b45d9bf6f4905c7e2b7c89ff0e5dc9b5" Workload="ci--4081--3--6--n--0494d1f24d-k8s-whisker--7bf446958b--2nzh8-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002fa1d0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4081-3-6-n-0494d1f24d", "pod":"whisker-7bf446958b-2nzh8", "timestamp":"2026-04-24 23:35:58.167888498 +0000 UTC"}, Hostname:"ci-4081-3-6-n-0494d1f24d", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0x40002e8420)} Apr 24 23:35:58.470649 containerd[1462]: 2026-04-24 23:35:58.196 [INFO][4338] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 24 23:35:58.470649 containerd[1462]: 2026-04-24 23:35:58.197 [INFO][4338] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Apr 24 23:35:58.470649 containerd[1462]: 2026-04-24 23:35:58.197 [INFO][4338] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081-3-6-n-0494d1f24d' Apr 24 23:35:58.470649 containerd[1462]: 2026-04-24 23:35:58.207 [INFO][4338] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.3b52be1368a9fdc639dba861000ce5d6b45d9bf6f4905c7e2b7c89ff0e5dc9b5" host="ci-4081-3-6-n-0494d1f24d" Apr 24 23:35:58.470649 containerd[1462]: 2026-04-24 23:35:58.233 [INFO][4338] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4081-3-6-n-0494d1f24d" Apr 24 23:35:58.470649 containerd[1462]: 2026-04-24 23:35:58.261 [INFO][4338] ipam/ipam.go 526: Trying affinity for 192.168.74.64/26 host="ci-4081-3-6-n-0494d1f24d" Apr 24 23:35:58.470649 containerd[1462]: 2026-04-24 23:35:58.275 [INFO][4338] ipam/ipam.go 160: Attempting to load block cidr=192.168.74.64/26 host="ci-4081-3-6-n-0494d1f24d" Apr 24 23:35:58.470649 containerd[1462]: 2026-04-24 23:35:58.310 [INFO][4338] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.74.64/26 host="ci-4081-3-6-n-0494d1f24d" Apr 24 23:35:58.470649 containerd[1462]: 2026-04-24 23:35:58.310 [INFO][4338] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.74.64/26 handle="k8s-pod-network.3b52be1368a9fdc639dba861000ce5d6b45d9bf6f4905c7e2b7c89ff0e5dc9b5" host="ci-4081-3-6-n-0494d1f24d" Apr 24 23:35:58.470649 containerd[1462]: 2026-04-24 23:35:58.325 [INFO][4338] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.3b52be1368a9fdc639dba861000ce5d6b45d9bf6f4905c7e2b7c89ff0e5dc9b5 Apr 24 23:35:58.470649 containerd[1462]: 2026-04-24 23:35:58.342 [INFO][4338] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.74.64/26 handle="k8s-pod-network.3b52be1368a9fdc639dba861000ce5d6b45d9bf6f4905c7e2b7c89ff0e5dc9b5" host="ci-4081-3-6-n-0494d1f24d" Apr 24 23:35:58.470649 containerd[1462]: 2026-04-24 23:35:58.360 [INFO][4338] ipam/ipam.go 1288: 
Successfully claimed IPs: [192.168.74.72/26] block=192.168.74.64/26 handle="k8s-pod-network.3b52be1368a9fdc639dba861000ce5d6b45d9bf6f4905c7e2b7c89ff0e5dc9b5" host="ci-4081-3-6-n-0494d1f24d" Apr 24 23:35:58.470649 containerd[1462]: 2026-04-24 23:35:58.360 [INFO][4338] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.74.72/26] handle="k8s-pod-network.3b52be1368a9fdc639dba861000ce5d6b45d9bf6f4905c7e2b7c89ff0e5dc9b5" host="ci-4081-3-6-n-0494d1f24d" Apr 24 23:35:58.470649 containerd[1462]: 2026-04-24 23:35:58.361 [INFO][4338] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 24 23:35:58.470649 containerd[1462]: 2026-04-24 23:35:58.361 [INFO][4338] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.74.72/26] IPv6=[] ContainerID="3b52be1368a9fdc639dba861000ce5d6b45d9bf6f4905c7e2b7c89ff0e5dc9b5" HandleID="k8s-pod-network.3b52be1368a9fdc639dba861000ce5d6b45d9bf6f4905c7e2b7c89ff0e5dc9b5" Workload="ci--4081--3--6--n--0494d1f24d-k8s-whisker--7bf446958b--2nzh8-eth0" Apr 24 23:35:58.469626 systemd[1]: Started cri-containerd-4b9b595289a74ac5990c9aa1faf29292f26dad2500dad23cf82351cf00d92041.scope - libcontainer container 4b9b595289a74ac5990c9aa1faf29292f26dad2500dad23cf82351cf00d92041. 
Apr 24 23:35:58.471261 containerd[1462]: 2026-04-24 23:35:58.390 [INFO][4305] cni-plugin/k8s.go 418: Populated endpoint ContainerID="3b52be1368a9fdc639dba861000ce5d6b45d9bf6f4905c7e2b7c89ff0e5dc9b5" Namespace="calico-system" Pod="whisker-7bf446958b-2nzh8" WorkloadEndpoint="ci--4081--3--6--n--0494d1f24d-k8s-whisker--7bf446958b--2nzh8-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--0494d1f24d-k8s-whisker--7bf446958b--2nzh8-eth0", GenerateName:"whisker-7bf446958b-", Namespace:"calico-system", SelfLink:"", UID:"910c4822-4e95-49a7-b2a7-624250e16bad", ResourceVersion:"941", Generation:0, CreationTimestamp:time.Date(2026, time.April, 24, 23, 35, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"7bf446958b", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-0494d1f24d", ContainerID:"", Pod:"whisker-7bf446958b-2nzh8", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.74.72/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"caliab1d0d7bf2f", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 24 23:35:58.471261 containerd[1462]: 2026-04-24 23:35:58.390 [INFO][4305] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.74.72/32] ContainerID="3b52be1368a9fdc639dba861000ce5d6b45d9bf6f4905c7e2b7c89ff0e5dc9b5" 
Namespace="calico-system" Pod="whisker-7bf446958b-2nzh8" WorkloadEndpoint="ci--4081--3--6--n--0494d1f24d-k8s-whisker--7bf446958b--2nzh8-eth0" Apr 24 23:35:58.471261 containerd[1462]: 2026-04-24 23:35:58.390 [INFO][4305] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to caliab1d0d7bf2f ContainerID="3b52be1368a9fdc639dba861000ce5d6b45d9bf6f4905c7e2b7c89ff0e5dc9b5" Namespace="calico-system" Pod="whisker-7bf446958b-2nzh8" WorkloadEndpoint="ci--4081--3--6--n--0494d1f24d-k8s-whisker--7bf446958b--2nzh8-eth0" Apr 24 23:35:58.471261 containerd[1462]: 2026-04-24 23:35:58.419 [INFO][4305] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="3b52be1368a9fdc639dba861000ce5d6b45d9bf6f4905c7e2b7c89ff0e5dc9b5" Namespace="calico-system" Pod="whisker-7bf446958b-2nzh8" WorkloadEndpoint="ci--4081--3--6--n--0494d1f24d-k8s-whisker--7bf446958b--2nzh8-eth0" Apr 24 23:35:58.471261 containerd[1462]: 2026-04-24 23:35:58.420 [INFO][4305] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="3b52be1368a9fdc639dba861000ce5d6b45d9bf6f4905c7e2b7c89ff0e5dc9b5" Namespace="calico-system" Pod="whisker-7bf446958b-2nzh8" WorkloadEndpoint="ci--4081--3--6--n--0494d1f24d-k8s-whisker--7bf446958b--2nzh8-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--0494d1f24d-k8s-whisker--7bf446958b--2nzh8-eth0", GenerateName:"whisker-7bf446958b-", Namespace:"calico-system", SelfLink:"", UID:"910c4822-4e95-49a7-b2a7-624250e16bad", ResourceVersion:"941", Generation:0, CreationTimestamp:time.Date(2026, time.April, 24, 23, 35, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"7bf446958b", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", 
"projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-0494d1f24d", ContainerID:"3b52be1368a9fdc639dba861000ce5d6b45d9bf6f4905c7e2b7c89ff0e5dc9b5", Pod:"whisker-7bf446958b-2nzh8", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.74.72/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"caliab1d0d7bf2f", MAC:"62:c4:51:10:89:2d", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 24 23:35:58.471261 containerd[1462]: 2026-04-24 23:35:58.454 [INFO][4305] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="3b52be1368a9fdc639dba861000ce5d6b45d9bf6f4905c7e2b7c89ff0e5dc9b5" Namespace="calico-system" Pod="whisker-7bf446958b-2nzh8" WorkloadEndpoint="ci--4081--3--6--n--0494d1f24d-k8s-whisker--7bf446958b--2nzh8-eth0" Apr 24 23:35:58.476159 containerd[1462]: time="2026-04-24T23:35:58.475881587Z" level=info msg="StartContainer for \"0b3587c42780ecfcb4f8f8ded0fc01ae870319c8434cf268607e54066eed50c2\" returns successfully" Apr 24 23:35:58.507371 systemd[1]: Started cri-containerd-11dc9d012984b00dedbdb8b8115c8a21435603cc32586e9bca4e7e6597ccc515.scope - libcontainer container 11dc9d012984b00dedbdb8b8115c8a21435603cc32586e9bca4e7e6597ccc515. Apr 24 23:35:58.519768 containerd[1462]: time="2026-04-24T23:35:58.519287076Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Apr 24 23:35:58.519768 containerd[1462]: time="2026-04-24T23:35:58.519366048Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Apr 24 23:35:58.519768 containerd[1462]: time="2026-04-24T23:35:58.519378730Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 24 23:35:58.519768 containerd[1462]: time="2026-04-24T23:35:58.519467743Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Apr 24 23:35:58.550751 systemd[1]: Started cri-containerd-3b52be1368a9fdc639dba861000ce5d6b45d9bf6f4905c7e2b7c89ff0e5dc9b5.scope - libcontainer container 3b52be1368a9fdc639dba861000ce5d6b45d9bf6f4905c7e2b7c89ff0e5dc9b5. Apr 24 23:35:58.577006 containerd[1462]: time="2026-04-24T23:35:58.576633542Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7d764666f9-lnj9t,Uid:54520bb9-f644-497d-8cfe-b9f4367c3def,Namespace:kube-system,Attempt:1,} returns sandbox id \"4b9b595289a74ac5990c9aa1faf29292f26dad2500dad23cf82351cf00d92041\"" Apr 24 23:35:58.593925 containerd[1462]: time="2026-04-24T23:35:58.593879416Z" level=info msg="CreateContainer within sandbox \"4b9b595289a74ac5990c9aa1faf29292f26dad2500dad23cf82351cf00d92041\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Apr 24 23:35:58.617942 containerd[1462]: time="2026-04-24T23:35:58.617720923Z" level=info msg="CreateContainer within sandbox \"4b9b595289a74ac5990c9aa1faf29292f26dad2500dad23cf82351cf00d92041\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"ebf6ed7ba365856845e914dc9d941a6833f3f866a2f3b1afb358680d9ee6dc5d\"" Apr 24 23:35:58.622737 containerd[1462]: time="2026-04-24T23:35:58.622677508Z" level=info msg="StartContainer for \"ebf6ed7ba365856845e914dc9d941a6833f3f866a2f3b1afb358680d9ee6dc5d\"" Apr 24 23:35:58.686396 systemd[1]: Started cri-containerd-ebf6ed7ba365856845e914dc9d941a6833f3f866a2f3b1afb358680d9ee6dc5d.scope - libcontainer container 
ebf6ed7ba365856845e914dc9d941a6833f3f866a2f3b1afb358680d9ee6dc5d. Apr 24 23:35:58.714137 containerd[1462]: time="2026-04-24T23:35:58.712646282Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-54d98b476c-cxbz2,Uid:ad82956a-a9d5-482d-9feb-ead05df18123,Namespace:calico-system,Attempt:1,} returns sandbox id \"11dc9d012984b00dedbdb8b8115c8a21435603cc32586e9bca4e7e6597ccc515\"" Apr 24 23:35:58.749332 containerd[1462]: time="2026-04-24T23:35:58.749282953Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7bf446958b-2nzh8,Uid:910c4822-4e95-49a7-b2a7-624250e16bad,Namespace:calico-system,Attempt:0,} returns sandbox id \"3b52be1368a9fdc639dba861000ce5d6b45d9bf6f4905c7e2b7c89ff0e5dc9b5\"" Apr 24 23:35:58.768308 containerd[1462]: time="2026-04-24T23:35:58.768255007Z" level=info msg="StartContainer for \"ebf6ed7ba365856845e914dc9d941a6833f3f866a2f3b1afb358680d9ee6dc5d\" returns successfully" Apr 24 23:35:58.912268 kubelet[2581]: I0424 23:35:58.912029 2581 prober_manager.go:356] "Failed to trigger a manual run" probe="Readiness" Apr 24 23:35:58.981235 kernel: calico-node[4630]: memfd_create() called without MFD_EXEC or MFD_NOEXEC_SEAL set Apr 24 23:35:59.114982 systemd[1]: run-containerd-runc-k8s.io-0b3587c42780ecfcb4f8f8ded0fc01ae870319c8434cf268607e54066eed50c2-runc.8UCcWA.mount: Deactivated successfully. 
Apr 24 23:35:59.258295 systemd-networkd[1386]: cali104746d9e98: Gained IPv6LL Apr 24 23:35:59.450561 systemd-networkd[1386]: calid82795e9477: Gained IPv6LL Apr 24 23:35:59.479954 systemd-networkd[1386]: vxlan.calico: Link UP Apr 24 23:35:59.479961 systemd-networkd[1386]: vxlan.calico: Gained carrier Apr 24 23:35:59.515432 systemd-networkd[1386]: calif2c4b987dac: Gained IPv6LL Apr 24 23:35:59.555032 kubelet[2581]: I0424 23:35:59.554948 2581 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="kube-system/coredns-7d764666f9-lnj9t" podStartSLOduration=34.554932759 podStartE2EDuration="34.554932759s" podCreationTimestamp="2026-04-24 23:35:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 23:35:59.5305044 +0000 UTC m=+39.574062716" watchObservedRunningTime="2026-04-24 23:35:59.554932759 +0000 UTC m=+39.598491075" Apr 24 23:35:59.578448 systemd-networkd[1386]: caliab1d0d7bf2f: Gained IPv6LL Apr 24 23:35:59.644349 systemd-networkd[1386]: cali3bb547fd16f: Gained IPv6LL Apr 24 23:35:59.770754 systemd-networkd[1386]: cali8d0e349e5bd: Gained IPv6LL Apr 24 23:36:00.348721 containerd[1462]: time="2026-04-24T23:36:00.348656415Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 24 23:36:00.350499 containerd[1462]: time="2026-04-24T23:36:00.350447908Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.31.4: active requests=0, bytes read=8261497" Apr 24 23:36:00.350826 containerd[1462]: time="2026-04-24T23:36:00.350747790Z" level=info msg="ImageCreate event name:\"sha256:9cb4086a1b408b52c6b14e0b81520060e1766ee0243508d29d8a53c7b518051f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 24 23:36:00.354173 containerd[1462]: time="2026-04-24T23:36:00.353791220Z" level=info msg="ImageCreate event 
name:\"ghcr.io/flatcar/calico/csi@sha256:ab57dd6f8423ef7b3ff382bf4ca5ace6063bdca77d441d852c75ec58847dd280\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 24 23:36:00.354986 containerd[1462]: time="2026-04-24T23:36:00.354950024Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.31.4\" with image id \"sha256:9cb4086a1b408b52c6b14e0b81520060e1766ee0243508d29d8a53c7b518051f\", repo tag \"ghcr.io/flatcar/calico/csi:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:ab57dd6f8423ef7b3ff382bf4ca5ace6063bdca77d441d852c75ec58847dd280\", size \"9659022\" in 3.507990761s" Apr 24 23:36:00.355094 containerd[1462]: time="2026-04-24T23:36:00.355079362Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.31.4\" returns image reference \"sha256:9cb4086a1b408b52c6b14e0b81520060e1766ee0243508d29d8a53c7b518051f\"" Apr 24 23:36:00.358052 containerd[1462]: time="2026-04-24T23:36:00.358019217Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\"" Apr 24 23:36:00.364090 containerd[1462]: time="2026-04-24T23:36:00.363922851Z" level=info msg="CreateContainer within sandbox \"4180f624404d2f5709af7d3e0abb4ff432671809d911d57eb7f919c6d52a2a29\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Apr 24 23:36:00.391249 containerd[1462]: time="2026-04-24T23:36:00.391073486Z" level=info msg="CreateContainer within sandbox \"4180f624404d2f5709af7d3e0abb4ff432671809d911d57eb7f919c6d52a2a29\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"f0ff9c48a7b7067c50884e1a9c699d68cf37483a0536e5ccf7b6232e93c98ae5\"" Apr 24 23:36:00.393158 containerd[1462]: time="2026-04-24T23:36:00.392402034Z" level=info msg="StartContainer for \"f0ff9c48a7b7067c50884e1a9c699d68cf37483a0536e5ccf7b6232e93c98ae5\"" Apr 24 23:36:00.427643 systemd[1]: run-containerd-runc-k8s.io-f0ff9c48a7b7067c50884e1a9c699d68cf37483a0536e5ccf7b6232e93c98ae5-runc.54ejQc.mount: Deactivated successfully. 
Apr 24 23:36:00.437986 systemd[1]: Started cri-containerd-f0ff9c48a7b7067c50884e1a9c699d68cf37483a0536e5ccf7b6232e93c98ae5.scope - libcontainer container f0ff9c48a7b7067c50884e1a9c699d68cf37483a0536e5ccf7b6232e93c98ae5. Apr 24 23:36:00.475851 containerd[1462]: time="2026-04-24T23:36:00.475796253Z" level=info msg="StartContainer for \"f0ff9c48a7b7067c50884e1a9c699d68cf37483a0536e5ccf7b6232e93c98ae5\" returns successfully" Apr 24 23:36:01.184317 systemd-networkd[1386]: vxlan.calico: Gained IPv6LL Apr 24 23:36:04.001550 containerd[1462]: time="2026-04-24T23:36:04.000172148Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 24 23:36:04.002325 containerd[1462]: time="2026-04-24T23:36:04.002267935Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.31.4: active requests=0, bytes read=45552315" Apr 24 23:36:04.002600 containerd[1462]: time="2026-04-24T23:36:04.002552091Z" level=info msg="ImageCreate event name:\"sha256:dca640051f09574f3e8821035bbfae8c638fb7dadca4c9a082e7223a234befc8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 24 23:36:04.007508 containerd[1462]: time="2026-04-24T23:36:04.007457710Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:d212af1da3dd52a633bc9e36653a7d901d95a570f8d51d1968a837dcf6879730\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 24 23:36:04.008679 containerd[1462]: time="2026-04-24T23:36:04.008629497Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" with image id \"sha256:dca640051f09574f3e8821035bbfae8c638fb7dadca4c9a082e7223a234befc8\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:d212af1da3dd52a633bc9e36653a7d901d95a570f8d51d1968a837dcf6879730\", size \"46949856\" in 3.649463799s" Apr 24 23:36:04.008679 containerd[1462]: time="2026-04-24T23:36:04.008678984Z" 
level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" returns image reference \"sha256:dca640051f09574f3e8821035bbfae8c638fb7dadca4c9a082e7223a234befc8\"" Apr 24 23:36:04.011618 containerd[1462]: time="2026-04-24T23:36:04.011233386Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\"" Apr 24 23:36:04.016340 containerd[1462]: time="2026-04-24T23:36:04.016257419Z" level=info msg="CreateContainer within sandbox \"dc00f0a26827cfe71c3f0ffffe67d49ca3cc816a023700ee6573bc9e96ca342a\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Apr 24 23:36:04.035605 containerd[1462]: time="2026-04-24T23:36:04.035534530Z" level=info msg="CreateContainer within sandbox \"dc00f0a26827cfe71c3f0ffffe67d49ca3cc816a023700ee6573bc9e96ca342a\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"eb0d1e6610ffedb78826756a0f694fa04fdd980b742b82b782b72625d4b53e0d\"" Apr 24 23:36:04.037613 containerd[1462]: time="2026-04-24T23:36:04.037372962Z" level=info msg="StartContainer for \"eb0d1e6610ffedb78826756a0f694fa04fdd980b742b82b782b72625d4b53e0d\"" Apr 24 23:36:04.076998 systemd[1]: run-containerd-runc-k8s.io-eb0d1e6610ffedb78826756a0f694fa04fdd980b742b82b782b72625d4b53e0d-runc.CiZJGl.mount: Deactivated successfully. Apr 24 23:36:04.085784 systemd[1]: Started cri-containerd-eb0d1e6610ffedb78826756a0f694fa04fdd980b742b82b782b72625d4b53e0d.scope - libcontainer container eb0d1e6610ffedb78826756a0f694fa04fdd980b742b82b782b72625d4b53e0d. 
Apr 24 23:36:04.133485 containerd[1462]: time="2026-04-24T23:36:04.133432636Z" level=info msg="StartContainer for \"eb0d1e6610ffedb78826756a0f694fa04fdd980b742b82b782b72625d4b53e0d\" returns successfully" Apr 24 23:36:04.456996 containerd[1462]: time="2026-04-24T23:36:04.456938192Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 24 23:36:04.460843 containerd[1462]: time="2026-04-24T23:36:04.460776956Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.31.4: active requests=0, bytes read=77" Apr 24 23:36:04.463221 containerd[1462]: time="2026-04-24T23:36:04.463154456Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" with image id \"sha256:dca640051f09574f3e8821035bbfae8c638fb7dadca4c9a082e7223a234befc8\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:d212af1da3dd52a633bc9e36653a7d901d95a570f8d51d1968a837dcf6879730\", size \"46949856\" in 451.880665ms" Apr 24 23:36:04.463221 containerd[1462]: time="2026-04-24T23:36:04.463206262Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" returns image reference \"sha256:dca640051f09574f3e8821035bbfae8c638fb7dadca4c9a082e7223a234befc8\"" Apr 24 23:36:04.465926 containerd[1462]: time="2026-04-24T23:36:04.465601805Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.31.4\"" Apr 24 23:36:04.469398 containerd[1462]: time="2026-04-24T23:36:04.469213620Z" level=info msg="CreateContainer within sandbox \"eac367f59f2b92f92f741540bd5453ffad23e2c97f71da941b239662f1b93292\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Apr 24 23:36:04.492661 containerd[1462]: time="2026-04-24T23:36:04.492604730Z" level=info msg="CreateContainer within sandbox \"eac367f59f2b92f92f741540bd5453ffad23e2c97f71da941b239662f1b93292\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns 
container id \"7722eddfcc1ce6e179b78e83de969c16cce62487be2b69e990432fd0322b6127\"" Apr 24 23:36:04.498745 containerd[1462]: time="2026-04-24T23:36:04.497419417Z" level=info msg="StartContainer for \"7722eddfcc1ce6e179b78e83de969c16cce62487be2b69e990432fd0322b6127\"" Apr 24 23:36:04.533961 systemd[1]: Started cri-containerd-7722eddfcc1ce6e179b78e83de969c16cce62487be2b69e990432fd0322b6127.scope - libcontainer container 7722eddfcc1ce6e179b78e83de969c16cce62487be2b69e990432fd0322b6127. Apr 24 23:36:04.560770 kubelet[2581]: I0424 23:36:04.560591 2581 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="kube-system/coredns-7d764666f9-l2x8h" podStartSLOduration=39.560573701 podStartE2EDuration="39.560573701s" podCreationTimestamp="2026-04-24 23:35:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 23:35:59.581966018 +0000 UTC m=+39.625524334" watchObservedRunningTime="2026-04-24 23:36:04.560573701 +0000 UTC m=+44.604131977" Apr 24 23:36:04.599111 containerd[1462]: time="2026-04-24T23:36:04.599037632Z" level=info msg="StartContainer for \"7722eddfcc1ce6e179b78e83de969c16cce62487be2b69e990432fd0322b6127\" returns successfully" Apr 24 23:36:05.545926 kubelet[2581]: I0424 23:36:05.545883 2581 prober_manager.go:356] "Failed to trigger a manual run" probe="Readiness" Apr 24 23:36:05.564843 kubelet[2581]: I0424 23:36:05.564770 2581 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="calico-system/calico-apiserver-ccf7f6874-fxzbx" podStartSLOduration=21.25580246 podStartE2EDuration="27.564750763s" podCreationTimestamp="2026-04-24 23:35:38 +0000 UTC" firstStartedPulling="2026-04-24 23:35:57.701284076 +0000 UTC m=+37.744842392" lastFinishedPulling="2026-04-24 23:36:04.010232379 +0000 UTC m=+44.053790695" observedRunningTime="2026-04-24 23:36:04.560041594 +0000 UTC m=+44.603599910" watchObservedRunningTime="2026-04-24 23:36:05.564750763 +0000 
UTC m=+45.608309079" Apr 24 23:36:06.549261 kubelet[2581]: I0424 23:36:06.548715 2581 prober_manager.go:356] "Failed to trigger a manual run" probe="Readiness" Apr 24 23:36:08.698850 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2415801372.mount: Deactivated successfully. Apr 24 23:36:09.104807 containerd[1462]: time="2026-04-24T23:36:09.104611595Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 24 23:36:09.107362 containerd[1462]: time="2026-04-24T23:36:09.107319098Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.31.4: active requests=0, bytes read=51613980" Apr 24 23:36:09.108745 containerd[1462]: time="2026-04-24T23:36:09.108692611Z" level=info msg="ImageCreate event name:\"sha256:5274e98e9b12badfa0d6f106814630212e6de7abb8deaf896423b13e6ebdb41b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 24 23:36:09.112028 containerd[1462]: time="2026-04-24T23:36:09.111957897Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:44395ca5ebfe88f21ed51acfbec5fc0f31d2762966e2007a0a2eb9b30e35fc4d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 24 23:36:09.113893 containerd[1462]: time="2026-04-24T23:36:09.113811544Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.31.4\" with image id \"sha256:5274e98e9b12badfa0d6f106814630212e6de7abb8deaf896423b13e6ebdb41b\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:44395ca5ebfe88f21ed51acfbec5fc0f31d2762966e2007a0a2eb9b30e35fc4d\", size \"51613826\" in 4.648167454s" Apr 24 23:36:09.113893 containerd[1462]: time="2026-04-24T23:36:09.113868151Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.31.4\" returns image reference \"sha256:5274e98e9b12badfa0d6f106814630212e6de7abb8deaf896423b13e6ebdb41b\"" Apr 24 23:36:09.116132 containerd[1462]: 
time="2026-04-24T23:36:09.115467730Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\"" Apr 24 23:36:09.134695 containerd[1462]: time="2026-04-24T23:36:09.134624434Z" level=info msg="CreateContainer within sandbox \"9936c272c3f0e2fad8c377b8268b236c67137f997d8c8fdaddfc948873f50a18\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Apr 24 23:36:09.157199 containerd[1462]: time="2026-04-24T23:36:09.157073066Z" level=info msg="CreateContainer within sandbox \"9936c272c3f0e2fad8c377b8268b236c67137f997d8c8fdaddfc948873f50a18\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"8aa9837b29bff3121442a897251883688fc548f6f9ee0a71922eb88556d5a3f1\"" Apr 24 23:36:09.161080 containerd[1462]: time="2026-04-24T23:36:09.160452805Z" level=info msg="StartContainer for \"8aa9837b29bff3121442a897251883688fc548f6f9ee0a71922eb88556d5a3f1\"" Apr 24 23:36:09.216537 systemd[1]: Started cri-containerd-8aa9837b29bff3121442a897251883688fc548f6f9ee0a71922eb88556d5a3f1.scope - libcontainer container 8aa9837b29bff3121442a897251883688fc548f6f9ee0a71922eb88556d5a3f1. 
Apr 24 23:36:09.261325 containerd[1462]: time="2026-04-24T23:36:09.261258847Z" level=info msg="StartContainer for \"8aa9837b29bff3121442a897251883688fc548f6f9ee0a71922eb88556d5a3f1\" returns successfully" Apr 24 23:36:09.606101 kubelet[2581]: I0424 23:36:09.604452 2581 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="calico-system/calico-apiserver-ccf7f6874-6drmn" podStartSLOduration=24.985423488 podStartE2EDuration="31.604435177s" podCreationTimestamp="2026-04-24 23:35:38 +0000 UTC" firstStartedPulling="2026-04-24 23:35:57.84551802 +0000 UTC m=+37.889076336" lastFinishedPulling="2026-04-24 23:36:04.464529709 +0000 UTC m=+44.508088025" observedRunningTime="2026-04-24 23:36:05.567596873 +0000 UTC m=+45.611155189" watchObservedRunningTime="2026-04-24 23:36:09.604435177 +0000 UTC m=+49.647993493" Apr 24 23:36:12.017940 containerd[1462]: time="2026-04-24T23:36:12.017895152Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 24 23:36:12.020162 containerd[1462]: time="2026-04-24T23:36:12.020099624Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.31.4: active requests=0, bytes read=49189955" Apr 24 23:36:12.021599 containerd[1462]: time="2026-04-24T23:36:12.021561738Z" level=info msg="ImageCreate event name:\"sha256:e80fe1ce4f06b0791c077492cd9d5ebf00125a02bbafdcd04d2a64e10cc4ad95\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 24 23:36:12.024670 containerd[1462]: time="2026-04-24T23:36:12.024631141Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:99b8bb50141ca55b4b6ddfcf2f2fbde838265508ab2ac96ed08e72cd39800713\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 24 23:36:12.026374 containerd[1462]: time="2026-04-24T23:36:12.026240111Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\" with image id 
\"sha256:e80fe1ce4f06b0791c077492cd9d5ebf00125a02bbafdcd04d2a64e10cc4ad95\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:99b8bb50141ca55b4b6ddfcf2f2fbde838265508ab2ac96ed08e72cd39800713\", size \"50587448\" in 2.910729297s" Apr 24 23:36:12.026374 containerd[1462]: time="2026-04-24T23:36:12.026283115Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\" returns image reference \"sha256:e80fe1ce4f06b0791c077492cd9d5ebf00125a02bbafdcd04d2a64e10cc4ad95\"" Apr 24 23:36:12.029016 containerd[1462]: time="2026-04-24T23:36:12.028282766Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.31.4\"" Apr 24 23:36:12.049433 containerd[1462]: time="2026-04-24T23:36:12.049394751Z" level=info msg="CreateContainer within sandbox \"11dc9d012984b00dedbdb8b8115c8a21435603cc32586e9bca4e7e6597ccc515\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Apr 24 23:36:12.074152 containerd[1462]: time="2026-04-24T23:36:12.074066030Z" level=info msg="CreateContainer within sandbox \"11dc9d012984b00dedbdb8b8115c8a21435603cc32586e9bca4e7e6597ccc515\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"d7d9f0e2a18cca3a930a1e7caa3426ddbfa23bdd90214992aeebe2869de54044\"" Apr 24 23:36:12.076895 containerd[1462]: time="2026-04-24T23:36:12.076832282Z" level=info msg="StartContainer for \"d7d9f0e2a18cca3a930a1e7caa3426ddbfa23bdd90214992aeebe2869de54044\"" Apr 24 23:36:12.154482 systemd[1]: Started cri-containerd-d7d9f0e2a18cca3a930a1e7caa3426ddbfa23bdd90214992aeebe2869de54044.scope - libcontainer container d7d9f0e2a18cca3a930a1e7caa3426ddbfa23bdd90214992aeebe2869de54044. 
Apr 24 23:36:12.196005 containerd[1462]: time="2026-04-24T23:36:12.195611597Z" level=info msg="StartContainer for \"d7d9f0e2a18cca3a930a1e7caa3426ddbfa23bdd90214992aeebe2869de54044\" returns successfully" Apr 24 23:36:12.606489 kubelet[2581]: I0424 23:36:12.606407 2581 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="calico-system/goldmane-9f7667bb8-9xhf6" podStartSLOduration=23.874023515 podStartE2EDuration="34.606390361s" podCreationTimestamp="2026-04-24 23:35:38 +0000 UTC" firstStartedPulling="2026-04-24 23:35:58.382408646 +0000 UTC m=+38.425966962" lastFinishedPulling="2026-04-24 23:36:09.114775492 +0000 UTC m=+49.158333808" observedRunningTime="2026-04-24 23:36:09.605414887 +0000 UTC m=+49.648973163" watchObservedRunningTime="2026-04-24 23:36:12.606390361 +0000 UTC m=+52.649948677" Apr 24 23:36:12.648786 kubelet[2581]: I0424 23:36:12.648713 2581 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-54d98b476c-cxbz2" podStartSLOduration=19.35280417 podStartE2EDuration="32.648695258s" podCreationTimestamp="2026-04-24 23:35:40 +0000 UTC" firstStartedPulling="2026-04-24 23:35:58.732078205 +0000 UTC m=+38.775636481" lastFinishedPulling="2026-04-24 23:36:12.027969173 +0000 UTC m=+52.071527569" observedRunningTime="2026-04-24 23:36:12.60847578 +0000 UTC m=+52.652034297" watchObservedRunningTime="2026-04-24 23:36:12.648695258 +0000 UTC m=+52.692253574" Apr 24 23:36:13.798885 containerd[1462]: time="2026-04-24T23:36:13.797901380Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 24 23:36:13.798885 containerd[1462]: time="2026-04-24T23:36:13.798845597Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.31.4: active requests=0, bytes read=5882804" Apr 24 23:36:13.799931 containerd[1462]: time="2026-04-24T23:36:13.799874344Z" level=info msg="ImageCreate event 
name:\"sha256:51af4e9dcdb93e51b26a4a6f99272ec2df8de1aef256bb746f2c7c844b8e7b2c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 24 23:36:13.803663 containerd[1462]: time="2026-04-24T23:36:13.802580264Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:9690cd395efad501f2e0c40ce4969d87b736ae2e5ed454644e7b0fd8f756bfbc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 24 23:36:13.803663 containerd[1462]: time="2026-04-24T23:36:13.803451594Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.31.4\" with image id \"sha256:51af4e9dcdb93e51b26a4a6f99272ec2df8de1aef256bb746f2c7c844b8e7b2c\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:9690cd395efad501f2e0c40ce4969d87b736ae2e5ed454644e7b0fd8f756bfbc\", size \"7280321\" in 1.775129384s" Apr 24 23:36:13.803663 containerd[1462]: time="2026-04-24T23:36:13.803481877Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.31.4\" returns image reference \"sha256:51af4e9dcdb93e51b26a4a6f99272ec2df8de1aef256bb746f2c7c844b8e7b2c\"" Apr 24 23:36:13.805861 containerd[1462]: time="2026-04-24T23:36:13.805805637Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\"" Apr 24 23:36:13.809873 containerd[1462]: time="2026-04-24T23:36:13.809837894Z" level=info msg="CreateContainer within sandbox \"3b52be1368a9fdc639dba861000ce5d6b45d9bf6f4905c7e2b7c89ff0e5dc9b5\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Apr 24 23:36:13.829878 containerd[1462]: time="2026-04-24T23:36:13.829840284Z" level=info msg="CreateContainer within sandbox \"3b52be1368a9fdc639dba861000ce5d6b45d9bf6f4905c7e2b7c89ff0e5dc9b5\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"45423527140a5523b65f666090ef24b48bc36274f964f13eff02db3f3f78a2fe\"" Apr 24 23:36:13.832282 containerd[1462]: time="2026-04-24T23:36:13.831085853Z" level=info msg="StartContainer for 
\"45423527140a5523b65f666090ef24b48bc36274f964f13eff02db3f3f78a2fe\"" Apr 24 23:36:13.835752 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3814119580.mount: Deactivated successfully. Apr 24 23:36:13.876348 systemd[1]: Started cri-containerd-45423527140a5523b65f666090ef24b48bc36274f964f13eff02db3f3f78a2fe.scope - libcontainer container 45423527140a5523b65f666090ef24b48bc36274f964f13eff02db3f3f78a2fe. Apr 24 23:36:13.923045 containerd[1462]: time="2026-04-24T23:36:13.922935715Z" level=info msg="StartContainer for \"45423527140a5523b65f666090ef24b48bc36274f964f13eff02db3f3f78a2fe\" returns successfully" Apr 24 23:36:14.038644 systemd[1]: run-containerd-runc-k8s.io-45423527140a5523b65f666090ef24b48bc36274f964f13eff02db3f3f78a2fe-runc.09Xw16.mount: Deactivated successfully. Apr 24 23:36:15.459097 containerd[1462]: time="2026-04-24T23:36:15.459038111Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 24 23:36:15.460372 containerd[1462]: time="2026-04-24T23:36:15.459784626Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4: active requests=0, bytes read=13766291" Apr 24 23:36:15.461608 containerd[1462]: time="2026-04-24T23:36:15.461505718Z" level=info msg="ImageCreate event name:\"sha256:8195c49a3b504e7ef58a8fc9a0e9ae66ae6ae90ef4998c04591be9588e8fa07e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 24 23:36:15.467161 containerd[1462]: time="2026-04-24T23:36:15.467058273Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:e41c0d73bcd33ff28ae2f2983cf781a4509d212e102d53883dbbf436ab3cd97d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 24 23:36:15.468325 containerd[1462]: time="2026-04-24T23:36:15.468036211Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\" with image id 
\"sha256:8195c49a3b504e7ef58a8fc9a0e9ae66ae6ae90ef4998c04591be9588e8fa07e\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:e41c0d73bcd33ff28ae2f2983cf781a4509d212e102d53883dbbf436ab3cd97d\", size \"15163768\" in 1.662178848s" Apr 24 23:36:15.468325 containerd[1462]: time="2026-04-24T23:36:15.468078735Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\" returns image reference \"sha256:8195c49a3b504e7ef58a8fc9a0e9ae66ae6ae90ef4998c04591be9588e8fa07e\"" Apr 24 23:36:15.471313 containerd[1462]: time="2026-04-24T23:36:15.471254132Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\"" Apr 24 23:36:15.477716 containerd[1462]: time="2026-04-24T23:36:15.477674334Z" level=info msg="CreateContainer within sandbox \"4180f624404d2f5709af7d3e0abb4ff432671809d911d57eb7f919c6d52a2a29\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Apr 24 23:36:15.569453 containerd[1462]: time="2026-04-24T23:36:15.566947138Z" level=info msg="CreateContainer within sandbox \"4180f624404d2f5709af7d3e0abb4ff432671809d911d57eb7f919c6d52a2a29\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"aac13e584c62f50349d6769da84390e4b2dc5c27e72aaed2b5e4aa52c7e0ffae\"" Apr 24 23:36:15.570508 containerd[1462]: time="2026-04-24T23:36:15.570432567Z" level=info msg="StartContainer for \"aac13e584c62f50349d6769da84390e4b2dc5c27e72aaed2b5e4aa52c7e0ffae\"" Apr 24 23:36:15.614354 systemd[1]: Started cri-containerd-aac13e584c62f50349d6769da84390e4b2dc5c27e72aaed2b5e4aa52c7e0ffae.scope - libcontainer container aac13e584c62f50349d6769da84390e4b2dc5c27e72aaed2b5e4aa52c7e0ffae. 
Apr 24 23:36:15.649489 containerd[1462]: time="2026-04-24T23:36:15.649426064Z" level=info msg="StartContainer for \"aac13e584c62f50349d6769da84390e4b2dc5c27e72aaed2b5e4aa52c7e0ffae\" returns successfully" Apr 24 23:36:16.242125 kubelet[2581]: I0424 23:36:16.242062 2581 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Apr 24 23:36:16.242125 kubelet[2581]: I0424 23:36:16.242104 2581 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Apr 24 23:36:17.542003 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount711275152.mount: Deactivated successfully. Apr 24 23:36:17.565561 containerd[1462]: time="2026-04-24T23:36:17.564402034Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 24 23:36:17.566036 containerd[1462]: time="2026-04-24T23:36:17.566003309Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.31.4: active requests=0, bytes read=16426594" Apr 24 23:36:17.566970 containerd[1462]: time="2026-04-24T23:36:17.566938960Z" level=info msg="ImageCreate event name:\"sha256:19fab8e13a4d97732973f299576e43f89b889ceff6e3768f711f30e6ace1c662\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 24 23:36:17.570254 containerd[1462]: time="2026-04-24T23:36:17.570196876Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:d252061aa298c4b17cf092517b5126af97cf95e0f56b21281b95a5f8702f15fc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Apr 24 23:36:17.571723 containerd[1462]: time="2026-04-24T23:36:17.571668178Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\" with image id 
\"sha256:19fab8e13a4d97732973f299576e43f89b889ceff6e3768f711f30e6ace1c662\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:d252061aa298c4b17cf092517b5126af97cf95e0f56b21281b95a5f8702f15fc\", size \"16426424\" in 2.100361681s" Apr 24 23:36:17.571723 containerd[1462]: time="2026-04-24T23:36:17.571720143Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\" returns image reference \"sha256:19fab8e13a4d97732973f299576e43f89b889ceff6e3768f711f30e6ace1c662\"" Apr 24 23:36:17.578585 containerd[1462]: time="2026-04-24T23:36:17.578352106Z" level=info msg="CreateContainer within sandbox \"3b52be1368a9fdc639dba861000ce5d6b45d9bf6f4905c7e2b7c89ff0e5dc9b5\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Apr 24 23:36:17.603666 containerd[1462]: time="2026-04-24T23:36:17.603579231Z" level=info msg="CreateContainer within sandbox \"3b52be1368a9fdc639dba861000ce5d6b45d9bf6f4905c7e2b7c89ff0e5dc9b5\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"d5dacb6fead91be081f536d333d873467f69402fd614b7e7cf234d112a7a067a\"" Apr 24 23:36:17.606510 containerd[1462]: time="2026-04-24T23:36:17.606415106Z" level=info msg="StartContainer for \"d5dacb6fead91be081f536d333d873467f69402fd614b7e7cf234d112a7a067a\"" Apr 24 23:36:17.642453 systemd[1]: Started cri-containerd-d5dacb6fead91be081f536d333d873467f69402fd614b7e7cf234d112a7a067a.scope - libcontainer container d5dacb6fead91be081f536d333d873467f69402fd614b7e7cf234d112a7a067a. 
Apr 24 23:36:17.687141 containerd[1462]: time="2026-04-24T23:36:17.685697709Z" level=info msg="StartContainer for \"d5dacb6fead91be081f536d333d873467f69402fd614b7e7cf234d112a7a067a\" returns successfully" Apr 24 23:36:18.655138 kubelet[2581]: I0424 23:36:18.653992 2581 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="calico-system/csi-node-driver-drdkn" podStartSLOduration=20.026067553 podStartE2EDuration="38.653978954s" podCreationTimestamp="2026-04-24 23:35:40 +0000 UTC" firstStartedPulling="2026-04-24 23:35:56.84153447 +0000 UTC m=+36.885092786" lastFinishedPulling="2026-04-24 23:36:15.469445871 +0000 UTC m=+55.513004187" observedRunningTime="2026-04-24 23:36:16.638762838 +0000 UTC m=+56.682321194" watchObservedRunningTime="2026-04-24 23:36:18.653978954 +0000 UTC m=+58.697537270" Apr 24 23:36:20.105335 containerd[1462]: time="2026-04-24T23:36:20.105277892Z" level=info msg="StopPodSandbox for \"aed60164462ff4aad3d1ccc2bb37b5c27a9225c8b42a7b6bae829862ba0d12bd\"" Apr 24 23:36:20.205112 containerd[1462]: 2026-04-24 23:36:20.155 [WARNING][5271] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="aed60164462ff4aad3d1ccc2bb37b5c27a9225c8b42a7b6bae829862ba0d12bd" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--0494d1f24d-k8s-calico--apiserver--ccf7f6874--fxzbx-eth0", GenerateName:"calico-apiserver-ccf7f6874-", Namespace:"calico-system", SelfLink:"", UID:"97bad9ab-01a6-4cd6-89bf-ca7dc9310125", ResourceVersion:"1008", Generation:0, CreationTimestamp:time.Date(2026, time.April, 24, 23, 35, 38, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"ccf7f6874", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-0494d1f24d", ContainerID:"dc00f0a26827cfe71c3f0ffffe67d49ca3cc816a023700ee6573bc9e96ca342a", Pod:"calico-apiserver-ccf7f6874-fxzbx", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.74.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali1595325ce6f", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 24 23:36:20.205112 containerd[1462]: 2026-04-24 23:36:20.156 [INFO][5271] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="aed60164462ff4aad3d1ccc2bb37b5c27a9225c8b42a7b6bae829862ba0d12bd" Apr 24 23:36:20.205112 containerd[1462]: 2026-04-24 23:36:20.156 [INFO][5271] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called 
with no netns name, ignoring. ContainerID="aed60164462ff4aad3d1ccc2bb37b5c27a9225c8b42a7b6bae829862ba0d12bd" iface="eth0" netns="" Apr 24 23:36:20.205112 containerd[1462]: 2026-04-24 23:36:20.156 [INFO][5271] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="aed60164462ff4aad3d1ccc2bb37b5c27a9225c8b42a7b6bae829862ba0d12bd" Apr 24 23:36:20.205112 containerd[1462]: 2026-04-24 23:36:20.156 [INFO][5271] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="aed60164462ff4aad3d1ccc2bb37b5c27a9225c8b42a7b6bae829862ba0d12bd" Apr 24 23:36:20.205112 containerd[1462]: 2026-04-24 23:36:20.184 [INFO][5280] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="aed60164462ff4aad3d1ccc2bb37b5c27a9225c8b42a7b6bae829862ba0d12bd" HandleID="k8s-pod-network.aed60164462ff4aad3d1ccc2bb37b5c27a9225c8b42a7b6bae829862ba0d12bd" Workload="ci--4081--3--6--n--0494d1f24d-k8s-calico--apiserver--ccf7f6874--fxzbx-eth0" Apr 24 23:36:20.205112 containerd[1462]: 2026-04-24 23:36:20.184 [INFO][5280] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 24 23:36:20.205112 containerd[1462]: 2026-04-24 23:36:20.185 [INFO][5280] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 24 23:36:20.205112 containerd[1462]: 2026-04-24 23:36:20.196 [WARNING][5280] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="aed60164462ff4aad3d1ccc2bb37b5c27a9225c8b42a7b6bae829862ba0d12bd" HandleID="k8s-pod-network.aed60164462ff4aad3d1ccc2bb37b5c27a9225c8b42a7b6bae829862ba0d12bd" Workload="ci--4081--3--6--n--0494d1f24d-k8s-calico--apiserver--ccf7f6874--fxzbx-eth0" Apr 24 23:36:20.205112 containerd[1462]: 2026-04-24 23:36:20.196 [INFO][5280] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="aed60164462ff4aad3d1ccc2bb37b5c27a9225c8b42a7b6bae829862ba0d12bd" HandleID="k8s-pod-network.aed60164462ff4aad3d1ccc2bb37b5c27a9225c8b42a7b6bae829862ba0d12bd" Workload="ci--4081--3--6--n--0494d1f24d-k8s-calico--apiserver--ccf7f6874--fxzbx-eth0" Apr 24 23:36:20.205112 containerd[1462]: 2026-04-24 23:36:20.199 [INFO][5280] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 24 23:36:20.205112 containerd[1462]: 2026-04-24 23:36:20.202 [INFO][5271] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="aed60164462ff4aad3d1ccc2bb37b5c27a9225c8b42a7b6bae829862ba0d12bd" Apr 24 23:36:20.205636 containerd[1462]: time="2026-04-24T23:36:20.205281751Z" level=info msg="TearDown network for sandbox \"aed60164462ff4aad3d1ccc2bb37b5c27a9225c8b42a7b6bae829862ba0d12bd\" successfully" Apr 24 23:36:20.205636 containerd[1462]: time="2026-04-24T23:36:20.205385801Z" level=info msg="StopPodSandbox for \"aed60164462ff4aad3d1ccc2bb37b5c27a9225c8b42a7b6bae829862ba0d12bd\" returns successfully" Apr 24 23:36:20.206631 containerd[1462]: time="2026-04-24T23:36:20.206421497Z" level=info msg="RemovePodSandbox for \"aed60164462ff4aad3d1ccc2bb37b5c27a9225c8b42a7b6bae829862ba0d12bd\"" Apr 24 23:36:20.221089 containerd[1462]: time="2026-04-24T23:36:20.221036016Z" level=info msg="Forcibly stopping sandbox \"aed60164462ff4aad3d1ccc2bb37b5c27a9225c8b42a7b6bae829862ba0d12bd\"" Apr 24 23:36:20.308910 containerd[1462]: 2026-04-24 23:36:20.267 [WARNING][5295] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="aed60164462ff4aad3d1ccc2bb37b5c27a9225c8b42a7b6bae829862ba0d12bd" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--0494d1f24d-k8s-calico--apiserver--ccf7f6874--fxzbx-eth0", GenerateName:"calico-apiserver-ccf7f6874-", Namespace:"calico-system", SelfLink:"", UID:"97bad9ab-01a6-4cd6-89bf-ca7dc9310125", ResourceVersion:"1008", Generation:0, CreationTimestamp:time.Date(2026, time.April, 24, 23, 35, 38, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"ccf7f6874", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-0494d1f24d", ContainerID:"dc00f0a26827cfe71c3f0ffffe67d49ca3cc816a023700ee6573bc9e96ca342a", Pod:"calico-apiserver-ccf7f6874-fxzbx", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.74.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali1595325ce6f", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 24 23:36:20.308910 containerd[1462]: 2026-04-24 23:36:20.269 [INFO][5295] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="aed60164462ff4aad3d1ccc2bb37b5c27a9225c8b42a7b6bae829862ba0d12bd" Apr 24 23:36:20.308910 containerd[1462]: 2026-04-24 23:36:20.270 [INFO][5295] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called 
with no netns name, ignoring. ContainerID="aed60164462ff4aad3d1ccc2bb37b5c27a9225c8b42a7b6bae829862ba0d12bd" iface="eth0" netns="" Apr 24 23:36:20.308910 containerd[1462]: 2026-04-24 23:36:20.270 [INFO][5295] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="aed60164462ff4aad3d1ccc2bb37b5c27a9225c8b42a7b6bae829862ba0d12bd" Apr 24 23:36:20.308910 containerd[1462]: 2026-04-24 23:36:20.270 [INFO][5295] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="aed60164462ff4aad3d1ccc2bb37b5c27a9225c8b42a7b6bae829862ba0d12bd" Apr 24 23:36:20.308910 containerd[1462]: 2026-04-24 23:36:20.291 [INFO][5302] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="aed60164462ff4aad3d1ccc2bb37b5c27a9225c8b42a7b6bae829862ba0d12bd" HandleID="k8s-pod-network.aed60164462ff4aad3d1ccc2bb37b5c27a9225c8b42a7b6bae829862ba0d12bd" Workload="ci--4081--3--6--n--0494d1f24d-k8s-calico--apiserver--ccf7f6874--fxzbx-eth0" Apr 24 23:36:20.308910 containerd[1462]: 2026-04-24 23:36:20.291 [INFO][5302] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 24 23:36:20.308910 containerd[1462]: 2026-04-24 23:36:20.291 [INFO][5302] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 24 23:36:20.308910 containerd[1462]: 2026-04-24 23:36:20.301 [WARNING][5302] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="aed60164462ff4aad3d1ccc2bb37b5c27a9225c8b42a7b6bae829862ba0d12bd" HandleID="k8s-pod-network.aed60164462ff4aad3d1ccc2bb37b5c27a9225c8b42a7b6bae829862ba0d12bd" Workload="ci--4081--3--6--n--0494d1f24d-k8s-calico--apiserver--ccf7f6874--fxzbx-eth0" Apr 24 23:36:20.308910 containerd[1462]: 2026-04-24 23:36:20.301 [INFO][5302] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="aed60164462ff4aad3d1ccc2bb37b5c27a9225c8b42a7b6bae829862ba0d12bd" HandleID="k8s-pod-network.aed60164462ff4aad3d1ccc2bb37b5c27a9225c8b42a7b6bae829862ba0d12bd" Workload="ci--4081--3--6--n--0494d1f24d-k8s-calico--apiserver--ccf7f6874--fxzbx-eth0" Apr 24 23:36:20.308910 containerd[1462]: 2026-04-24 23:36:20.304 [INFO][5302] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 24 23:36:20.308910 containerd[1462]: 2026-04-24 23:36:20.305 [INFO][5295] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="aed60164462ff4aad3d1ccc2bb37b5c27a9225c8b42a7b6bae829862ba0d12bd" Apr 24 23:36:20.309873 containerd[1462]: time="2026-04-24T23:36:20.308927390Z" level=info msg="TearDown network for sandbox \"aed60164462ff4aad3d1ccc2bb37b5c27a9225c8b42a7b6bae829862ba0d12bd\" successfully" Apr 24 23:36:20.316296 containerd[1462]: time="2026-04-24T23:36:20.316034691Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"aed60164462ff4aad3d1ccc2bb37b5c27a9225c8b42a7b6bae829862ba0d12bd\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Apr 24 23:36:20.316296 containerd[1462]: time="2026-04-24T23:36:20.316164663Z" level=info msg="RemovePodSandbox \"aed60164462ff4aad3d1ccc2bb37b5c27a9225c8b42a7b6bae829862ba0d12bd\" returns successfully" Apr 24 23:36:20.317453 containerd[1462]: time="2026-04-24T23:36:20.317415699Z" level=info msg="StopPodSandbox for \"1554c7889caa25fe939ea72e64cd14f57148d1aafeb01bde86f7a6e9f10b0b23\"" Apr 24 23:36:20.420391 containerd[1462]: 2026-04-24 23:36:20.364 [WARNING][5316] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="1554c7889caa25fe939ea72e64cd14f57148d1aafeb01bde86f7a6e9f10b0b23" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--0494d1f24d-k8s-calico--kube--controllers--54d98b476c--cxbz2-eth0", GenerateName:"calico-kube-controllers-54d98b476c-", Namespace:"calico-system", SelfLink:"", UID:"ad82956a-a9d5-482d-9feb-ead05df18123", ResourceVersion:"1045", Generation:0, CreationTimestamp:time.Date(2026, time.April, 24, 23, 35, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"54d98b476c", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-0494d1f24d", ContainerID:"11dc9d012984b00dedbdb8b8115c8a21435603cc32586e9bca4e7e6597ccc515", Pod:"calico-kube-controllers-54d98b476c-cxbz2", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.74.71/32"}, IPNATs:[]v3.IPNAT(nil), 
IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calif2c4b987dac", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 24 23:36:20.420391 containerd[1462]: 2026-04-24 23:36:20.364 [INFO][5316] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="1554c7889caa25fe939ea72e64cd14f57148d1aafeb01bde86f7a6e9f10b0b23" Apr 24 23:36:20.420391 containerd[1462]: 2026-04-24 23:36:20.364 [INFO][5316] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="1554c7889caa25fe939ea72e64cd14f57148d1aafeb01bde86f7a6e9f10b0b23" iface="eth0" netns="" Apr 24 23:36:20.420391 containerd[1462]: 2026-04-24 23:36:20.364 [INFO][5316] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="1554c7889caa25fe939ea72e64cd14f57148d1aafeb01bde86f7a6e9f10b0b23" Apr 24 23:36:20.420391 containerd[1462]: 2026-04-24 23:36:20.364 [INFO][5316] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="1554c7889caa25fe939ea72e64cd14f57148d1aafeb01bde86f7a6e9f10b0b23" Apr 24 23:36:20.420391 containerd[1462]: 2026-04-24 23:36:20.395 [INFO][5323] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="1554c7889caa25fe939ea72e64cd14f57148d1aafeb01bde86f7a6e9f10b0b23" HandleID="k8s-pod-network.1554c7889caa25fe939ea72e64cd14f57148d1aafeb01bde86f7a6e9f10b0b23" Workload="ci--4081--3--6--n--0494d1f24d-k8s-calico--kube--controllers--54d98b476c--cxbz2-eth0" Apr 24 23:36:20.420391 containerd[1462]: 2026-04-24 23:36:20.395 [INFO][5323] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 24 23:36:20.420391 containerd[1462]: 2026-04-24 23:36:20.395 [INFO][5323] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 24 23:36:20.420391 containerd[1462]: 2026-04-24 23:36:20.411 [WARNING][5323] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="1554c7889caa25fe939ea72e64cd14f57148d1aafeb01bde86f7a6e9f10b0b23" HandleID="k8s-pod-network.1554c7889caa25fe939ea72e64cd14f57148d1aafeb01bde86f7a6e9f10b0b23" Workload="ci--4081--3--6--n--0494d1f24d-k8s-calico--kube--controllers--54d98b476c--cxbz2-eth0" Apr 24 23:36:20.420391 containerd[1462]: 2026-04-24 23:36:20.411 [INFO][5323] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="1554c7889caa25fe939ea72e64cd14f57148d1aafeb01bde86f7a6e9f10b0b23" HandleID="k8s-pod-network.1554c7889caa25fe939ea72e64cd14f57148d1aafeb01bde86f7a6e9f10b0b23" Workload="ci--4081--3--6--n--0494d1f24d-k8s-calico--kube--controllers--54d98b476c--cxbz2-eth0" Apr 24 23:36:20.420391 containerd[1462]: 2026-04-24 23:36:20.414 [INFO][5323] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 24 23:36:20.420391 containerd[1462]: 2026-04-24 23:36:20.417 [INFO][5316] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="1554c7889caa25fe939ea72e64cd14f57148d1aafeb01bde86f7a6e9f10b0b23" Apr 24 23:36:20.422298 containerd[1462]: time="2026-04-24T23:36:20.420747828Z" level=info msg="TearDown network for sandbox \"1554c7889caa25fe939ea72e64cd14f57148d1aafeb01bde86f7a6e9f10b0b23\" successfully" Apr 24 23:36:20.422298 containerd[1462]: time="2026-04-24T23:36:20.420786912Z" level=info msg="StopPodSandbox for \"1554c7889caa25fe939ea72e64cd14f57148d1aafeb01bde86f7a6e9f10b0b23\" returns successfully" Apr 24 23:36:20.422939 containerd[1462]: time="2026-04-24T23:36:20.422526474Z" level=info msg="RemovePodSandbox for \"1554c7889caa25fe939ea72e64cd14f57148d1aafeb01bde86f7a6e9f10b0b23\"" Apr 24 23:36:20.422939 containerd[1462]: time="2026-04-24T23:36:20.422566197Z" level=info msg="Forcibly stopping sandbox \"1554c7889caa25fe939ea72e64cd14f57148d1aafeb01bde86f7a6e9f10b0b23\"" Apr 24 23:36:20.516914 containerd[1462]: 2026-04-24 23:36:20.472 [WARNING][5339] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="1554c7889caa25fe939ea72e64cd14f57148d1aafeb01bde86f7a6e9f10b0b23" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--0494d1f24d-k8s-calico--kube--controllers--54d98b476c--cxbz2-eth0", GenerateName:"calico-kube-controllers-54d98b476c-", Namespace:"calico-system", SelfLink:"", UID:"ad82956a-a9d5-482d-9feb-ead05df18123", ResourceVersion:"1045", Generation:0, CreationTimestamp:time.Date(2026, time.April, 24, 23, 35, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"54d98b476c", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-0494d1f24d", ContainerID:"11dc9d012984b00dedbdb8b8115c8a21435603cc32586e9bca4e7e6597ccc515", Pod:"calico-kube-controllers-54d98b476c-cxbz2", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.74.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calif2c4b987dac", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 24 23:36:20.516914 containerd[1462]: 2026-04-24 23:36:20.473 [INFO][5339] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="1554c7889caa25fe939ea72e64cd14f57148d1aafeb01bde86f7a6e9f10b0b23" Apr 24 23:36:20.516914 containerd[1462]: 2026-04-24 23:36:20.473 [INFO][5339] 
cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="1554c7889caa25fe939ea72e64cd14f57148d1aafeb01bde86f7a6e9f10b0b23" iface="eth0" netns="" Apr 24 23:36:20.516914 containerd[1462]: 2026-04-24 23:36:20.473 [INFO][5339] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="1554c7889caa25fe939ea72e64cd14f57148d1aafeb01bde86f7a6e9f10b0b23" Apr 24 23:36:20.516914 containerd[1462]: 2026-04-24 23:36:20.473 [INFO][5339] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="1554c7889caa25fe939ea72e64cd14f57148d1aafeb01bde86f7a6e9f10b0b23" Apr 24 23:36:20.516914 containerd[1462]: 2026-04-24 23:36:20.498 [INFO][5346] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="1554c7889caa25fe939ea72e64cd14f57148d1aafeb01bde86f7a6e9f10b0b23" HandleID="k8s-pod-network.1554c7889caa25fe939ea72e64cd14f57148d1aafeb01bde86f7a6e9f10b0b23" Workload="ci--4081--3--6--n--0494d1f24d-k8s-calico--kube--controllers--54d98b476c--cxbz2-eth0" Apr 24 23:36:20.516914 containerd[1462]: 2026-04-24 23:36:20.498 [INFO][5346] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 24 23:36:20.516914 containerd[1462]: 2026-04-24 23:36:20.498 [INFO][5346] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 24 23:36:20.516914 containerd[1462]: 2026-04-24 23:36:20.509 [WARNING][5346] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="1554c7889caa25fe939ea72e64cd14f57148d1aafeb01bde86f7a6e9f10b0b23" HandleID="k8s-pod-network.1554c7889caa25fe939ea72e64cd14f57148d1aafeb01bde86f7a6e9f10b0b23" Workload="ci--4081--3--6--n--0494d1f24d-k8s-calico--kube--controllers--54d98b476c--cxbz2-eth0" Apr 24 23:36:20.516914 containerd[1462]: 2026-04-24 23:36:20.509 [INFO][5346] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="1554c7889caa25fe939ea72e64cd14f57148d1aafeb01bde86f7a6e9f10b0b23" HandleID="k8s-pod-network.1554c7889caa25fe939ea72e64cd14f57148d1aafeb01bde86f7a6e9f10b0b23" Workload="ci--4081--3--6--n--0494d1f24d-k8s-calico--kube--controllers--54d98b476c--cxbz2-eth0" Apr 24 23:36:20.516914 containerd[1462]: 2026-04-24 23:36:20.511 [INFO][5346] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 24 23:36:20.516914 containerd[1462]: 2026-04-24 23:36:20.514 [INFO][5339] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="1554c7889caa25fe939ea72e64cd14f57148d1aafeb01bde86f7a6e9f10b0b23" Apr 24 23:36:20.517669 containerd[1462]: time="2026-04-24T23:36:20.517164035Z" level=info msg="TearDown network for sandbox \"1554c7889caa25fe939ea72e64cd14f57148d1aafeb01bde86f7a6e9f10b0b23\" successfully" Apr 24 23:36:20.523036 containerd[1462]: time="2026-04-24T23:36:20.522851363Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"1554c7889caa25fe939ea72e64cd14f57148d1aafeb01bde86f7a6e9f10b0b23\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Apr 24 23:36:20.523036 containerd[1462]: time="2026-04-24T23:36:20.522930771Z" level=info msg="RemovePodSandbox \"1554c7889caa25fe939ea72e64cd14f57148d1aafeb01bde86f7a6e9f10b0b23\" returns successfully" Apr 24 23:36:20.524000 containerd[1462]: time="2026-04-24T23:36:20.523671840Z" level=info msg="StopPodSandbox for \"7b47374f6436f5ea771b90d2ddfeb211e720a1d905e1bc8f12ca59bf7d9d97e7\"" Apr 24 23:36:20.612107 containerd[1462]: 2026-04-24 23:36:20.571 [WARNING][5360] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="7b47374f6436f5ea771b90d2ddfeb211e720a1d905e1bc8f12ca59bf7d9d97e7" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--0494d1f24d-k8s-coredns--7d764666f9--l2x8h-eth0", GenerateName:"coredns-7d764666f9-", Namespace:"kube-system", SelfLink:"", UID:"7e35c05e-9f62-47d8-9c32-7f4c38496c65", ResourceVersion:"983", Generation:0, CreationTimestamp:time.Date(2026, time.April, 24, 23, 35, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7d764666f9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-0494d1f24d", ContainerID:"10e07a6b8d97ae98a8ef90a06e08705fb6c23025017443564e35296c6fe378f5", Pod:"coredns-7d764666f9-l2x8h", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.74.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calid82795e9477", MAC:"", 
Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 24 23:36:20.612107 containerd[1462]: 2026-04-24 23:36:20.572 [INFO][5360] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="7b47374f6436f5ea771b90d2ddfeb211e720a1d905e1bc8f12ca59bf7d9d97e7" Apr 24 23:36:20.612107 containerd[1462]: 2026-04-24 23:36:20.572 [INFO][5360] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="7b47374f6436f5ea771b90d2ddfeb211e720a1d905e1bc8f12ca59bf7d9d97e7" iface="eth0" netns="" Apr 24 23:36:20.612107 containerd[1462]: 2026-04-24 23:36:20.572 [INFO][5360] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="7b47374f6436f5ea771b90d2ddfeb211e720a1d905e1bc8f12ca59bf7d9d97e7" Apr 24 23:36:20.612107 containerd[1462]: 2026-04-24 23:36:20.572 [INFO][5360] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="7b47374f6436f5ea771b90d2ddfeb211e720a1d905e1bc8f12ca59bf7d9d97e7" Apr 24 23:36:20.612107 containerd[1462]: 2026-04-24 23:36:20.593 [INFO][5367] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="7b47374f6436f5ea771b90d2ddfeb211e720a1d905e1bc8f12ca59bf7d9d97e7" HandleID="k8s-pod-network.7b47374f6436f5ea771b90d2ddfeb211e720a1d905e1bc8f12ca59bf7d9d97e7" Workload="ci--4081--3--6--n--0494d1f24d-k8s-coredns--7d764666f9--l2x8h-eth0" Apr 24 23:36:20.612107 containerd[1462]: 2026-04-24 23:36:20.593 [INFO][5367] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 24 23:36:20.612107 containerd[1462]: 2026-04-24 23:36:20.593 [INFO][5367] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 24 23:36:20.612107 containerd[1462]: 2026-04-24 23:36:20.605 [WARNING][5367] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="7b47374f6436f5ea771b90d2ddfeb211e720a1d905e1bc8f12ca59bf7d9d97e7" HandleID="k8s-pod-network.7b47374f6436f5ea771b90d2ddfeb211e720a1d905e1bc8f12ca59bf7d9d97e7" Workload="ci--4081--3--6--n--0494d1f24d-k8s-coredns--7d764666f9--l2x8h-eth0" Apr 24 23:36:20.612107 containerd[1462]: 2026-04-24 23:36:20.605 [INFO][5367] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="7b47374f6436f5ea771b90d2ddfeb211e720a1d905e1bc8f12ca59bf7d9d97e7" HandleID="k8s-pod-network.7b47374f6436f5ea771b90d2ddfeb211e720a1d905e1bc8f12ca59bf7d9d97e7" Workload="ci--4081--3--6--n--0494d1f24d-k8s-coredns--7d764666f9--l2x8h-eth0" Apr 24 23:36:20.612107 containerd[1462]: 2026-04-24 23:36:20.608 [INFO][5367] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 24 23:36:20.612107 containerd[1462]: 2026-04-24 23:36:20.610 [INFO][5360] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="7b47374f6436f5ea771b90d2ddfeb211e720a1d905e1bc8f12ca59bf7d9d97e7" Apr 24 23:36:20.612107 containerd[1462]: time="2026-04-24T23:36:20.612161349Z" level=info msg="TearDown network for sandbox \"7b47374f6436f5ea771b90d2ddfeb211e720a1d905e1bc8f12ca59bf7d9d97e7\" successfully" Apr 24 23:36:20.612995 containerd[1462]: time="2026-04-24T23:36:20.612188871Z" level=info msg="StopPodSandbox for \"7b47374f6436f5ea771b90d2ddfeb211e720a1d905e1bc8f12ca59bf7d9d97e7\" returns successfully" Apr 24 23:36:20.612995 containerd[1462]: time="2026-04-24T23:36:20.612672276Z" level=info msg="RemovePodSandbox for \"7b47374f6436f5ea771b90d2ddfeb211e720a1d905e1bc8f12ca59bf7d9d97e7\"" Apr 24 23:36:20.612995 containerd[1462]: time="2026-04-24T23:36:20.612703719Z" level=info msg="Forcibly stopping sandbox \"7b47374f6436f5ea771b90d2ddfeb211e720a1d905e1bc8f12ca59bf7d9d97e7\"" Apr 24 23:36:20.710340 containerd[1462]: 2026-04-24 23:36:20.664 [WARNING][5382] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="7b47374f6436f5ea771b90d2ddfeb211e720a1d905e1bc8f12ca59bf7d9d97e7" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--0494d1f24d-k8s-coredns--7d764666f9--l2x8h-eth0", GenerateName:"coredns-7d764666f9-", Namespace:"kube-system", SelfLink:"", UID:"7e35c05e-9f62-47d8-9c32-7f4c38496c65", ResourceVersion:"983", Generation:0, CreationTimestamp:time.Date(2026, time.April, 24, 23, 35, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7d764666f9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-0494d1f24d", ContainerID:"10e07a6b8d97ae98a8ef90a06e08705fb6c23025017443564e35296c6fe378f5", Pod:"coredns-7d764666f9-l2x8h", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.74.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calid82795e9477", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, 
HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 24 23:36:20.710340 containerd[1462]: 2026-04-24 23:36:20.665 [INFO][5382] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="7b47374f6436f5ea771b90d2ddfeb211e720a1d905e1bc8f12ca59bf7d9d97e7" Apr 24 23:36:20.710340 containerd[1462]: 2026-04-24 23:36:20.665 [INFO][5382] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="7b47374f6436f5ea771b90d2ddfeb211e720a1d905e1bc8f12ca59bf7d9d97e7" iface="eth0" netns="" Apr 24 23:36:20.710340 containerd[1462]: 2026-04-24 23:36:20.665 [INFO][5382] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="7b47374f6436f5ea771b90d2ddfeb211e720a1d905e1bc8f12ca59bf7d9d97e7" Apr 24 23:36:20.710340 containerd[1462]: 2026-04-24 23:36:20.665 [INFO][5382] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="7b47374f6436f5ea771b90d2ddfeb211e720a1d905e1bc8f12ca59bf7d9d97e7" Apr 24 23:36:20.710340 containerd[1462]: 2026-04-24 23:36:20.686 [INFO][5389] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="7b47374f6436f5ea771b90d2ddfeb211e720a1d905e1bc8f12ca59bf7d9d97e7" HandleID="k8s-pod-network.7b47374f6436f5ea771b90d2ddfeb211e720a1d905e1bc8f12ca59bf7d9d97e7" Workload="ci--4081--3--6--n--0494d1f24d-k8s-coredns--7d764666f9--l2x8h-eth0" Apr 24 23:36:20.710340 containerd[1462]: 2026-04-24 23:36:20.686 [INFO][5389] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 24 23:36:20.710340 containerd[1462]: 2026-04-24 23:36:20.686 [INFO][5389] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 24 23:36:20.710340 containerd[1462]: 2026-04-24 23:36:20.700 [WARNING][5389] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="7b47374f6436f5ea771b90d2ddfeb211e720a1d905e1bc8f12ca59bf7d9d97e7" HandleID="k8s-pod-network.7b47374f6436f5ea771b90d2ddfeb211e720a1d905e1bc8f12ca59bf7d9d97e7" Workload="ci--4081--3--6--n--0494d1f24d-k8s-coredns--7d764666f9--l2x8h-eth0" Apr 24 23:36:20.710340 containerd[1462]: 2026-04-24 23:36:20.700 [INFO][5389] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="7b47374f6436f5ea771b90d2ddfeb211e720a1d905e1bc8f12ca59bf7d9d97e7" HandleID="k8s-pod-network.7b47374f6436f5ea771b90d2ddfeb211e720a1d905e1bc8f12ca59bf7d9d97e7" Workload="ci--4081--3--6--n--0494d1f24d-k8s-coredns--7d764666f9--l2x8h-eth0" Apr 24 23:36:20.710340 containerd[1462]: 2026-04-24 23:36:20.703 [INFO][5389] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 24 23:36:20.710340 containerd[1462]: 2026-04-24 23:36:20.706 [INFO][5382] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="7b47374f6436f5ea771b90d2ddfeb211e720a1d905e1bc8f12ca59bf7d9d97e7" Apr 24 23:36:20.710340 containerd[1462]: time="2026-04-24T23:36:20.710289194Z" level=info msg="TearDown network for sandbox \"7b47374f6436f5ea771b90d2ddfeb211e720a1d905e1bc8f12ca59bf7d9d97e7\" successfully" Apr 24 23:36:20.715429 containerd[1462]: time="2026-04-24T23:36:20.715364986Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"7b47374f6436f5ea771b90d2ddfeb211e720a1d905e1bc8f12ca59bf7d9d97e7\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Apr 24 23:36:20.715958 containerd[1462]: time="2026-04-24T23:36:20.715453994Z" level=info msg="RemovePodSandbox \"7b47374f6436f5ea771b90d2ddfeb211e720a1d905e1bc8f12ca59bf7d9d97e7\" returns successfully" Apr 24 23:36:20.716525 containerd[1462]: time="2026-04-24T23:36:20.716297113Z" level=info msg="StopPodSandbox for \"9b6dd7052ebeffaaba1376421670ace1fee06ecc6705ec848ee5684f1ca0855d\"" Apr 24 23:36:20.806622 containerd[1462]: 2026-04-24 23:36:20.762 [WARNING][5403] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="9b6dd7052ebeffaaba1376421670ace1fee06ecc6705ec848ee5684f1ca0855d" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--0494d1f24d-k8s-goldmane--9f7667bb8--9xhf6-eth0", GenerateName:"goldmane-9f7667bb8-", Namespace:"calico-system", SelfLink:"", UID:"a67a5a23-28c6-44c6-8e44-b199e7391183", ResourceVersion:"1029", Generation:0, CreationTimestamp:time.Date(2026, time.April, 24, 23, 35, 38, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"9f7667bb8", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-0494d1f24d", ContainerID:"9936c272c3f0e2fad8c377b8268b236c67137f997d8c8fdaddfc948873f50a18", Pod:"goldmane-9f7667bb8-9xhf6", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.74.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, 
InterfaceName:"cali8d0e349e5bd", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 24 23:36:20.806622 containerd[1462]: 2026-04-24 23:36:20.763 [INFO][5403] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="9b6dd7052ebeffaaba1376421670ace1fee06ecc6705ec848ee5684f1ca0855d" Apr 24 23:36:20.806622 containerd[1462]: 2026-04-24 23:36:20.763 [INFO][5403] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="9b6dd7052ebeffaaba1376421670ace1fee06ecc6705ec848ee5684f1ca0855d" iface="eth0" netns="" Apr 24 23:36:20.806622 containerd[1462]: 2026-04-24 23:36:20.763 [INFO][5403] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="9b6dd7052ebeffaaba1376421670ace1fee06ecc6705ec848ee5684f1ca0855d" Apr 24 23:36:20.806622 containerd[1462]: 2026-04-24 23:36:20.763 [INFO][5403] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="9b6dd7052ebeffaaba1376421670ace1fee06ecc6705ec848ee5684f1ca0855d" Apr 24 23:36:20.806622 containerd[1462]: 2026-04-24 23:36:20.785 [INFO][5410] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="9b6dd7052ebeffaaba1376421670ace1fee06ecc6705ec848ee5684f1ca0855d" HandleID="k8s-pod-network.9b6dd7052ebeffaaba1376421670ace1fee06ecc6705ec848ee5684f1ca0855d" Workload="ci--4081--3--6--n--0494d1f24d-k8s-goldmane--9f7667bb8--9xhf6-eth0" Apr 24 23:36:20.806622 containerd[1462]: 2026-04-24 23:36:20.785 [INFO][5410] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 24 23:36:20.806622 containerd[1462]: 2026-04-24 23:36:20.786 [INFO][5410] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 24 23:36:20.806622 containerd[1462]: 2026-04-24 23:36:20.799 [WARNING][5410] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="9b6dd7052ebeffaaba1376421670ace1fee06ecc6705ec848ee5684f1ca0855d" HandleID="k8s-pod-network.9b6dd7052ebeffaaba1376421670ace1fee06ecc6705ec848ee5684f1ca0855d" Workload="ci--4081--3--6--n--0494d1f24d-k8s-goldmane--9f7667bb8--9xhf6-eth0" Apr 24 23:36:20.806622 containerd[1462]: 2026-04-24 23:36:20.799 [INFO][5410] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="9b6dd7052ebeffaaba1376421670ace1fee06ecc6705ec848ee5684f1ca0855d" HandleID="k8s-pod-network.9b6dd7052ebeffaaba1376421670ace1fee06ecc6705ec848ee5684f1ca0855d" Workload="ci--4081--3--6--n--0494d1f24d-k8s-goldmane--9f7667bb8--9xhf6-eth0" Apr 24 23:36:20.806622 containerd[1462]: 2026-04-24 23:36:20.802 [INFO][5410] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 24 23:36:20.806622 containerd[1462]: 2026-04-24 23:36:20.804 [INFO][5403] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="9b6dd7052ebeffaaba1376421670ace1fee06ecc6705ec848ee5684f1ca0855d" Apr 24 23:36:20.807425 containerd[1462]: time="2026-04-24T23:36:20.806675677Z" level=info msg="TearDown network for sandbox \"9b6dd7052ebeffaaba1376421670ace1fee06ecc6705ec848ee5684f1ca0855d\" successfully" Apr 24 23:36:20.807425 containerd[1462]: time="2026-04-24T23:36:20.806702000Z" level=info msg="StopPodSandbox for \"9b6dd7052ebeffaaba1376421670ace1fee06ecc6705ec848ee5684f1ca0855d\" returns successfully" Apr 24 23:36:20.807425 containerd[1462]: time="2026-04-24T23:36:20.807333859Z" level=info msg="RemovePodSandbox for \"9b6dd7052ebeffaaba1376421670ace1fee06ecc6705ec848ee5684f1ca0855d\"" Apr 24 23:36:20.807500 containerd[1462]: time="2026-04-24T23:36:20.807364501Z" level=info msg="Forcibly stopping sandbox \"9b6dd7052ebeffaaba1376421670ace1fee06ecc6705ec848ee5684f1ca0855d\"" Apr 24 23:36:20.904023 containerd[1462]: 2026-04-24 23:36:20.855 [WARNING][5424] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="9b6dd7052ebeffaaba1376421670ace1fee06ecc6705ec848ee5684f1ca0855d" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--0494d1f24d-k8s-goldmane--9f7667bb8--9xhf6-eth0", GenerateName:"goldmane-9f7667bb8-", Namespace:"calico-system", SelfLink:"", UID:"a67a5a23-28c6-44c6-8e44-b199e7391183", ResourceVersion:"1029", Generation:0, CreationTimestamp:time.Date(2026, time.April, 24, 23, 35, 38, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"9f7667bb8", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-0494d1f24d", ContainerID:"9936c272c3f0e2fad8c377b8268b236c67137f997d8c8fdaddfc948873f50a18", Pod:"goldmane-9f7667bb8-9xhf6", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.74.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali8d0e349e5bd", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 24 23:36:20.904023 containerd[1462]: 2026-04-24 23:36:20.855 [INFO][5424] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="9b6dd7052ebeffaaba1376421670ace1fee06ecc6705ec848ee5684f1ca0855d" Apr 24 23:36:20.904023 containerd[1462]: 2026-04-24 23:36:20.855 [INFO][5424] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="9b6dd7052ebeffaaba1376421670ace1fee06ecc6705ec848ee5684f1ca0855d" iface="eth0" netns="" Apr 24 23:36:20.904023 containerd[1462]: 2026-04-24 23:36:20.855 [INFO][5424] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="9b6dd7052ebeffaaba1376421670ace1fee06ecc6705ec848ee5684f1ca0855d" Apr 24 23:36:20.904023 containerd[1462]: 2026-04-24 23:36:20.855 [INFO][5424] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="9b6dd7052ebeffaaba1376421670ace1fee06ecc6705ec848ee5684f1ca0855d" Apr 24 23:36:20.904023 containerd[1462]: 2026-04-24 23:36:20.879 [INFO][5431] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="9b6dd7052ebeffaaba1376421670ace1fee06ecc6705ec848ee5684f1ca0855d" HandleID="k8s-pod-network.9b6dd7052ebeffaaba1376421670ace1fee06ecc6705ec848ee5684f1ca0855d" Workload="ci--4081--3--6--n--0494d1f24d-k8s-goldmane--9f7667bb8--9xhf6-eth0" Apr 24 23:36:20.904023 containerd[1462]: 2026-04-24 23:36:20.879 [INFO][5431] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 24 23:36:20.904023 containerd[1462]: 2026-04-24 23:36:20.880 [INFO][5431] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 24 23:36:20.904023 containerd[1462]: 2026-04-24 23:36:20.897 [WARNING][5431] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="9b6dd7052ebeffaaba1376421670ace1fee06ecc6705ec848ee5684f1ca0855d" HandleID="k8s-pod-network.9b6dd7052ebeffaaba1376421670ace1fee06ecc6705ec848ee5684f1ca0855d" Workload="ci--4081--3--6--n--0494d1f24d-k8s-goldmane--9f7667bb8--9xhf6-eth0" Apr 24 23:36:20.904023 containerd[1462]: 2026-04-24 23:36:20.897 [INFO][5431] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="9b6dd7052ebeffaaba1376421670ace1fee06ecc6705ec848ee5684f1ca0855d" HandleID="k8s-pod-network.9b6dd7052ebeffaaba1376421670ace1fee06ecc6705ec848ee5684f1ca0855d" Workload="ci--4081--3--6--n--0494d1f24d-k8s-goldmane--9f7667bb8--9xhf6-eth0" Apr 24 23:36:20.904023 containerd[1462]: 2026-04-24 23:36:20.899 [INFO][5431] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 24 23:36:20.904023 containerd[1462]: 2026-04-24 23:36:20.901 [INFO][5424] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="9b6dd7052ebeffaaba1376421670ace1fee06ecc6705ec848ee5684f1ca0855d" Apr 24 23:36:20.904619 containerd[1462]: time="2026-04-24T23:36:20.904054813Z" level=info msg="TearDown network for sandbox \"9b6dd7052ebeffaaba1376421670ace1fee06ecc6705ec848ee5684f1ca0855d\" successfully" Apr 24 23:36:20.909388 containerd[1462]: time="2026-04-24T23:36:20.909304781Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"9b6dd7052ebeffaaba1376421670ace1fee06ecc6705ec848ee5684f1ca0855d\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Apr 24 23:36:20.909533 containerd[1462]: time="2026-04-24T23:36:20.909437514Z" level=info msg="RemovePodSandbox \"9b6dd7052ebeffaaba1376421670ace1fee06ecc6705ec848ee5684f1ca0855d\" returns successfully" Apr 24 23:36:20.910258 containerd[1462]: time="2026-04-24T23:36:20.910000486Z" level=info msg="StopPodSandbox for \"de9656bfe97f83441058c328d0d39fed1ee166d4fbc015e3ce2c151595256b72\"" Apr 24 23:36:20.994498 containerd[1462]: 2026-04-24 23:36:20.952 [WARNING][5445] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="de9656bfe97f83441058c328d0d39fed1ee166d4fbc015e3ce2c151595256b72" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--0494d1f24d-k8s-coredns--7d764666f9--lnj9t-eth0", GenerateName:"coredns-7d764666f9-", Namespace:"kube-system", SelfLink:"", UID:"54520bb9-f644-497d-8cfe-b9f4367c3def", ResourceVersion:"979", Generation:0, CreationTimestamp:time.Date(2026, time.April, 24, 23, 35, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7d764666f9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-0494d1f24d", ContainerID:"4b9b595289a74ac5990c9aa1faf29292f26dad2500dad23cf82351cf00d92041", Pod:"coredns-7d764666f9-lnj9t", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.74.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali3bb547fd16f", MAC:"", 
Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 24 23:36:20.994498 containerd[1462]: 2026-04-24 23:36:20.952 [INFO][5445] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="de9656bfe97f83441058c328d0d39fed1ee166d4fbc015e3ce2c151595256b72" Apr 24 23:36:20.994498 containerd[1462]: 2026-04-24 23:36:20.952 [INFO][5445] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="de9656bfe97f83441058c328d0d39fed1ee166d4fbc015e3ce2c151595256b72" iface="eth0" netns="" Apr 24 23:36:20.994498 containerd[1462]: 2026-04-24 23:36:20.952 [INFO][5445] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="de9656bfe97f83441058c328d0d39fed1ee166d4fbc015e3ce2c151595256b72" Apr 24 23:36:20.994498 containerd[1462]: 2026-04-24 23:36:20.952 [INFO][5445] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="de9656bfe97f83441058c328d0d39fed1ee166d4fbc015e3ce2c151595256b72" Apr 24 23:36:20.994498 containerd[1462]: 2026-04-24 23:36:20.976 [INFO][5452] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="de9656bfe97f83441058c328d0d39fed1ee166d4fbc015e3ce2c151595256b72" HandleID="k8s-pod-network.de9656bfe97f83441058c328d0d39fed1ee166d4fbc015e3ce2c151595256b72" Workload="ci--4081--3--6--n--0494d1f24d-k8s-coredns--7d764666f9--lnj9t-eth0" Apr 24 23:36:20.994498 containerd[1462]: 2026-04-24 23:36:20.976 [INFO][5452] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 24 23:36:20.994498 containerd[1462]: 2026-04-24 23:36:20.976 [INFO][5452] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 24 23:36:20.994498 containerd[1462]: 2026-04-24 23:36:20.987 [WARNING][5452] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="de9656bfe97f83441058c328d0d39fed1ee166d4fbc015e3ce2c151595256b72" HandleID="k8s-pod-network.de9656bfe97f83441058c328d0d39fed1ee166d4fbc015e3ce2c151595256b72" Workload="ci--4081--3--6--n--0494d1f24d-k8s-coredns--7d764666f9--lnj9t-eth0" Apr 24 23:36:20.994498 containerd[1462]: 2026-04-24 23:36:20.987 [INFO][5452] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="de9656bfe97f83441058c328d0d39fed1ee166d4fbc015e3ce2c151595256b72" HandleID="k8s-pod-network.de9656bfe97f83441058c328d0d39fed1ee166d4fbc015e3ce2c151595256b72" Workload="ci--4081--3--6--n--0494d1f24d-k8s-coredns--7d764666f9--lnj9t-eth0" Apr 24 23:36:20.994498 containerd[1462]: 2026-04-24 23:36:20.989 [INFO][5452] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 24 23:36:20.994498 containerd[1462]: 2026-04-24 23:36:20.991 [INFO][5445] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="de9656bfe97f83441058c328d0d39fed1ee166d4fbc015e3ce2c151595256b72" Apr 24 23:36:20.994498 containerd[1462]: time="2026-04-24T23:36:20.994108668Z" level=info msg="TearDown network for sandbox \"de9656bfe97f83441058c328d0d39fed1ee166d4fbc015e3ce2c151595256b72\" successfully" Apr 24 23:36:20.994498 containerd[1462]: time="2026-04-24T23:36:20.994170113Z" level=info msg="StopPodSandbox for \"de9656bfe97f83441058c328d0d39fed1ee166d4fbc015e3ce2c151595256b72\" returns successfully" Apr 24 23:36:20.996963 containerd[1462]: time="2026-04-24T23:36:20.994769209Z" level=info msg="RemovePodSandbox for \"de9656bfe97f83441058c328d0d39fed1ee166d4fbc015e3ce2c151595256b72\"" Apr 24 23:36:20.996963 containerd[1462]: time="2026-04-24T23:36:20.994814573Z" level=info msg="Forcibly stopping sandbox \"de9656bfe97f83441058c328d0d39fed1ee166d4fbc015e3ce2c151595256b72\"" Apr 24 23:36:21.089819 containerd[1462]: 2026-04-24 23:36:21.042 [WARNING][5468] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="de9656bfe97f83441058c328d0d39fed1ee166d4fbc015e3ce2c151595256b72" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--0494d1f24d-k8s-coredns--7d764666f9--lnj9t-eth0", GenerateName:"coredns-7d764666f9-", Namespace:"kube-system", SelfLink:"", UID:"54520bb9-f644-497d-8cfe-b9f4367c3def", ResourceVersion:"979", Generation:0, CreationTimestamp:time.Date(2026, time.April, 24, 23, 35, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7d764666f9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-0494d1f24d", ContainerID:"4b9b595289a74ac5990c9aa1faf29292f26dad2500dad23cf82351cf00d92041", Pod:"coredns-7d764666f9-lnj9t", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.74.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali3bb547fd16f", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, 
HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 24 23:36:21.089819 containerd[1462]: 2026-04-24 23:36:21.044 [INFO][5468] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="de9656bfe97f83441058c328d0d39fed1ee166d4fbc015e3ce2c151595256b72" Apr 24 23:36:21.089819 containerd[1462]: 2026-04-24 23:36:21.044 [INFO][5468] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="de9656bfe97f83441058c328d0d39fed1ee166d4fbc015e3ce2c151595256b72" iface="eth0" netns="" Apr 24 23:36:21.089819 containerd[1462]: 2026-04-24 23:36:21.044 [INFO][5468] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="de9656bfe97f83441058c328d0d39fed1ee166d4fbc015e3ce2c151595256b72" Apr 24 23:36:21.089819 containerd[1462]: 2026-04-24 23:36:21.044 [INFO][5468] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="de9656bfe97f83441058c328d0d39fed1ee166d4fbc015e3ce2c151595256b72" Apr 24 23:36:21.089819 containerd[1462]: 2026-04-24 23:36:21.068 [INFO][5475] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="de9656bfe97f83441058c328d0d39fed1ee166d4fbc015e3ce2c151595256b72" HandleID="k8s-pod-network.de9656bfe97f83441058c328d0d39fed1ee166d4fbc015e3ce2c151595256b72" Workload="ci--4081--3--6--n--0494d1f24d-k8s-coredns--7d764666f9--lnj9t-eth0" Apr 24 23:36:21.089819 containerd[1462]: 2026-04-24 23:36:21.069 [INFO][5475] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 24 23:36:21.089819 containerd[1462]: 2026-04-24 23:36:21.069 [INFO][5475] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 24 23:36:21.089819 containerd[1462]: 2026-04-24 23:36:21.082 [WARNING][5475] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="de9656bfe97f83441058c328d0d39fed1ee166d4fbc015e3ce2c151595256b72" HandleID="k8s-pod-network.de9656bfe97f83441058c328d0d39fed1ee166d4fbc015e3ce2c151595256b72" Workload="ci--4081--3--6--n--0494d1f24d-k8s-coredns--7d764666f9--lnj9t-eth0" Apr 24 23:36:21.089819 containerd[1462]: 2026-04-24 23:36:21.082 [INFO][5475] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="de9656bfe97f83441058c328d0d39fed1ee166d4fbc015e3ce2c151595256b72" HandleID="k8s-pod-network.de9656bfe97f83441058c328d0d39fed1ee166d4fbc015e3ce2c151595256b72" Workload="ci--4081--3--6--n--0494d1f24d-k8s-coredns--7d764666f9--lnj9t-eth0" Apr 24 23:36:21.089819 containerd[1462]: 2026-04-24 23:36:21.085 [INFO][5475] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 24 23:36:21.089819 containerd[1462]: 2026-04-24 23:36:21.088 [INFO][5468] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="de9656bfe97f83441058c328d0d39fed1ee166d4fbc015e3ce2c151595256b72" Apr 24 23:36:21.090456 containerd[1462]: time="2026-04-24T23:36:21.089840067Z" level=info msg="TearDown network for sandbox \"de9656bfe97f83441058c328d0d39fed1ee166d4fbc015e3ce2c151595256b72\" successfully" Apr 24 23:36:21.093644 containerd[1462]: time="2026-04-24T23:36:21.093582331Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"de9656bfe97f83441058c328d0d39fed1ee166d4fbc015e3ce2c151595256b72\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Apr 24 23:36:21.093809 containerd[1462]: time="2026-04-24T23:36:21.093739265Z" level=info msg="RemovePodSandbox \"de9656bfe97f83441058c328d0d39fed1ee166d4fbc015e3ce2c151595256b72\" returns successfully" Apr 24 23:36:21.094254 containerd[1462]: time="2026-04-24T23:36:21.094235951Z" level=info msg="StopPodSandbox for \"b6969e3808d65953a8a46a1c82f16b6fe7a046267abad766b3fa13d9d36f41ef\"" Apr 24 23:36:21.190285 containerd[1462]: 2026-04-24 23:36:21.142 [WARNING][5489] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="b6969e3808d65953a8a46a1c82f16b6fe7a046267abad766b3fa13d9d36f41ef" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--0494d1f24d-k8s-calico--apiserver--ccf7f6874--6drmn-eth0", GenerateName:"calico-apiserver-ccf7f6874-", Namespace:"calico-system", SelfLink:"", UID:"6dca57dd-795d-4761-9e86-5a637e03b75c", ResourceVersion:"1014", Generation:0, CreationTimestamp:time.Date(2026, time.April, 24, 23, 35, 38, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"ccf7f6874", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-0494d1f24d", ContainerID:"eac367f59f2b92f92f741540bd5453ffad23e2c97f71da941b239662f1b93292", Pod:"calico-apiserver-ccf7f6874-6drmn", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.74.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", 
Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali104746d9e98", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 24 23:36:21.190285 containerd[1462]: 2026-04-24 23:36:21.142 [INFO][5489] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="b6969e3808d65953a8a46a1c82f16b6fe7a046267abad766b3fa13d9d36f41ef" Apr 24 23:36:21.190285 containerd[1462]: 2026-04-24 23:36:21.142 [INFO][5489] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="b6969e3808d65953a8a46a1c82f16b6fe7a046267abad766b3fa13d9d36f41ef" iface="eth0" netns="" Apr 24 23:36:21.190285 containerd[1462]: 2026-04-24 23:36:21.143 [INFO][5489] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="b6969e3808d65953a8a46a1c82f16b6fe7a046267abad766b3fa13d9d36f41ef" Apr 24 23:36:21.190285 containerd[1462]: 2026-04-24 23:36:21.143 [INFO][5489] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="b6969e3808d65953a8a46a1c82f16b6fe7a046267abad766b3fa13d9d36f41ef" Apr 24 23:36:21.190285 containerd[1462]: 2026-04-24 23:36:21.167 [INFO][5496] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="b6969e3808d65953a8a46a1c82f16b6fe7a046267abad766b3fa13d9d36f41ef" HandleID="k8s-pod-network.b6969e3808d65953a8a46a1c82f16b6fe7a046267abad766b3fa13d9d36f41ef" Workload="ci--4081--3--6--n--0494d1f24d-k8s-calico--apiserver--ccf7f6874--6drmn-eth0" Apr 24 23:36:21.190285 containerd[1462]: 2026-04-24 23:36:21.167 [INFO][5496] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 24 23:36:21.190285 containerd[1462]: 2026-04-24 23:36:21.168 [INFO][5496] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 24 23:36:21.190285 containerd[1462]: 2026-04-24 23:36:21.182 [WARNING][5496] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="b6969e3808d65953a8a46a1c82f16b6fe7a046267abad766b3fa13d9d36f41ef" HandleID="k8s-pod-network.b6969e3808d65953a8a46a1c82f16b6fe7a046267abad766b3fa13d9d36f41ef" Workload="ci--4081--3--6--n--0494d1f24d-k8s-calico--apiserver--ccf7f6874--6drmn-eth0" Apr 24 23:36:21.190285 containerd[1462]: 2026-04-24 23:36:21.182 [INFO][5496] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="b6969e3808d65953a8a46a1c82f16b6fe7a046267abad766b3fa13d9d36f41ef" HandleID="k8s-pod-network.b6969e3808d65953a8a46a1c82f16b6fe7a046267abad766b3fa13d9d36f41ef" Workload="ci--4081--3--6--n--0494d1f24d-k8s-calico--apiserver--ccf7f6874--6drmn-eth0" Apr 24 23:36:21.190285 containerd[1462]: 2026-04-24 23:36:21.185 [INFO][5496] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 24 23:36:21.190285 containerd[1462]: 2026-04-24 23:36:21.187 [INFO][5489] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="b6969e3808d65953a8a46a1c82f16b6fe7a046267abad766b3fa13d9d36f41ef" Apr 24 23:36:21.190285 containerd[1462]: time="2026-04-24T23:36:21.190249970Z" level=info msg="TearDown network for sandbox \"b6969e3808d65953a8a46a1c82f16b6fe7a046267abad766b3fa13d9d36f41ef\" successfully" Apr 24 23:36:21.190285 containerd[1462]: time="2026-04-24T23:36:21.190277892Z" level=info msg="StopPodSandbox for \"b6969e3808d65953a8a46a1c82f16b6fe7a046267abad766b3fa13d9d36f41ef\" returns successfully" Apr 24 23:36:21.194432 containerd[1462]: time="2026-04-24T23:36:21.191415797Z" level=info msg="RemovePodSandbox for \"b6969e3808d65953a8a46a1c82f16b6fe7a046267abad766b3fa13d9d36f41ef\"" Apr 24 23:36:21.194432 containerd[1462]: time="2026-04-24T23:36:21.191480483Z" level=info msg="Forcibly stopping sandbox \"b6969e3808d65953a8a46a1c82f16b6fe7a046267abad766b3fa13d9d36f41ef\"" Apr 24 23:36:21.283228 containerd[1462]: 2026-04-24 23:36:21.240 [WARNING][5510] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="b6969e3808d65953a8a46a1c82f16b6fe7a046267abad766b3fa13d9d36f41ef" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--0494d1f24d-k8s-calico--apiserver--ccf7f6874--6drmn-eth0", GenerateName:"calico-apiserver-ccf7f6874-", Namespace:"calico-system", SelfLink:"", UID:"6dca57dd-795d-4761-9e86-5a637e03b75c", ResourceVersion:"1014", Generation:0, CreationTimestamp:time.Date(2026, time.April, 24, 23, 35, 38, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"ccf7f6874", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-0494d1f24d", ContainerID:"eac367f59f2b92f92f741540bd5453ffad23e2c97f71da941b239662f1b93292", Pod:"calico-apiserver-ccf7f6874-6drmn", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.74.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali104746d9e98", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Apr 24 23:36:21.283228 containerd[1462]: 2026-04-24 23:36:21.240 [INFO][5510] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="b6969e3808d65953a8a46a1c82f16b6fe7a046267abad766b3fa13d9d36f41ef" Apr 24 23:36:21.283228 containerd[1462]: 2026-04-24 23:36:21.240 [INFO][5510] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called 
with no netns name, ignoring. ContainerID="b6969e3808d65953a8a46a1c82f16b6fe7a046267abad766b3fa13d9d36f41ef" iface="eth0" netns="" Apr 24 23:36:21.283228 containerd[1462]: 2026-04-24 23:36:21.240 [INFO][5510] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="b6969e3808d65953a8a46a1c82f16b6fe7a046267abad766b3fa13d9d36f41ef" Apr 24 23:36:21.283228 containerd[1462]: 2026-04-24 23:36:21.240 [INFO][5510] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="b6969e3808d65953a8a46a1c82f16b6fe7a046267abad766b3fa13d9d36f41ef" Apr 24 23:36:21.283228 containerd[1462]: 2026-04-24 23:36:21.263 [INFO][5517] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="b6969e3808d65953a8a46a1c82f16b6fe7a046267abad766b3fa13d9d36f41ef" HandleID="k8s-pod-network.b6969e3808d65953a8a46a1c82f16b6fe7a046267abad766b3fa13d9d36f41ef" Workload="ci--4081--3--6--n--0494d1f24d-k8s-calico--apiserver--ccf7f6874--6drmn-eth0" Apr 24 23:36:21.283228 containerd[1462]: 2026-04-24 23:36:21.263 [INFO][5517] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 24 23:36:21.283228 containerd[1462]: 2026-04-24 23:36:21.263 [INFO][5517] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 24 23:36:21.283228 containerd[1462]: 2026-04-24 23:36:21.276 [WARNING][5517] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="b6969e3808d65953a8a46a1c82f16b6fe7a046267abad766b3fa13d9d36f41ef" HandleID="k8s-pod-network.b6969e3808d65953a8a46a1c82f16b6fe7a046267abad766b3fa13d9d36f41ef" Workload="ci--4081--3--6--n--0494d1f24d-k8s-calico--apiserver--ccf7f6874--6drmn-eth0" Apr 24 23:36:21.283228 containerd[1462]: 2026-04-24 23:36:21.276 [INFO][5517] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="b6969e3808d65953a8a46a1c82f16b6fe7a046267abad766b3fa13d9d36f41ef" HandleID="k8s-pod-network.b6969e3808d65953a8a46a1c82f16b6fe7a046267abad766b3fa13d9d36f41ef" Workload="ci--4081--3--6--n--0494d1f24d-k8s-calico--apiserver--ccf7f6874--6drmn-eth0" Apr 24 23:36:21.283228 containerd[1462]: 2026-04-24 23:36:21.279 [INFO][5517] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 24 23:36:21.283228 containerd[1462]: 2026-04-24 23:36:21.281 [INFO][5510] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="b6969e3808d65953a8a46a1c82f16b6fe7a046267abad766b3fa13d9d36f41ef" Apr 24 23:36:21.283228 containerd[1462]: time="2026-04-24T23:36:21.283204988Z" level=info msg="TearDown network for sandbox \"b6969e3808d65953a8a46a1c82f16b6fe7a046267abad766b3fa13d9d36f41ef\" successfully" Apr 24 23:36:21.287706 containerd[1462]: time="2026-04-24T23:36:21.287613873Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"b6969e3808d65953a8a46a1c82f16b6fe7a046267abad766b3fa13d9d36f41ef\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Apr 24 23:36:21.287844 containerd[1462]: time="2026-04-24T23:36:21.287779688Z" level=info msg="RemovePodSandbox \"b6969e3808d65953a8a46a1c82f16b6fe7a046267abad766b3fa13d9d36f41ef\" returns successfully" Apr 24 23:36:21.288566 containerd[1462]: time="2026-04-24T23:36:21.288516636Z" level=info msg="StopPodSandbox for \"a90bb52d53ba8c8ddab1d42c96b03c66cacab89a862d8f7f06d55932dea64798\"" Apr 24 23:36:21.382598 containerd[1462]: 2026-04-24 23:36:21.335 [WARNING][5531] cni-plugin/k8s.go 610: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="a90bb52d53ba8c8ddab1d42c96b03c66cacab89a862d8f7f06d55932dea64798" WorkloadEndpoint="ci--4081--3--6--n--0494d1f24d-k8s-whisker--7d4c795b57--vskxf-eth0" Apr 24 23:36:21.382598 containerd[1462]: 2026-04-24 23:36:21.335 [INFO][5531] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="a90bb52d53ba8c8ddab1d42c96b03c66cacab89a862d8f7f06d55932dea64798" Apr 24 23:36:21.382598 containerd[1462]: 2026-04-24 23:36:21.335 [INFO][5531] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="a90bb52d53ba8c8ddab1d42c96b03c66cacab89a862d8f7f06d55932dea64798" iface="eth0" netns="" Apr 24 23:36:21.382598 containerd[1462]: 2026-04-24 23:36:21.336 [INFO][5531] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="a90bb52d53ba8c8ddab1d42c96b03c66cacab89a862d8f7f06d55932dea64798" Apr 24 23:36:21.382598 containerd[1462]: 2026-04-24 23:36:21.336 [INFO][5531] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="a90bb52d53ba8c8ddab1d42c96b03c66cacab89a862d8f7f06d55932dea64798" Apr 24 23:36:21.382598 containerd[1462]: 2026-04-24 23:36:21.363 [INFO][5538] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="a90bb52d53ba8c8ddab1d42c96b03c66cacab89a862d8f7f06d55932dea64798" HandleID="k8s-pod-network.a90bb52d53ba8c8ddab1d42c96b03c66cacab89a862d8f7f06d55932dea64798" Workload="ci--4081--3--6--n--0494d1f24d-k8s-whisker--7d4c795b57--vskxf-eth0" Apr 24 23:36:21.382598 containerd[1462]: 2026-04-24 23:36:21.363 [INFO][5538] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 24 23:36:21.382598 containerd[1462]: 2026-04-24 23:36:21.363 [INFO][5538] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 24 23:36:21.382598 containerd[1462]: 2026-04-24 23:36:21.376 [WARNING][5538] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="a90bb52d53ba8c8ddab1d42c96b03c66cacab89a862d8f7f06d55932dea64798" HandleID="k8s-pod-network.a90bb52d53ba8c8ddab1d42c96b03c66cacab89a862d8f7f06d55932dea64798" Workload="ci--4081--3--6--n--0494d1f24d-k8s-whisker--7d4c795b57--vskxf-eth0" Apr 24 23:36:21.382598 containerd[1462]: 2026-04-24 23:36:21.376 [INFO][5538] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="a90bb52d53ba8c8ddab1d42c96b03c66cacab89a862d8f7f06d55932dea64798" HandleID="k8s-pod-network.a90bb52d53ba8c8ddab1d42c96b03c66cacab89a862d8f7f06d55932dea64798" Workload="ci--4081--3--6--n--0494d1f24d-k8s-whisker--7d4c795b57--vskxf-eth0" Apr 24 23:36:21.382598 containerd[1462]: 2026-04-24 23:36:21.379 [INFO][5538] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 24 23:36:21.382598 containerd[1462]: 2026-04-24 23:36:21.381 [INFO][5531] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="a90bb52d53ba8c8ddab1d42c96b03c66cacab89a862d8f7f06d55932dea64798" Apr 24 23:36:21.383102 containerd[1462]: time="2026-04-24T23:36:21.382640241Z" level=info msg="TearDown network for sandbox \"a90bb52d53ba8c8ddab1d42c96b03c66cacab89a862d8f7f06d55932dea64798\" successfully" Apr 24 23:36:21.383102 containerd[1462]: time="2026-04-24T23:36:21.382667043Z" level=info msg="StopPodSandbox for \"a90bb52d53ba8c8ddab1d42c96b03c66cacab89a862d8f7f06d55932dea64798\" returns successfully" Apr 24 23:36:21.383432 containerd[1462]: time="2026-04-24T23:36:21.383219294Z" level=info msg="RemovePodSandbox for \"a90bb52d53ba8c8ddab1d42c96b03c66cacab89a862d8f7f06d55932dea64798\"" Apr 24 23:36:21.383432 containerd[1462]: time="2026-04-24T23:36:21.383263138Z" level=info msg="Forcibly stopping sandbox \"a90bb52d53ba8c8ddab1d42c96b03c66cacab89a862d8f7f06d55932dea64798\"" Apr 24 23:36:21.483667 containerd[1462]: 2026-04-24 23:36:21.429 [WARNING][5552] cni-plugin/k8s.go 610: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up 
ContainerID="a90bb52d53ba8c8ddab1d42c96b03c66cacab89a862d8f7f06d55932dea64798" WorkloadEndpoint="ci--4081--3--6--n--0494d1f24d-k8s-whisker--7d4c795b57--vskxf-eth0" Apr 24 23:36:21.483667 containerd[1462]: 2026-04-24 23:36:21.429 [INFO][5552] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="a90bb52d53ba8c8ddab1d42c96b03c66cacab89a862d8f7f06d55932dea64798" Apr 24 23:36:21.483667 containerd[1462]: 2026-04-24 23:36:21.429 [INFO][5552] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="a90bb52d53ba8c8ddab1d42c96b03c66cacab89a862d8f7f06d55932dea64798" iface="eth0" netns="" Apr 24 23:36:21.483667 containerd[1462]: 2026-04-24 23:36:21.429 [INFO][5552] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="a90bb52d53ba8c8ddab1d42c96b03c66cacab89a862d8f7f06d55932dea64798" Apr 24 23:36:21.483667 containerd[1462]: 2026-04-24 23:36:21.429 [INFO][5552] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="a90bb52d53ba8c8ddab1d42c96b03c66cacab89a862d8f7f06d55932dea64798" Apr 24 23:36:21.483667 containerd[1462]: 2026-04-24 23:36:21.460 [INFO][5559] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="a90bb52d53ba8c8ddab1d42c96b03c66cacab89a862d8f7f06d55932dea64798" HandleID="k8s-pod-network.a90bb52d53ba8c8ddab1d42c96b03c66cacab89a862d8f7f06d55932dea64798" Workload="ci--4081--3--6--n--0494d1f24d-k8s-whisker--7d4c795b57--vskxf-eth0" Apr 24 23:36:21.483667 containerd[1462]: 2026-04-24 23:36:21.460 [INFO][5559] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Apr 24 23:36:21.483667 containerd[1462]: 2026-04-24 23:36:21.460 [INFO][5559] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Apr 24 23:36:21.483667 containerd[1462]: 2026-04-24 23:36:21.474 [WARNING][5559] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="a90bb52d53ba8c8ddab1d42c96b03c66cacab89a862d8f7f06d55932dea64798" HandleID="k8s-pod-network.a90bb52d53ba8c8ddab1d42c96b03c66cacab89a862d8f7f06d55932dea64798" Workload="ci--4081--3--6--n--0494d1f24d-k8s-whisker--7d4c795b57--vskxf-eth0" Apr 24 23:36:21.483667 containerd[1462]: 2026-04-24 23:36:21.474 [INFO][5559] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="a90bb52d53ba8c8ddab1d42c96b03c66cacab89a862d8f7f06d55932dea64798" HandleID="k8s-pod-network.a90bb52d53ba8c8ddab1d42c96b03c66cacab89a862d8f7f06d55932dea64798" Workload="ci--4081--3--6--n--0494d1f24d-k8s-whisker--7d4c795b57--vskxf-eth0" Apr 24 23:36:21.483667 containerd[1462]: 2026-04-24 23:36:21.476 [INFO][5559] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Apr 24 23:36:21.483667 containerd[1462]: 2026-04-24 23:36:21.479 [INFO][5552] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="a90bb52d53ba8c8ddab1d42c96b03c66cacab89a862d8f7f06d55932dea64798" Apr 24 23:36:21.483667 containerd[1462]: time="2026-04-24T23:36:21.483502265Z" level=info msg="TearDown network for sandbox \"a90bb52d53ba8c8ddab1d42c96b03c66cacab89a862d8f7f06d55932dea64798\" successfully" Apr 24 23:36:21.488315 containerd[1462]: time="2026-04-24T23:36:21.488188455Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"a90bb52d53ba8c8ddab1d42c96b03c66cacab89a862d8f7f06d55932dea64798\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Apr 24 23:36:21.488415 containerd[1462]: time="2026-04-24T23:36:21.488370832Z" level=info msg="RemovePodSandbox \"a90bb52d53ba8c8ddab1d42c96b03c66cacab89a862d8f7f06d55932dea64798\" returns successfully" Apr 24 23:36:29.216665 kubelet[2581]: I0424 23:36:29.216363 2581 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="calico-system/whisker-7bf446958b-2nzh8" podStartSLOduration=13.395517388 podStartE2EDuration="32.216221977s" podCreationTimestamp="2026-04-24 23:35:57 +0000 UTC" firstStartedPulling="2026-04-24 23:35:58.752164546 +0000 UTC m=+38.795722982" lastFinishedPulling="2026-04-24 23:36:17.572869255 +0000 UTC m=+57.616427571" observedRunningTime="2026-04-24 23:36:18.654734626 +0000 UTC m=+58.698292942" watchObservedRunningTime="2026-04-24 23:36:29.216221977 +0000 UTC m=+69.259780293" Apr 24 23:36:30.354864 kubelet[2581]: I0424 23:36:30.354723 2581 prober_manager.go:356] "Failed to trigger a manual run" probe="Readiness" Apr 24 23:36:43.393308 kubelet[2581]: I0424 23:36:43.393136 2581 prober_manager.go:356] "Failed to trigger a manual run" probe="Readiness" Apr 24 23:37:44.099545 systemd[1]: Started sshd@7-178.105.28.58:22-50.85.169.122:59918.service - OpenSSH per-connection server daemon (50.85.169.122:59918). Apr 24 23:37:44.234594 sshd[5895]: Accepted publickey for core from 50.85.169.122 port 59918 ssh2: RSA SHA256:LBBtzjDnLNZXcnA2s4HQvRsYKvtzAwhHM1R/Z1hMC7M Apr 24 23:37:44.237377 sshd[5895]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 24 23:37:44.243486 systemd-logind[1455]: New session 8 of user core. Apr 24 23:37:44.251524 systemd[1]: Started session-8.scope - Session 8 of User core. Apr 24 23:37:44.452753 sshd[5895]: pam_unix(sshd:session): session closed for user core Apr 24 23:37:44.460764 systemd[1]: sshd@7-178.105.28.58:22-50.85.169.122:59918.service: Deactivated successfully. Apr 24 23:37:44.464208 systemd[1]: session-8.scope: Deactivated successfully. 
Apr 24 23:37:44.465488 systemd-logind[1455]: Session 8 logged out. Waiting for processes to exit. Apr 24 23:37:44.467488 systemd-logind[1455]: Removed session 8. Apr 24 23:37:49.491588 systemd[1]: Started sshd@8-178.105.28.58:22-50.85.169.122:43062.service - OpenSSH per-connection server daemon (50.85.169.122:43062). Apr 24 23:37:49.615435 sshd[5910]: Accepted publickey for core from 50.85.169.122 port 43062 ssh2: RSA SHA256:LBBtzjDnLNZXcnA2s4HQvRsYKvtzAwhHM1R/Z1hMC7M Apr 24 23:37:49.617699 sshd[5910]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 24 23:37:49.622536 systemd-logind[1455]: New session 9 of user core. Apr 24 23:37:49.632442 systemd[1]: Started session-9.scope - Session 9 of User core. Apr 24 23:37:49.838200 sshd[5910]: pam_unix(sshd:session): session closed for user core Apr 24 23:37:49.844600 systemd[1]: sshd@8-178.105.28.58:22-50.85.169.122:43062.service: Deactivated successfully. Apr 24 23:37:49.849775 systemd[1]: session-9.scope: Deactivated successfully. Apr 24 23:37:49.851587 systemd-logind[1455]: Session 9 logged out. Waiting for processes to exit. Apr 24 23:37:49.855669 systemd-logind[1455]: Removed session 9. Apr 24 23:37:54.881458 systemd[1]: Started sshd@9-178.105.28.58:22-50.85.169.122:43070.service - OpenSSH per-connection server daemon (50.85.169.122:43070). Apr 24 23:37:55.014161 sshd[5924]: Accepted publickey for core from 50.85.169.122 port 43070 ssh2: RSA SHA256:LBBtzjDnLNZXcnA2s4HQvRsYKvtzAwhHM1R/Z1hMC7M Apr 24 23:37:55.015436 sshd[5924]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 24 23:37:55.021677 systemd-logind[1455]: New session 10 of user core. Apr 24 23:37:55.030497 systemd[1]: Started session-10.scope - Session 10 of User core. Apr 24 23:37:55.216660 sshd[5924]: pam_unix(sshd:session): session closed for user core Apr 24 23:37:55.223290 systemd-logind[1455]: Session 10 logged out. Waiting for processes to exit. 
Apr 24 23:37:55.223458 systemd[1]: sshd@9-178.105.28.58:22-50.85.169.122:43070.service: Deactivated successfully. Apr 24 23:37:55.225570 systemd[1]: session-10.scope: Deactivated successfully. Apr 24 23:37:55.228649 systemd-logind[1455]: Removed session 10. Apr 24 23:38:00.247690 systemd[1]: Started sshd@10-178.105.28.58:22-50.85.169.122:33828.service - OpenSSH per-connection server daemon (50.85.169.122:33828). Apr 24 23:38:00.400490 sshd[5981]: Accepted publickey for core from 50.85.169.122 port 33828 ssh2: RSA SHA256:LBBtzjDnLNZXcnA2s4HQvRsYKvtzAwhHM1R/Z1hMC7M Apr 24 23:38:00.402590 sshd[5981]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 24 23:38:00.411372 systemd-logind[1455]: New session 11 of user core. Apr 24 23:38:00.416627 systemd[1]: Started session-11.scope - Session 11 of User core. Apr 24 23:38:00.614658 sshd[5981]: pam_unix(sshd:session): session closed for user core Apr 24 23:38:00.620151 systemd[1]: sshd@10-178.105.28.58:22-50.85.169.122:33828.service: Deactivated successfully. Apr 24 23:38:00.623212 systemd[1]: session-11.scope: Deactivated successfully. Apr 24 23:38:00.625173 systemd-logind[1455]: Session 11 logged out. Waiting for processes to exit. Apr 24 23:38:00.627529 systemd-logind[1455]: Removed session 11. Apr 24 23:38:00.646516 systemd[1]: Started sshd@11-178.105.28.58:22-50.85.169.122:33836.service - OpenSSH per-connection server daemon (50.85.169.122:33836). Apr 24 23:38:00.774393 sshd[5994]: Accepted publickey for core from 50.85.169.122 port 33836 ssh2: RSA SHA256:LBBtzjDnLNZXcnA2s4HQvRsYKvtzAwhHM1R/Z1hMC7M Apr 24 23:38:00.777989 sshd[5994]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 24 23:38:00.785278 systemd-logind[1455]: New session 12 of user core. Apr 24 23:38:00.788438 systemd[1]: Started session-12.scope - Session 12 of User core. 
Apr 24 23:38:01.033546 sshd[5994]: pam_unix(sshd:session): session closed for user core Apr 24 23:38:01.038934 systemd-logind[1455]: Session 12 logged out. Waiting for processes to exit. Apr 24 23:38:01.039135 systemd[1]: sshd@11-178.105.28.58:22-50.85.169.122:33836.service: Deactivated successfully. Apr 24 23:38:01.043338 systemd[1]: session-12.scope: Deactivated successfully. Apr 24 23:38:01.059845 systemd-logind[1455]: Removed session 12. Apr 24 23:38:01.067500 systemd[1]: Started sshd@12-178.105.28.58:22-50.85.169.122:33838.service - OpenSSH per-connection server daemon (50.85.169.122:33838). Apr 24 23:38:01.201165 sshd[6005]: Accepted publickey for core from 50.85.169.122 port 33838 ssh2: RSA SHA256:LBBtzjDnLNZXcnA2s4HQvRsYKvtzAwhHM1R/Z1hMC7M Apr 24 23:38:01.203177 sshd[6005]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 24 23:38:01.211607 systemd-logind[1455]: New session 13 of user core. Apr 24 23:38:01.218385 systemd[1]: Started session-13.scope - Session 13 of User core. Apr 24 23:38:01.402866 sshd[6005]: pam_unix(sshd:session): session closed for user core Apr 24 23:38:01.409013 systemd[1]: sshd@12-178.105.28.58:22-50.85.169.122:33838.service: Deactivated successfully. Apr 24 23:38:01.411852 systemd[1]: session-13.scope: Deactivated successfully. Apr 24 23:38:01.416974 systemd-logind[1455]: Session 13 logged out. Waiting for processes to exit. Apr 24 23:38:01.420666 systemd-logind[1455]: Removed session 13. Apr 24 23:38:06.437715 systemd[1]: Started sshd@13-178.105.28.58:22-50.85.169.122:33848.service - OpenSSH per-connection server daemon (50.85.169.122:33848). Apr 24 23:38:06.554632 sshd[6018]: Accepted publickey for core from 50.85.169.122 port 33848 ssh2: RSA SHA256:LBBtzjDnLNZXcnA2s4HQvRsYKvtzAwhHM1R/Z1hMC7M Apr 24 23:38:06.557244 sshd[6018]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 24 23:38:06.563487 systemd-logind[1455]: New session 14 of user core. 
Apr 24 23:38:06.568632 systemd[1]: Started session-14.scope - Session 14 of User core. Apr 24 23:38:06.758227 sshd[6018]: pam_unix(sshd:session): session closed for user core Apr 24 23:38:06.764584 systemd-logind[1455]: Session 14 logged out. Waiting for processes to exit. Apr 24 23:38:06.764661 systemd[1]: sshd@13-178.105.28.58:22-50.85.169.122:33848.service: Deactivated successfully. Apr 24 23:38:06.772196 systemd[1]: session-14.scope: Deactivated successfully. Apr 24 23:38:06.790625 systemd-logind[1455]: Removed session 14. Apr 24 23:38:06.799675 systemd[1]: Started sshd@14-178.105.28.58:22-50.85.169.122:33850.service - OpenSSH per-connection server daemon (50.85.169.122:33850). Apr 24 23:38:06.935173 sshd[6031]: Accepted publickey for core from 50.85.169.122 port 33850 ssh2: RSA SHA256:LBBtzjDnLNZXcnA2s4HQvRsYKvtzAwhHM1R/Z1hMC7M Apr 24 23:38:06.936732 sshd[6031]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 24 23:38:06.942765 systemd-logind[1455]: New session 15 of user core. Apr 24 23:38:06.953542 systemd[1]: Started session-15.scope - Session 15 of User core. Apr 24 23:38:07.327838 sshd[6031]: pam_unix(sshd:session): session closed for user core Apr 24 23:38:07.336152 systemd-logind[1455]: Session 15 logged out. Waiting for processes to exit. Apr 24 23:38:07.336628 systemd[1]: sshd@14-178.105.28.58:22-50.85.169.122:33850.service: Deactivated successfully. Apr 24 23:38:07.340840 systemd[1]: session-15.scope: Deactivated successfully. Apr 24 23:38:07.350194 systemd-logind[1455]: Removed session 15. Apr 24 23:38:07.358619 systemd[1]: Started sshd@15-178.105.28.58:22-50.85.169.122:33858.service - OpenSSH per-connection server daemon (50.85.169.122:33858). 
Apr 24 23:38:07.490621 sshd[6042]: Accepted publickey for core from 50.85.169.122 port 33858 ssh2: RSA SHA256:LBBtzjDnLNZXcnA2s4HQvRsYKvtzAwhHM1R/Z1hMC7M Apr 24 23:38:07.492220 sshd[6042]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 24 23:38:07.499051 systemd-logind[1455]: New session 16 of user core. Apr 24 23:38:07.505426 systemd[1]: Started session-16.scope - Session 16 of User core. Apr 24 23:38:08.218709 sshd[6042]: pam_unix(sshd:session): session closed for user core Apr 24 23:38:08.226710 systemd[1]: sshd@15-178.105.28.58:22-50.85.169.122:33858.service: Deactivated successfully. Apr 24 23:38:08.232241 systemd[1]: session-16.scope: Deactivated successfully. Apr 24 23:38:08.236195 systemd-logind[1455]: Session 16 logged out. Waiting for processes to exit. Apr 24 23:38:08.262514 systemd[1]: Started sshd@16-178.105.28.58:22-50.85.169.122:33874.service - OpenSSH per-connection server daemon (50.85.169.122:33874). Apr 24 23:38:08.264039 systemd-logind[1455]: Removed session 16. Apr 24 23:38:08.389790 sshd[6066]: Accepted publickey for core from 50.85.169.122 port 33874 ssh2: RSA SHA256:LBBtzjDnLNZXcnA2s4HQvRsYKvtzAwhHM1R/Z1hMC7M Apr 24 23:38:08.391506 sshd[6066]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 24 23:38:08.400401 systemd-logind[1455]: New session 17 of user core. Apr 24 23:38:08.408424 systemd[1]: Started session-17.scope - Session 17 of User core. Apr 24 23:38:08.739404 sshd[6066]: pam_unix(sshd:session): session closed for user core Apr 24 23:38:08.744478 systemd-logind[1455]: Session 17 logged out. Waiting for processes to exit. Apr 24 23:38:08.744940 systemd[1]: sshd@16-178.105.28.58:22-50.85.169.122:33874.service: Deactivated successfully. Apr 24 23:38:08.749066 systemd[1]: session-17.scope: Deactivated successfully. Apr 24 23:38:08.753223 systemd-logind[1455]: Removed session 17. 
Apr 24 23:38:08.769737 systemd[1]: Started sshd@17-178.105.28.58:22-50.85.169.122:33886.service - OpenSSH per-connection server daemon (50.85.169.122:33886). Apr 24 23:38:08.900907 sshd[6077]: Accepted publickey for core from 50.85.169.122 port 33886 ssh2: RSA SHA256:LBBtzjDnLNZXcnA2s4HQvRsYKvtzAwhHM1R/Z1hMC7M Apr 24 23:38:08.903555 sshd[6077]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 24 23:38:08.911679 systemd-logind[1455]: New session 18 of user core. Apr 24 23:38:08.916375 systemd[1]: Started session-18.scope - Session 18 of User core. Apr 24 23:38:09.106837 sshd[6077]: pam_unix(sshd:session): session closed for user core Apr 24 23:38:09.115555 systemd[1]: sshd@17-178.105.28.58:22-50.85.169.122:33886.service: Deactivated successfully. Apr 24 23:38:09.121131 systemd[1]: session-18.scope: Deactivated successfully. Apr 24 23:38:09.123404 systemd-logind[1455]: Session 18 logged out. Waiting for processes to exit. Apr 24 23:38:09.124946 systemd-logind[1455]: Removed session 18. Apr 24 23:38:12.606452 systemd[1]: run-containerd-runc-k8s.io-d7d9f0e2a18cca3a930a1e7caa3426ddbfa23bdd90214992aeebe2869de54044-runc.wSVMkv.mount: Deactivated successfully. Apr 24 23:38:14.137657 systemd[1]: Started sshd@18-178.105.28.58:22-50.85.169.122:34236.service - OpenSSH per-connection server daemon (50.85.169.122:34236). Apr 24 23:38:14.255668 sshd[6134]: Accepted publickey for core from 50.85.169.122 port 34236 ssh2: RSA SHA256:LBBtzjDnLNZXcnA2s4HQvRsYKvtzAwhHM1R/Z1hMC7M Apr 24 23:38:14.259436 sshd[6134]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 24 23:38:14.265577 systemd-logind[1455]: New session 19 of user core. Apr 24 23:38:14.274482 systemd[1]: Started session-19.scope - Session 19 of User core. Apr 24 23:38:14.449721 sshd[6134]: pam_unix(sshd:session): session closed for user core Apr 24 23:38:14.456574 systemd-logind[1455]: Session 19 logged out. Waiting for processes to exit. 
Apr 24 23:38:14.456990 systemd[1]: sshd@18-178.105.28.58:22-50.85.169.122:34236.service: Deactivated successfully. Apr 24 23:38:14.459655 systemd[1]: session-19.scope: Deactivated successfully. Apr 24 23:38:14.463498 systemd-logind[1455]: Removed session 19. Apr 24 23:38:19.488605 systemd[1]: Started sshd@19-178.105.28.58:22-50.85.169.122:52730.service - OpenSSH per-connection server daemon (50.85.169.122:52730). Apr 24 23:38:19.615776 sshd[6147]: Accepted publickey for core from 50.85.169.122 port 52730 ssh2: RSA SHA256:LBBtzjDnLNZXcnA2s4HQvRsYKvtzAwhHM1R/Z1hMC7M Apr 24 23:38:19.618044 sshd[6147]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 24 23:38:19.623910 systemd-logind[1455]: New session 20 of user core. Apr 24 23:38:19.630508 systemd[1]: Started session-20.scope - Session 20 of User core. Apr 24 23:38:19.814569 sshd[6147]: pam_unix(sshd:session): session closed for user core Apr 24 23:38:19.821932 systemd[1]: sshd@19-178.105.28.58:22-50.85.169.122:52730.service: Deactivated successfully. Apr 24 23:38:19.826032 systemd[1]: session-20.scope: Deactivated successfully. Apr 24 23:38:19.827246 systemd-logind[1455]: Session 20 logged out. Waiting for processes to exit. Apr 24 23:38:19.828689 systemd-logind[1455]: Removed session 20. Apr 24 23:38:24.849585 systemd[1]: Started sshd@20-178.105.28.58:22-50.85.169.122:52744.service - OpenSSH per-connection server daemon (50.85.169.122:52744). Apr 24 23:38:24.979026 sshd[6181]: Accepted publickey for core from 50.85.169.122 port 52744 ssh2: RSA SHA256:LBBtzjDnLNZXcnA2s4HQvRsYKvtzAwhHM1R/Z1hMC7M Apr 24 23:38:24.982337 sshd[6181]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Apr 24 23:38:24.993194 systemd-logind[1455]: New session 21 of user core. Apr 24 23:38:24.999585 systemd[1]: Started session-21.scope - Session 21 of User core. 
Apr 24 23:38:25.224812 sshd[6181]: pam_unix(sshd:session): session closed for user core Apr 24 23:38:25.230498 systemd[1]: sshd@20-178.105.28.58:22-50.85.169.122:52744.service: Deactivated successfully. Apr 24 23:38:25.234470 systemd[1]: session-21.scope: Deactivated successfully. Apr 24 23:38:25.235672 systemd-logind[1455]: Session 21 logged out. Waiting for processes to exit. Apr 24 23:38:25.237178 systemd-logind[1455]: Removed session 21. Apr 24 23:38:29.124831 systemd[1]: run-containerd-runc-k8s.io-64365ef6c453321921fee54c1badc759e4738c4a10e94e6e0b6985e1b09095c9-runc.f0N2Zg.mount: Deactivated successfully. Apr 24 23:38:39.431376 systemd[1]: cri-containerd-c32641b7cc6f4e7152353812bcc8d1582cb2ebc595f4c6be352adb53b8143fa0.scope: Deactivated successfully. Apr 24 23:38:39.431649 systemd[1]: cri-containerd-c32641b7cc6f4e7152353812bcc8d1582cb2ebc595f4c6be352adb53b8143fa0.scope: Consumed 16.611s CPU time. Apr 24 23:38:39.464891 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-c32641b7cc6f4e7152353812bcc8d1582cb2ebc595f4c6be352adb53b8143fa0-rootfs.mount: Deactivated successfully. 
Apr 24 23:38:39.466137 containerd[1462]: time="2026-04-24T23:38:39.465816645Z" level=info msg="shim disconnected" id=c32641b7cc6f4e7152353812bcc8d1582cb2ebc595f4c6be352adb53b8143fa0 namespace=k8s.io Apr 24 23:38:39.467334 containerd[1462]: time="2026-04-24T23:38:39.466900063Z" level=warning msg="cleaning up after shim disconnected" id=c32641b7cc6f4e7152353812bcc8d1582cb2ebc595f4c6be352adb53b8143fa0 namespace=k8s.io Apr 24 23:38:39.467334 containerd[1462]: time="2026-04-24T23:38:39.466942181Z" level=info msg="cleaning up dead shim" namespace=k8s.io Apr 24 23:38:39.838167 kubelet[2581]: E0424 23:38:39.836324 2581 event.go:359] "Server rejected event (will not retry!)" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:45872->10.0.0.2:2379: read: connection timed out" event="&Event{ObjectMeta:{kube-apiserver-ci-4081-3-6-n-0494d1f24d.18a96f4f312437f3 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:kube-apiserver-ci-4081-3-6-n-0494d1f24d,UID:d1482e8f637c7e8785ce3d7f8596b313,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Readiness probe failed: HTTP probe failed with statuscode: 500,Source:EventSource{Component:kubelet,Host:ci-4081-3-6-n-0494d1f24d,},FirstTimestamp:2026-04-24 23:38:33.891928051 +0000 UTC m=+193.935486367,LastTimestamp:2026-04-24 23:38:33.891928051 +0000 UTC m=+193.935486367,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4081-3-6-n-0494d1f24d,}" Apr 24 23:38:39.889683 systemd[1]: cri-containerd-e0c59f5b1a040bf8a99331654be53d146aa3db40a01642ddcbff39d35b40c31f.scope: Deactivated successfully. Apr 24 23:38:39.890395 systemd[1]: cri-containerd-e0c59f5b1a040bf8a99331654be53d146aa3db40a01642ddcbff39d35b40c31f.scope: Consumed 3.747s CPU time, 22.3M memory peak, 0B memory swap peak. 
Apr 24 23:38:39.902225 kubelet[2581]: E0424 23:38:39.901570 2581 controller.go:251] "Failed to update lease" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:46256->10.0.0.2:2379: read: connection timed out" Apr 24 23:38:39.923925 containerd[1462]: time="2026-04-24T23:38:39.923863309Z" level=info msg="shim disconnected" id=e0c59f5b1a040bf8a99331654be53d146aa3db40a01642ddcbff39d35b40c31f namespace=k8s.io Apr 24 23:38:39.923925 containerd[1462]: time="2026-04-24T23:38:39.923916946Z" level=warning msg="cleaning up after shim disconnected" id=e0c59f5b1a040bf8a99331654be53d146aa3db40a01642ddcbff39d35b40c31f namespace=k8s.io Apr 24 23:38:39.923925 containerd[1462]: time="2026-04-24T23:38:39.923925026Z" level=info msg="cleaning up dead shim" namespace=k8s.io Apr 24 23:38:39.924950 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-e0c59f5b1a040bf8a99331654be53d146aa3db40a01642ddcbff39d35b40c31f-rootfs.mount: Deactivated successfully. Apr 24 23:38:40.151807 kubelet[2581]: I0424 23:38:40.151703 2581 scope.go:122] "RemoveContainer" containerID="e0c59f5b1a040bf8a99331654be53d146aa3db40a01642ddcbff39d35b40c31f" Apr 24 23:38:40.156007 containerd[1462]: time="2026-04-24T23:38:40.155955293Z" level=info msg="CreateContainer within sandbox \"62f09dd1fde48adaaf896840594f640c0ecdead010dd86279368827653b2db18\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:1,}" Apr 24 23:38:40.157447 kubelet[2581]: I0424 23:38:40.157410 2581 scope.go:122] "RemoveContainer" containerID="c32641b7cc6f4e7152353812bcc8d1582cb2ebc595f4c6be352adb53b8143fa0" Apr 24 23:38:40.160598 containerd[1462]: time="2026-04-24T23:38:40.160562956Z" level=info msg="CreateContainer within sandbox \"d88b8268c4aa927f5ca80156085725dbd5a6aa0ae8b3c1b91dc49a08a36a8b7a\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}" Apr 24 23:38:40.173960 containerd[1462]: time="2026-04-24T23:38:40.173901853Z" level=info msg="CreateContainer within 
sandbox \"62f09dd1fde48adaaf896840594f640c0ecdead010dd86279368827653b2db18\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:1,} returns container id \"b2239629333b8abbe1e97fc15359c4129a3d3698ea9e0ea27727b94588cb30ed\"" Apr 24 23:38:40.176853 containerd[1462]: time="2026-04-24T23:38:40.176479909Z" level=info msg="StartContainer for \"b2239629333b8abbe1e97fc15359c4129a3d3698ea9e0ea27727b94588cb30ed\"" Apr 24 23:38:40.184712 containerd[1462]: time="2026-04-24T23:38:40.184659733Z" level=info msg="CreateContainer within sandbox \"d88b8268c4aa927f5ca80156085725dbd5a6aa0ae8b3c1b91dc49a08a36a8b7a\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"d613581112ad9c546dfdcd013d305500ff54d5529f1d8135e4e5f2b90ff5faea\"" Apr 24 23:38:40.185666 containerd[1462]: time="2026-04-24T23:38:40.185626439Z" level=info msg="StartContainer for \"d613581112ad9c546dfdcd013d305500ff54d5529f1d8135e4e5f2b90ff5faea\"" Apr 24 23:38:40.213347 systemd[1]: Started cri-containerd-b2239629333b8abbe1e97fc15359c4129a3d3698ea9e0ea27727b94588cb30ed.scope - libcontainer container b2239629333b8abbe1e97fc15359c4129a3d3698ea9e0ea27727b94588cb30ed. Apr 24 23:38:40.218076 systemd[1]: Started cri-containerd-d613581112ad9c546dfdcd013d305500ff54d5529f1d8135e4e5f2b90ff5faea.scope - libcontainer container d613581112ad9c546dfdcd013d305500ff54d5529f1d8135e4e5f2b90ff5faea. Apr 24 23:38:40.266993 containerd[1462]: time="2026-04-24T23:38:40.266861593Z" level=info msg="StartContainer for \"d613581112ad9c546dfdcd013d305500ff54d5529f1d8135e4e5f2b90ff5faea\" returns successfully" Apr 24 23:38:40.273538 containerd[1462]: time="2026-04-24T23:38:40.273423787Z" level=info msg="StartContainer for \"b2239629333b8abbe1e97fc15359c4129a3d3698ea9e0ea27727b94588cb30ed\" returns successfully" Apr 24 23:38:40.600261 systemd[1]: run-containerd-runc-k8s.io-8aa9837b29bff3121442a897251883688fc548f6f9ee0a71922eb88556d5a3f1-runc.Uf1ox9.mount: Deactivated successfully.